827 results for data-based reporting
Abstract:
The agro-climatic conditions in western Kenya present the region as a food surplus area, yet people still rely on food imports and the region registers high poverty levels. Depletion of soil fertility and the resulting decline in agricultural productivity in Mbale division have led to many attempts to develop and popularize Integrated Soil Fertility Management (ISFM) technologies that could restore soil fertility. These technologies bridge the gap between high external inputs and extreme forms of traditional low external input agriculture. Some of the ISFM components used by farmers are organic and inorganic inputs and improved seeds. However, the adoption of these technologies is low. The study aimed to examine the factors that influence the adoption of ISFM technologies by smallholder farmers in Mbale division, Kenya. The study was conducted in 9 sub-locations in Mbale division. Purposive sampling was used to select the 80 farmers for a farm-household survey. Self-administered questionnaires were used to collect data on the determinants of the adoption of ISFM technologies from the sampled farmers in the study area. The study sought to answer the research question: What factors influence the uptake of ISFM technologies by farmers in Mbale division? The hypothesis tested was that the adoption of ISFM technologies is not influenced by age, education, extension services, labour, off-farm income and farm size. Data were analyzed using descriptive statistics. Cross tabulation was used to examine the relationships between categorical (nominal or ordinal) variables, and the bivariate correlations procedure was used to compute the pairwise associations between scale or ordinal variables. Probit regression was used to estimate the effect of socio-economic factors on the adoption of ISFM technologies among smallholder farmers. Results of the study indicated that education of the household head, membership in social groups, age of the household head, off-farm income and farm size were the variables that significantly influenced the adoption of ISFM technologies. The findings show that there is a need for a more pro-poor focused approach to achieve sustainable soil fertility management among smallholder farmers. The findings will help farmers, extension officers, researchers and donors identify region-specific entry points for developing innovative ISFM technologies.
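As an illustration of the probit analysis described in this abstract, the sketch below fits a probit model of ISFM adoption on household characteristics using Python's statsmodels. The variable names, the simulated data, and the 0/1 codings are assumptions for illustration only, not the study's data or code.

    # Illustrative probit regression of ISFM adoption on household characteristics.
    # Variable names and data are hypothetical; they mirror the factors named in the abstract.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 80  # the study surveyed 80 farm households
    df = pd.DataFrame({
        "age": rng.integers(20, 70, n),            # age of household head (years)
        "education": rng.integers(0, 16, n),       # years of schooling of household head
        "group_member": rng.integers(0, 2, n),     # membership in a social group (0/1)
        "off_farm_income": rng.integers(0, 2, n),  # has off-farm income (0/1)
        "farm_size": rng.uniform(0.2, 3.0, n),     # farm size (ha)
    })
    df["adopt"] = rng.integers(0, 2, n)            # hypothetical adoption outcome (0/1)

    X = sm.add_constant(df[["age", "education", "group_member", "off_farm_income", "farm_size"]])
    result = sm.Probit(df["adopt"], X).fit(disp=False)
    print(result.summary())                 # coefficients and p-values
    print(result.get_margeff().summary())   # marginal effects on the adoption probability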
Abstract:
Master's dissertation presented to the Instituto Superior de Psicologia Aplicada for the degree of Master in the specialty of Clinical Psychology.
Abstract:
Traditionally, densities of newly built roadways are checked by direct sampling (cores) or by nuclear density gauge measurements. For roadway engineers, the density of asphalt pavement surfaces is essential to determining pavement quality. Unfortunately, field measurements of density by direct sampling or by nuclear measurement are slow processes. Therefore, I have explored the use of rapidly deployed ground penetrating radar (GPR) as an alternative means of determining pavement quality. The dielectric constant of the pavement surface may be a substructure parameter that correlates with pavement density and can be used as a proxy when the density of the asphalt is not known from nuclear or destructive methods. The dielectric constant of the asphalt can be determined using GPR. In order to use GPR for evaluation of road surface quality, the relationship between the dielectric constants of asphalt and their densities must be established. GPR field measurements were taken at four highway sites in Houghton and Keweenaw Counties, Michigan, where density values were also obtained using nuclear methods in the field. Laboratory studies involved asphalt samples taken from the field sites and samples created in the laboratory. These were tested in various ways, including density, thickness, and time domain reflectometry (TDR). In the field, GPR data were acquired using a 1000 MHz air-launched unit and a ground-coupled unit at 200 and 500 MHz. The equipment was owned and operated by the Michigan Department of Transportation (MDOT) and was available for this study for a total of four days during summer 2005 and spring 2006. The analysis of the reflected waveforms included “routine” processing for velocity using commercial software and direct evaluation of reflection coefficients to determine a dielectric constant. The dielectric constants computed from velocities do not agree well with those obtained from reflection coefficients. Perhaps due to the limited range of asphalt types studied, no correlation between density and dielectric constant was evident. Laboratory measurements were taken with samples removed from the field and samples created for this study. Samples from the field were studied using TDR in order to obtain the dielectric constant directly, and these values correlated well with the estimates made from reflection coefficients. Samples created in the laboratory were measured using 1000 MHz air-launched GPR and 400 MHz ground-coupled GPR, each under both wet and dry conditions. On the basis of these observations, I conclude that the dielectric constant of asphalt can be reliably measured from waveform amplitude analysis of GPR data, based on the consistent agreement with values obtained in the laboratory using TDR. Because of the uniformity of the asphalts studied here, any correlation between dielectric constant and density is not yet apparent.
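For context, the surface-reflection method commonly used with air-launched GPR estimates the asphalt dielectric constant from the ratio of the pavement surface-reflection amplitude to the amplitude reflected from a metal plate placed on the surface (a near-perfect reflector). The sketch below is a minimal illustration of that standard relationship, not the processing workflow of this thesis; the amplitude values are hypothetical.

    # Estimate the asphalt dielectric constant from air-launched GPR amplitudes using the
    # surface-reflection method: eps_r = ((1 + A0/Ap) / (1 - A0/Ap))**2,
    # where A0 is the surface-reflection amplitude and Ap is the metal-plate amplitude.
    def dielectric_from_amplitudes(a_surface: float, a_plate: float) -> float:
        ratio = abs(a_surface / a_plate)  # |reflection coefficient| of the air/asphalt interface
        if ratio >= 1.0:
            raise ValueError("surface amplitude must be smaller than the metal-plate amplitude")
        return ((1.0 + ratio) / (1.0 - ratio)) ** 2

    # Hypothetical field values: surface reflection at 0.45 of the metal-plate amplitude
    print(dielectric_from_amplitudes(0.45, 1.0))  # ~6.9, a plausible value for dense asphalt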
Abstract:
Heat management in mines is a growing issue as mines expand in physical size and depth and as the infrastructure required to maintain them grows. Heat management is a concern because it relates to the health and safety of workers, as set by the regulations of governing bodies, as well as to the heat-sensitive equipment that may be found throughout the mine workings. To reduce the exposure involved in working in hot environments, there are engineering and management systems that can monitor and control the environmental conditions within the mine. The successful implementation of these methods can manage the downtime caused by heat stress environments, which can increase overall production. This thesis introduces an approach to monitoring and data-based heat management. A case study is presented with an in-depth approach to data collection. Data were collected for periods of up to, and in some cases over, one year. Continuous monitoring was conducted with equipment that was developed both commercially and within the mine site. The monitoring instrumentation was used to assess the environmental conditions found within the study area. Analysis of the data allowed for an engineering assessment of viable options to control and manage environmental heat stress. An option is developed and presented that has the greatest impact on the heat stress conditions within the case study area and is economically viable for the mine site.
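As a small illustration of the kind of heat stress assessment such monitoring supports, the sketch below computes the wet-bulb globe temperature (WBGT) index for an underground environment without solar load and flags readings above a chosen action limit. The sensor values and the limit are assumptions for illustration, not the case-study mine's criteria or instrumentation.

    # Underground WBGT (no solar load): WBGT = 0.7 * T_wet_bulb + 0.3 * T_globe
    def wbgt_no_solar(t_wet_bulb_c: float, t_globe_c: float) -> float:
        return 0.7 * t_wet_bulb_c + 0.3 * t_globe_c

    ACTION_LIMIT_C = 28.0  # assumed action limit; real limits depend on jurisdiction and work rate

    # Hypothetical hourly readings from one monitoring station: (wet-bulb, globe) in deg C
    readings = [(24.1, 29.5), (25.0, 30.2), (27.0, 32.0)]
    for twb, tg in readings:
        index = wbgt_no_solar(twb, tg)
        status = "exceeds action limit" if index > ACTION_LIMIT_C else "ok"
        print(f"WBGT = {index:.1f} C ({status})")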
Abstract:
Power efficiency is one of the most important constraints in the design of embedded systems, since such systems are generally driven by batteries with a limited energy budget or a restricted power supply. In every embedded system, there are one or more processor cores to run the software and interact with the other hardware components of the system. The power consumption of the processor core(s) has an important impact on the total power dissipated in the system. Hence, processor power optimization is crucial for satisfying the power consumption constraints and developing low-power embedded systems. A key aspect of research in processor power optimization and management is “power estimation”. Having a fast and accurate method for processor power estimation at design time helps the designer explore a large space of design possibilities and make optimal choices for developing a power-efficient processor. Likewise, understanding the processor power dissipation behaviour of a specific software application is key to choosing appropriate algorithms in order to write power-efficient software. Simulation-based methods for measuring processor power achieve very high accuracy, but are available only late in the design process and are often quite slow. Therefore, the need has arisen for faster, higher-level power prediction methods that allow the system designer to explore many alternatives for developing power-efficient hardware and software. The aim of this thesis is to present fast, high-level power models for the prediction of processor power consumption. Power predictability in this work is achieved in two ways: first, by using a design method to develop power-predictable circuits; second, by analysing the power of the functions in the code that repeat during execution, and then building the power model based on the average number of repetitions. In the first case, a design method called Asynchronous Charge Sharing Logic (ACSL) is used to implement the Arithmetic Logic Unit (ALU) of the 8051 microcontroller. ACSL circuits are power predictable because their power consumption is independent of the input data. Based on this property, a fast prediction method is presented that estimates the power of the ALU by analysing the software program and extracting the number of ALU-related instructions. This method achieves less than 1% error in power estimation and more than 100 times speedup in comparison to conventional simulation-based methods. In the second case, an average-case processor energy model is developed for the Insertion sort algorithm based on the number of comparisons that take place during execution of the algorithm. The average number of comparisons is calculated using a high-level methodology called MOdular Quantitative Analysis (MOQA). The parameters of the energy model are measured for the LEON3 processor core, but the model is general and can be used for any processor. The model has been validated through power measurement experiments, and offers high accuracy and orders-of-magnitude speedup over the simulation-based method.
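To make the first estimation approach concrete, the sketch below shows an instruction-count-based energy estimate in the spirit described above: because the per-operation energy of the ACSL ALU is data independent, ALU energy can be approximated by counting ALU-related instructions and scaling by per-instruction energy figures. The instruction categories and energy values here are hypothetical placeholders, not measured ACSL/8051 numbers.

    # Sketch of an instruction-count-based ALU energy estimate.
    # Per-instruction energies (nJ) are hypothetical placeholders standing in for
    # characterised, data-independent ACSL ALU figures.
    ENERGY_NJ = {"ADD": 0.12, "SUB": 0.12, "AND": 0.08, "OR": 0.08, "XOR": 0.08}

    def alu_energy_nj(instruction_counts: dict[str, int]) -> float:
        """Total ALU energy = sum over ALU instruction types of count * per-instruction energy."""
        return sum(ENERGY_NJ[op] * n for op, n in instruction_counts.items() if op in ENERGY_NJ)

    # Hypothetical instruction profile extracted from a program trace
    counts = {"ADD": 1500, "SUB": 400, "XOR": 250, "MOV": 3000}  # MOV is not treated as an ALU op
    print(f"Estimated ALU energy: {alu_energy_nj(counts):.1f} nJ")

The second approach in the abstract works analogously at a coarser grain: an average-case energy model for Insertion sort scales a per-comparison energy cost by the expected number of comparisons, which for random input grows on the order of n^2/4.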
Abstract:
Development of an internet-based spatial data delivery and reporting system for the Australian Cotton Industry.
Abstract:
In the wake of findings from the Bundaberg Hospital and Forster inquiries in Queensland, periodic public release of hospital performance reports has been recommended. A process for developing and releasing such reports is being established by Queensland Health, overseen by an independent expert panel. This recommendation presupposes that public reports based on routinely collected administrative data are accurate; that the public can access, correctly interpret and act upon report contents; that reports motivate hospital clinicians and managers to improve quality of care; and that there are no unintended adverse effects of public reporting. Available research suggests that primary data sources are often inaccurate and incomplete, that reports have low predictive value in detecting outlier hospitals, and that users experience difficulty in accessing and interpreting reports and tend to distrust their findings.
Abstract:
Reliance on police data for counting road crash injuries can be problematic: it is well known that not all road crash injuries are reported to police, which leads to under-estimation of the overall burden of road crash injuries. The aim of this study was to use multiple linked data sources to estimate the extent of under-reporting of road crash injuries to police in the Australian state of Queensland. Data from the Queensland Road Crash Database (QRCD), the Queensland Hospital Admitted Patients Data Collection (QHAPDC), the Emergency Department Information System (EDIS), and the Queensland Injury Surveillance Unit (QISU) for the year 2009 were linked. The completeness of road crash cases reported to police was examined via discordance rates between the police data (QRCD) and the hospital data collections. In addition, the potential bias of this discordance (under-reporting) was assessed by gender, age, road user group, and regional location. Results showed that the level of under-reporting varied depending on the data set with which the police data were compared. When all hospital data collections were examined together, the estimated population of road crash injuries was approximately 28,000, with around two-thirds not linking to any record in the police data. The results also showed that under-reporting was more likely for motorcyclists, cyclists, males, young people, and injuries occurring in Remote and Inner Regional areas. These results have important implications for road safety research and policy in terms of prioritising funding and resources, targeting road safety interventions into areas of higher risk, and estimating the burden of road crash injuries.
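To illustrate the discordance calculation described above, the sketch below joins a hospital-derived injury table to police crash records on a shared linkage key and reports the proportion of hospital cases with no police match, overall and by road user group. The column names, the linkage key, and the records are hypothetical; the study relied on formal multi-source record linkage rather than a simple key join.

    # Illustrative under-reporting (discordance) calculation for linked crash data.
    # 'link_id', the column names, and the records are hypothetical.
    import pandas as pd

    hospital = pd.DataFrame({
        "link_id": [1, 2, 3, 4, 5, 6],
        "road_user": ["motorcyclist", "cyclist", "driver", "cyclist", "driver", "pedestrian"],
    })
    police = pd.DataFrame({"link_id": [3, 5]})  # only these hospital cases appear in police data

    merged = hospital.merge(police, on="link_id", how="left", indicator=True)
    merged["unreported"] = merged["_merge"] == "left_only"

    print(f"Overall discordance (not in police data): {merged['unreported'].mean():.0%}")
    print(merged.groupby("road_user")["unreported"].mean())  # discordance by road user group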
Abstract:
Data have been collected on fisheries catch and effort trends since the latter half of the 1800s. With current trends in declining stocks and stricter management regimes, data need to be collected and analyzed over shorter periods and at finer spatial resolution than in the past. New methods of electronic reporting may reduce the lag time in data collection and provide more accurate spatial resolution. In this study I evaluated the differences between fish dealer and vessel reporting systems for federal fisheries in the US New England and Mid-Atlantic areas. Using data on landing date, report date, gear used, port landed, number of hauls, number of fish sampled, and species quotas from available catch and effort records, I compared electronically collected dealer and vessel data against paper-collected dealer and vessel data to determine whether electronically collected data are timelier and more accurate. To determine whether vessel or dealer electronic reporting is more useful for management, I determined the differences in timeliness and accuracy between vessel and dealer electronic reports. I also compared the cost and efficiency of these new methods with less technology-intensive reporting methods, using available cost data and surveys of seafood dealers for cost information. Using this information I identified potentially unnecessary duplication of effort and identified applications in ecosystem-based fisheries management. This information can be used to guide the decisions of fisheries managers in the United States and other countries that are attempting to identify appropriate fisheries reporting methods for the management regimes under consideration. (PDF contains 370 pages)
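As a minimal illustration of the timeliness comparison described above, the sketch below computes the reporting lag (report date minus landing date) and summarises it by reporting method. The column names, dates, and records are hypothetical and do not come from the dealer or vessel datasets used in the study.

    # Illustrative reporting-lag comparison between electronic and paper reports.
    # Dates, column names, and records are hypothetical.
    import pandas as pd

    reports = pd.DataFrame({
        "method": ["electronic", "electronic", "paper", "paper"],
        "landing_date": pd.to_datetime(["2010-05-01", "2010-05-03", "2010-05-01", "2010-05-03"]),
        "report_date": pd.to_datetime(["2010-05-02", "2010-05-04", "2010-05-15", "2010-05-20"]),
    })
    reports["lag_days"] = (reports["report_date"] - reports["landing_date"]).dt.days
    print(reports.groupby("method")["lag_days"].describe())  # lag distribution by reporting method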
Abstract:
A generalized Bayesian population dynamics model was developed for the analysis of historical mark-recapture studies. The Bayesian approach builds upon existing maximum likelihood methods and is useful when substantial uncertainties exist in the data or little information is available about auxiliary parameters such as tag loss and reporting rates. Movement rates are obtained through Markov chain Monte Carlo (MCMC) simulation and are suitable for use as input in subsequent stock assessment analysis. The mark-recapture model was applied to English sole (Parophrys vetulus) off the west coast of the United States and Canada, and migration rates were estimated to be 2% per month to the north and 4% per month to the south. These posterior parameter distributions and the Bayesian framework for comparing hypotheses can guide fishery scientists in structuring the spatial and temporal complexity of future analyses of this kind. This approach could easily be generalized for application to other species and more data-rich fishery analyses.
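For readers unfamiliar with the approach, the sketch below is a toy Metropolis MCMC sampler for a single monthly movement rate under a binomial recovery model with a known reporting rate. It is a didactic stand-in for the generalized model described above, not the authors' implementation; the tag counts, reporting rate, prior, and proposal settings are all assumptions.

    # Toy Metropolis sampler for a monthly movement rate p, assuming recoveries in the
    # "moved" area follow Binomial(n_tagged, p * reporting_rate). All values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(1)
    n_tagged, recovered_moved, reporting_rate = 500, 12, 0.6

    def log_posterior(p: float) -> float:
        if not 0.0 < p < 1.0:
            return -np.inf                      # uniform(0, 1) prior on the movement rate
        q = p * reporting_rate                  # probability a tag is recovered in the moved area
        # Binomial log-likelihood with the constant term dropped
        return recovered_moved * np.log(q) + (n_tagged - recovered_moved) * np.log(1.0 - q)

    samples, p = [], 0.05
    for _ in range(20000):
        proposal = p + rng.normal(0.0, 0.01)    # random-walk proposal
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(p):
            p = proposal
        samples.append(p)

    posterior = np.array(samples[5000:])        # discard burn-in
    print(f"Posterior mean movement rate: {posterior.mean():.3f} per month "
          f"(95% credible interval {np.quantile(posterior, 0.025):.3f}-{np.quantile(posterior, 0.975):.3f})")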
Abstract:
This study evaluated the effect of an online diet-tracking tool on college students’ self-efficacy regarding fruit and vegetable intake. A convenience sample of students completed online self-efficacy surveys before and after a six-week intervention in which they tracked dietary intake with an online tool. Group one (n=22 fall, n=43 spring) accessed a tracking tool without nutrition tips; group two (n=20 fall, n=33 spring) accessed the tool and weekly nutrition tips. The control group (n=36 fall, n=60 spring) had access to neither. Each semester there were significant changes in self-efficacy from pre- to post-test for men and for women when experimental groups were combined (p<0.05 for all); however, these changes were inconsistent. Qualitative data showed that participants responded well to the simplicity of the tool, the immediacy of feedback, and the customized database containing foods available on campus. Future models should improve user engagement by increasing convenience, potentially by automation.
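For illustration, the pre- to post-test comparison reported above is the kind of analysis a paired t-test supports; the sketch below uses hypothetical self-efficacy scores, not the study's data.

    # Paired t-test on hypothetical pre/post self-efficacy scores (not the study's data).
    from scipy import stats

    pre = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3, 3.6, 2.7]
    post = [3.4, 3.0, 3.6, 3.2, 3.1, 3.5, 3.5, 3.0]
    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < 0.05 would indicate a significant change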
Abstract:
The definitive paper by Stuiver and Polach (1977) established the conventions for reporting of 14C data for chronological and geophysical studies based on the radioactive decay of 14C in the sample since the year of sample death or formation. Several ways of reporting 14C activity levels relative to a standard were also established, but no specific instructions were given for reporting nuclear weapons testing (post-bomb) 14C levels in samples. Because the use of post-bomb 14C is becoming more prevalent in forensics, biology, and geosciences, a convention needs to be adopted. We advocate the use of fraction modern with a new symbol F14C to prevent confusion with the previously used Fm, which may or may not have been fractionation corrected. We also discuss the calibration of post-bomb 14C samples and the available datasets and compilations, but do not give a recommendation for a particular dataset.
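For orientation, the fraction modern advocated here follows the normalization conventions of Stuiver and Polach (1977): it is the ratio of the fractionation-normalized sample activity to the normalized standard activity,

    F^{14}\mathrm{C} \;=\; \frac{A_{SN}}{A_{ON}},
    \qquad
    t_{\text{conventional}} \;=\; -8033\,\ln\!\bigl(F^{14}\mathrm{C}\bigr)\ \text{yr BP},

where A_SN is the sample activity normalized to δ13C = -25‰, A_ON is the normalized activity of the oxalic acid standard, and the second relation (the conventional age, applicable to pre-bomb samples) is shown only for orientation.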
Abstract:
BACKGROUND: Quantifying sexually transmitted infection (STI) prevalence and incidence is important for planning interventions and advocating for resources. The World Health Organization (WHO) periodically estimates global and regional prevalence and incidence of four curable STIs: chlamydia, gonorrhoea, trichomoniasis and syphilis. METHODS AND FINDINGS: WHO's 2012 estimates were based upon literature reviews of prevalence data from 2005 through 2012 among general populations for genitourinary infection with chlamydia, gonorrhoea, and trichomoniasis, and nationally reported data on syphilis seroprevalence among antenatal care attendees. Data were standardized for laboratory test type, geography, age, and high-risk subpopulations, and combined using a Bayesian meta-analytic approach. Regional incidence estimates were generated from prevalence estimates by adjusting for the average duration of infection. In 2012, among women aged 15-49 years, the estimated global prevalence of chlamydia was 4.2% (95% uncertainty interval (UI): 3.7-4.7%), gonorrhoea 0.8% (0.6-1.0%), trichomoniasis 5.0% (4.0-6.4%), and syphilis 0.5% (0.4-0.6%); among men, estimated chlamydia prevalence was 2.7% (2.0-3.6%), gonorrhoea 0.6% (0.4-0.9%), trichomoniasis 0.6% (0.4-0.8%), and syphilis 0.48% (0.3-0.7%). These figures correspond to an estimated 131 million new cases of chlamydia (100-166 million), 78 million of gonorrhoea (53-110 million), 143 million of trichomoniasis (98-202 million), and 6 million of syphilis (4-8 million). Prevalence and incidence estimates varied by region and sex. CONCLUSIONS: Estimates of the global prevalence and incidence of chlamydia, gonorrhoea, trichomoniasis, and syphilis in adult women and men remain high, with nearly one million new infections with a curable STI each day. The estimates highlight the urgent need for the public health community to ensure that well-recognized effective interventions for STI prevention, screening, diagnosis, and treatment are made more widely available. Improved estimation methods are needed to allow the use of more varied data and the generation of estimates at the national level.
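The prevalence-to-incidence adjustment mentioned in the methods rests on the standard steady-state relation between prevalence, incidence, and mean duration of infection; a simplified statement (the published estimates use infection-specific durations that account for treatment and natural clearance) is

    \text{incidence rate} \;\approx\; \frac{P}{\bar{D}},
    \qquad
    \text{new cases per year} \;\approx\; \frac{P \times N}{\bar{D}},

where P is the prevalence (proportion currently infected), \bar{D} the mean duration of infection in years, and N the population at risk. For example, under an assumed mean duration of six months, a prevalence of 4.2% corresponds to an annual incidence of roughly 8.4 new infections per 100 people at risk; the duration here is illustrative, not the value used by WHO.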