915 results for Data reporting
Abstract:
This report presents the results of a study exploring the law and practice of mandatory reporting of child abuse and neglect in Western Australia. Government administrative data over a decade (2003-2012) were accessed and analysed to map trends in reporting of different types of child abuse and neglect (physical abuse, sexual abuse, emotional abuse, and neglect) by different reporter groups (e.g., police, teachers, doctors, nurses, family members, neighbours), and the outcomes of these reports (whether investigated, and whether substantiated or not). The study was funded by the Australian Government and administered through the Government of Victoria.
Abstract:
Child sexual abuse is widespread and difficult to detect. To enhance case identification, many societies have enacted mandatory reporting laws requiring designated professionals, most often police, teachers, doctors and nurses, to report suspected cases to government child welfare agencies. Little research has explored the effects of introducing a reporting law on the number of reports made, and the outcomes of those reports. This study explored the impact of a new legislative mandatory reporting duty for child sexual abuse in the State of Western Australia over seven years. We analysed data about the numbers and outcomes of reports by mandated reporters for periods before the law (2006-08) and after the law (2009-12). Results indicate that the number of reports by mandated reporters of suspected child sexual abuse increased by a factor of 3.7, from an annual mean of 662 in the three-year pre-law period to 2448 in the four-year post-law period. The increase in the first two post-law years was contextually and statistically significant. Report numbers stabilised in 2010-12, at one report per 210 children. The number of investigated reports increased threefold, from an annual mean of 451 in the pre-law period to 1363 in the post-law period. A significant decline in the proportion of mandated reports that were investigated in the first two post-law years suggested that the new level of reporting and investigative need exceeded what was anticipated. However, a subsequent significant increase restored the pre-law proportion, suggesting systemic adaptive capacity. The number of substantiated investigations doubled, from an annual mean of 160 in the pre-law period to 327 in the post-law period, indicating twice as many sexually abused children were being identified.
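The fold-changes reported above can be checked directly from the annual means the abstract gives (a minimal sketch; all figures are taken from the abstract itself):

```python
# Annual mean counts from the Western Australia study, pre-law (2006-08)
# versus post-law (2009-12), as stated in the abstract.
pre_reports, post_reports = 662, 2448
pre_investigated, post_investigated = 451, 1363
pre_substantiated, post_substantiated = 160, 327

print(round(post_reports / pre_reports, 1))            # 3.7 (factor of 3.7)
print(round(post_investigated / pre_investigated, 1))  # 3.0 (threefold)
print(round(post_substantiated / pre_substantiated, 1))  # 2.0 (doubled)
```

Each ratio matches the verbal description in the abstract ("a factor of 3.7", "threefold", "doubled").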
Abstract:
Corporate Social Responsibility (CSR) has become an increasingly important topic in the forest industries, and in other global companies, in recent years. Globalisation, faster information delivery and the demand for sustainable development have set new challenges for global companies in their business operations. The importance of stakeholder relations, and the pressure to become more transparent, have also increased in the forest industries. Three dimensions of corporate responsibility (economic, environmental and social) are often included in the concept of CSR. Global companies mostly claim that these dimensions are equally important. This study analyses CSR in the forest industry, focusing on the reporting and implementation of social responsibility in three international companies. The case-companies are Stora Enso, SCA and Sappi; they have different geographical bases and product portfolios and therefore present interesting differences in forest industry strategy and CSR. The Global Reporting Initiative (GRI) has created the best-known and most widely used framework for CSR reporting. The GRI Guidelines have made CSR reporting a uniform function that can be compared between companies and across sectors. The GRI Guidelines have also made it possible to record and control CSR data within companies. In recent years the use of the GRI Guidelines has increased substantially. Typically, CSR reporting on economic and environmental responsibility has been systematic in global companies and often driven by legislation and other regulations. Social responsibility, however, has been less regulated and more difficult to compare, and has therefore often received less focus in the CSR reporting of global companies. The implementation and use of the GRI Guidelines have also increased dialogue on social responsibility issues and stakeholder management in global companies. This study analyses the use of the GRI framework in the CSR reporting of forest industry companies.
This is a qualitative study, and the disclosed data are empirically analysed using content analysis. Content analysis was selected as the method for this study because it makes it possible to use different sources of information. The data consist of existing academic literature on CSR, the sustainability reports of the case-companies during 2005-2009, and semi-structured interviews with company representatives. Different sources provide the possibility to look at a specific subject from more than one viewpoint. The results of the study show that all case-companies have relatively common themes in their CSR disclosure, and the differences arise mainly from their product portfolios and geographic bases. Social impacts on local communities in the companies' CSR were dominated mainly by issues concerning creating wealth for society and affecting communities through the creation of work. The comparability of CSR reporting, and especially of the social indicators, increased significantly from 2007 onwards in all case-companies. Even though the companies claim that the three dimensions of CSR (economic, environmental and social) are equally important, economic issues and profit improvement still seem to drive most operations in the global companies. Many issues that are covered by laws and regulations are still essentially presented as social responsibility in CSR. Unwelcome issues for companies, such as closing operations, are often covered only briefly and without adequate explanation. Making social responsibility equally important in CSR would demand more emphasis from all the case-companies, especially on the detail and extensiveness of the social responsibility content.
Abstract:
This study discusses the scope of historical earthquake analysis in low-seismicity regions. Examples of non-damaging earthquake reports are given from the Eastern Baltic (Fennoscandian) Shield in north-eastern Europe from the 16th to the 19th centuries. The information available for past earthquakes in the region is typically sparse and cannot be increased through a careful search of the archives. This study applies recommended rigorous methodologies of historical seismology developed using ample data to the sparse reports from the Eastern Baltic Shield. Attention is paid to the context of reporting, the identity and role of the authors, the circumstances of the reporting, and the opportunity to verify the available information by collating the sources. We evaluate the reliability of oral earthquake recollections and develop criteria for cases when a historical earthquake is attested to by a single source. We propose parametric earthquake scenarios as a way to deal with sparse macroseismic reports and as an improvement to existing databases.
Abstract:
Sea level rise (SLR) assessments are commonly used to identify the extent to which coastal populations are at risk of flooding. However, the data and assumptions used to develop these assessments contain numerous sources and types of uncertainty, which limit confidence in the accuracy of modeled results. This study illustrates how the intersection of uncertainty in digital elevation models (DEMs) and SLR leads to a wide range of modeled outcomes. SLR assessments are then reviewed to identify the extent to which uncertainty is documented in peer-reviewed articles. The paper concludes by discussing priorities needed to further understand SLR impacts.
Abstract:
A generalized Bayesian population dynamics model was developed for analysis of historical mark-recapture studies. The Bayesian approach builds upon existing maximum likelihood methods and is useful when substantial uncertainties exist in the data or little information is available about auxiliary parameters such as tag loss and reporting rates. Movement rates are obtained through Markov chain Monte Carlo (MCMC) simulation and are suitable for use as input in subsequent stock assessment analysis. The mark-recapture model was applied to English sole (Parophrys vetulus) off the west coast of the United States and Canada, and migration rates were estimated to be 2% per month to the north and 4% per month to the south. These posterior parameter distributions and the Bayesian framework for comparing hypotheses can guide fishery scientists in structuring the spatial and temporal complexity of future analyses of this kind. This approach could be easily generalized for application to other species and more data-rich fishery analyses.
Abstract:
Tagging experiments are a useful tool in fisheries for estimating mortality rates and abundance of fish. Unfortunately, nonreporting of recovered tags is a common problem in commercial fisheries which, if unaccounted for, can render these estimates meaningless. Observers are often employed to monitor a portion of the catches as a means of estimating reporting rates. In our study, observer data were incorporated into an integrated model for multiyear tagging and catch data to provide joint estimates of mortality rates (natural and fishing), abundance, and reporting rates. Simulations were used to explore model performance under a range of scenarios (e.g., different parameter values, parameter constraints, and numbers of release and recapture years). Overall, results indicated that all parameters can be estimated with reasonable accuracy, but that fishing mortality, reporting rates, and abundance can be estimated with much higher precision than natural mortality. An example of how the model can be applied to provide guidance on experimental design for a large-scale tagging study is presented. Such guidance can contribute to the successful and cost-effective management of tagging programs for commercial fisheries.
Abstract:
The National Marine Fisheries Service is required by law to conduct social impact assessments of communities affected by fishery management plans. To facilitate this process, we developed a technique for grouping communities based on common sociocultural attributes. Multivariate data reduction techniques (e.g. principal component analysis, cluster analysis) were used to classify Northeast U.S. fishing communities based on census and fisheries data. We then selected communities representative of different values on these multivariate dimensions for in-depth analysis, and the derived clusters were compared based on more detailed data from fishing community profiles. The comparisons indicate that the clusters represent real groupings that can be verified with the profiles. Ground-truthing (e.g. visiting the communities and collecting primary information) a sample of communities from three clusters (two overlapping geographically) indicates that the more remote techniques are sufficient for typing the communities for further in-depth analyses. The in-depth analyses provide additional important information which we contend is representative of all communities within each cluster.
Abstract:
A total of 1784 legal-size (≥356 mm TL) hatchery-produced red drum (Sciaenops ocellatus) were tagged and released to estimate tag-reporting levels of recreational anglers in South Carolina (SC) and Georgia (GA). Twelve groups of legal-size fish (~150 fish/group) were released. Half of the fish of each group were tagged with an external tag with the message “reward” and the other half of the fish were implanted with tags with the message “$100 reward.” These fish were released into two estuaries in each state (n=4); three replicate groups were released at different sites within each estuary (n=12). From results obtained in previous tag return experiments conducted by wildlife and fisheries biologists, it was hypothesized that reporting would be maximized at a reward level of $100/tag. Reporting level for the “reward” tags was estimated by dividing the number of “reward” tags returned by the number of “$100 reward” tags returned. The cumulative return level for both tag messages was 22.7 (±1.9)% in SC and 25.8 (±4.1)% in GA. These return levels were typical of those recorded by other red drum tagging programs in the region. Return data were partitioned according to verbal survey information obtained from anglers who reported tagged fish. Based on this partitioned data set, 14.3 (±2.1)% of “reward” tags were returned in SC, and 25.5 (±2.3)% of “$100 reward” tags were returned. This finding indicates that only 56.7% of the fish captured with “reward” tags were reported in SC. The pattern was similar for GA where 19.1 (±10.6)% of “reward” message tags were returned as compared with 30.1 (±15.6)% for “$100 reward” message tags. This difference yielded a reporting level of 63% for “reward” tags in GA. Currently, 50% is used as the estimate for the angler reporting level in population models for red drum and a number of other coastal finfish species in the South Atlantic region of the United States. 
Based on results of our study, the commonly used reporting estimate may result in an overestimate of angler exploitation for red drum.
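The ratio estimator described in the abstract (standard "reward" returns divided by "$100 reward" returns, assuming the high-reward tags are reported at or near 100%) can be sketched as follows. This is a minimal illustration using the partitioned return percentages quoted above; small differences from the published 56.7% and 63% figures come from rounding of those percentages:

```python
def reporting_level(standard_pct, high_reward_pct):
    """Estimate the angler reporting level for standard 'reward' tags,
    assuming '$100 reward' tags are (nearly) always reported.
    Both arguments are percentages of released tags that were returned."""
    return standard_pct / high_reward_pct

# Partitioned return levels from the abstract
sc = reporting_level(14.3, 25.5)  # South Carolina
ga = reporting_level(19.1, 30.1)  # Georgia
print(f"SC: {sc:.0%}, GA: {ga:.0%}")  # SC: 56%, GA: 63%
```

Both values sit above the 50% reporting level currently assumed in regional population models, which is why the abstract cautions that the 50% assumption may overestimate angler exploitation.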
Abstract:
Wireless sensor networks are characterized by limited energy resources. To conserve energy, application-specific aggregation (fusion) of data reports from multiple sensors can be beneficial in reducing the amount of data flowing over the network. Furthermore, controlling the topology by scheduling the activity of nodes between active and sleep modes has often been used to uniformly distribute the energy consumption among all nodes by de-synchronizing their activities. We present an integrated analytical model to study the joint performance of in-network aggregation and topology control. We define performance metrics that capture the tradeoffs among delay, energy, and fidelity of the aggregation. Our results indicate that to achieve high fidelity levels under medium to high event reporting load, shorter and fatter aggregation/routing trees (toward the sink) offer the best delay-energy tradeoff as long as topology control is well coordinated with routing.
Abstract:
BACKGROUND: Monogamy, together with abstinence, partner reduction, and condom use, is widely advocated as a key behavioral strategy to prevent HIV infection in sub-Saharan Africa. We examined the association between the number of sexual partners and the risk of HIV seropositivity among men and women presenting for HIV voluntary counseling and testing (VCT) in northern Tanzania. METHODOLOGY/PRINCIPAL FINDINGS: Clients presenting for HIV VCT at a community-based AIDS service organization in Moshi, Tanzania were surveyed between November 2003 and December 2007. Data on sociodemographic characteristics, reasons for testing, sexual behaviors, and symptoms were collected. Men and women were categorized by number of lifetime sexual partners, and rates of seropositivity were reported by category. Factors associated with HIV seropositivity among monogamous males and females were identified by a multivariate logistic regression model. Of 6,549 clients, 3,607 (55%) were female, and the median age was 30 years (IQR 24-40). In all, 939 (25%) females and 293 (10%) males (p<0.0001) were HIV seropositive. Among 1,244 (34%) monogamous females and 423 (14%) monogamous males, the risk of HIV infection was 19% and 4%, respectively (p<0.0001). The risk increased monotonically with additional partners, up to 45% (p<0.001) and 15% (p<0.001) for women and men, respectively, with 5 or more partners. In multivariate analysis, HIV seropositivity among monogamous women was most strongly associated with age (p<0.0001), lower education (p<0.004), and reporting a partner with other partners (p = 0.015). Only age was a significant risk factor for monogamous men (p = 0.0004). INTERPRETATION: Among women presenting for VCT, the number of partners is strongly associated with rates of seropositivity; however, even women reporting lifetime monogamy have a high risk for HIV infection.
Partner reduction should be coupled with efforts to place tools in the hands of sexually active women to reduce their risk of contracting HIV.
Abstract:
BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years, with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant.
CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
Abstract:
This study evaluated the effect of an online diet-tracking tool on college students’ self-efficacy regarding fruit and vegetable intake. A convenience sample of students completed online self-efficacy surveys before and after a six-week intervention in which they tracked dietary intake with an online tool. Group one (n=22 fall, n=43 spring) accessed a tracking tool without nutrition tips; group two (n=20 fall, n=33 spring) accessed the tool and weekly nutrition tips. The control group (n=36 fall, n=60 spring) had access to neither. Each semester there were significant changes in self-efficacy from pre- to post-test for men and for women when experimental groups were combined (p<0.05 for all); however, these changes were inconsistent. Qualitative data showed that participants responded well to the simplicity of the tool, the immediacy of feedback, and the customized database containing foods available on campus. Future models should improve user engagement by increasing convenience, potentially by automation.
Abstract:
This paper examines the routine practice of Approved Social Workers (ASWs) in adult mental health services in Northern Ireland. It begins with a review of existing literature on the ASW role before describing how a retrospective audit, using a mixed methods approach, was used to collect data on eighty-four assessments carried out to determine whether compulsory admission to hospital was needed. Respondents were also asked to consider how such assessments might be affected by proposed changes to the law in this field. The key findings highlighted a number of areas of practice that may be improved. There were inconsistencies in how the assessments were recorded and an uneven distribution of workloads across ASWs. Some problems were identified with interagency working and, in a quarter of the assessments, the ASW reported having felt afraid or at risk. The authors make a number of recommendations, which include: the use of a standard reporting procedure; that organisations should consider how to deliver a more even distribution of ASW workload; that protocols should be developed that ensure that ASWs are not left alone in potentially risky situations; and that joint assessments with General Practitioners should be required, rather than just recommended.