205 results for statistical reports
Abstract:
A probabilistic method is proposed to evaluate the voltage quality of grid-connected photovoltaic (PV) power systems. The random behavior of solar irradiation is described in statistical terms, and the resulting probability distribution of voltage fluctuations is then derived. The reactive power capabilities of the PV generators are then analyzed, and their operation under constant power factor mode is examined. It is shown that fully utilizing the reactive power capability of the PV generators can greatly enhance network voltage quality.
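The statistical chain from random irradiation to a voltage-fluctuation distribution can be sketched with a small Monte Carlo simulation. This is an illustrative toy model only: the linear voltage-sensitivity coefficient and the Beta-distributed irradiance are assumptions for the sketch, not the paper's actual derivation.

```python
import random
import statistics

def voltage_rise(irradiance_kw_m2, sensitivity=0.02):
    """Toy linear model: PV injection raises the bus voltage (in per-unit)
    in proportion to irradiance. The coefficient is illustrative."""
    return sensitivity * irradiance_kw_m2

def simulate_voltage_distribution(n=10_000, seed=42):
    """Sample random irradiance and summarize the voltage-rise distribution."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Beta-shaped irradiance in [0, 1] kW/m^2 (a common statistical choice)
        irr = rng.betavariate(2.0, 2.0)
        samples.append(voltage_rise(irr))
    return statistics.mean(samples), statistics.pstdev(samples)

mean_dv, std_dv = simulate_voltage_distribution()
print(f"mean voltage rise {mean_dv:.4f} p.u., std {std_dv:.4f} p.u.")
```

In a real study the derived distribution would be checked against network voltage limits to obtain the probability of a quality violation.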
Abstract:
Mandatory reporting laws have been created in many jurisdictions as a way of identifying cases of severe child maltreatment, on the basis that cases will otherwise remain hidden. These laws usually apply to all four maltreatment types. Other jurisdictions have narrower approaches supplemented by differential response systems, and others still have chosen not to enact mandatory reporting laws for any type of maltreatment. In scholarly research and normative debates about mandatory reporting laws and their effects, the four major forms of child maltreatment—physical abuse, sexual abuse, emotional abuse, and neglect—are often grouped together as if they are homogenous in nature, cause, and consequence. Yet, the heterogeneity of maltreatment types, and the different reporting practices regarding them, must be acknowledged and explored when considering what legal and policy frameworks are best suited to identify and respond to cases. A related question, often conjectured upon but seldom empirically explored, is whether reporting laws make a difference in case identification. This article first considers different types of child abuse and neglect, before exploring the nature and operation of mandatory reporting laws in different contexts. It then posits a differentiation thesis, arguing that different patterns of reporting between both reporter groups and maltreatment types must be acknowledged and analysed, and should inform discussions and assessments of optimal approaches in law, policy and practice. Finally, to contribute to the evidence base required to inform discussion, this article conducts an empirical cross-jurisdictional comparison of the reporting and identification of child sexual abuse in jurisdictions with and without mandatory reporting, and concludes that mandatory reporting laws appear to be associated with better case identification.
Abstract:
A cohort of 59 persons with industrial handling of low levels of acrylonitrile is being studied as part of a medical surveillance programme. Previously, extended haemoglobin adduct monitoring (N-(cyanoethyl)valine and N-(hydroxyethyl)valine) was performed with regard to the glutathione transferase hGSTM1 and hGSTT1 polymorphisms, but no influence of hGSTM1 or hGSTT1 polymorphisms on specific adduct levels was found. A compilation of case reports of human accidental poisonings had pointed to significant individual differences in human acrylonitrile metabolism and toxicity. Therefore, a re-evaluation of the industrial cohort included known polymorphisms of the glutathione transferases hGSTM3 and hGSTP1 as well as of the cytochrome P450 CYP2E1. A detailed statistical analysis revealed that exposed carriers of the allelic variants of hGSTP1, hGSTP1*B/hGSTP1*C, characterized by a single nucleotide polymorphism at nucleotide 313 which results in a change from Ile to Val at codon 104, had higher levels of the acrylonitrile-specific haemoglobin adduct N-(cyanoethyl)valine compared to carriers of the codon 113 alleles hGSTP1*A and hGSTP1*D. The single nucleotide polymorphism at codon 113 of hGSTP1 (hGSTP1*A/hGSTP1*B versus hGSTP1*C/hGSTP1*D) did not show an effect, and no influence on specific haemoglobin adduct levels was seen for the polymorphisms of hGSTM3 or CYP2E1. The data therefore point to a possible influence of a human enzyme polymorphism of the GSTP1 gene at codon 104 on the detoxication of acrylonitrile, which calls for experimental toxicological investigation. The study also confirmed the impact of the GSTT1 polymorphism on background N-(hydroxyethyl)valine adduct levels in haemoglobin, which are caused by endogenous ethylene oxide.
Abstract:
The melting temperature of a nanoscaled particle is known to decrease as the curvature of the solid-melt interface increases. This relationship is most often modelled by a Gibbs-Thomson law, with the decrease in melting temperature proposed to be a product of the curvature of the solid-melt interface and the surface tension. Such a law must break down for sufficiently small particles, since the curvature becomes singular in the limit that the particle radius vanishes. Furthermore, the use of this law as a boundary condition for a Stefan-type continuum model is problematic because it leads to a physically unrealistic form of mathematical blow-up at a finite particle radius. By numerical simulation, we show that the inclusion of nonequilibrium interface kinetics in the Gibbs-Thomson law regularises the continuum model, so that the mathematical blow-up is suppressed. As a result, the solution continues until complete melting, and the corresponding melting temperature remains finite for all time. The results of the adjusted model are consistent with experimental findings of abrupt melting of nanoscaled particles. This small-particle regime appears to be closely related to the problem of melting a superheated particle.
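The curvature-driven depression described above is commonly written as T_m(r) = T_bulk (1 - 2σ/(ρ L r)), which falls without bound as the radius r shrinks. A minimal numerical sketch of that breakdown follows; the material constants are rough, illustrative values for gold and are not taken from the paper.

```python
def melting_temperature(radius_nm,
                        t_bulk_k=1337.0,      # bulk melting point of Au, approx. (K)
                        surface_tension=1.0,  # sigma, J/m^2 (illustrative)
                        latent_heat=6.3e4,    # L, J/kg (illustrative)
                        density=1.93e4):      # rho, kg/m^3 (Au)
    """Gibbs-Thomson estimate: the depression is proportional to the
    interface curvature (2/r) times the surface tension. The prediction
    diverges as r -> 0, which is the breakdown discussed in the abstract."""
    r = radius_nm * 1e-9  # convert nm to m
    return t_bulk_k * (1.0 - 2.0 * surface_tension / (latent_heat * density * r))

for r_nm in (100.0, 10.0, 5.0, 1.0):
    print(f"r = {r_nm:6.1f} nm -> T_m ~ {melting_temperature(r_nm):8.1f} K")
```

With these constants the predicted melting temperature even turns negative below roughly 1.6 nm, illustrating why the unregularised law cannot hold for very small particles.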
Abstract:
This paper provides a preliminary summary of audit reports for Australian listed public companies for the period 2005 to 2013, focusing on auditor reporting in the most recent period, 2011 to 2013. Prior research has shown that audit reports modified for uncertainty relating to the going concern assumption increased following the shock of the Global Financial Crisis (GFC) in late 2007. This occurred in Australia from 2008: Xu et al. (2011) find that reports modified for going concern uncertainty increased from 12% in 2005-2007 to 18% in 2008 and 22% in 2009. Similar trends are observable for the United States, where such reports increased from 14% in 2003 to 21% in 2008 (Cheffers et al. 2010; Geiger et al. 2014). The aim of this report is to examine the frequency of the various types of audit reports issued in Australia during the period 2011 to 2013, with a focus on reports emphasizing significant uncertainty regarding the going concern assumption.
Abstract:
This report presents the final deliverable from the project titled 'Conceptual and statistical framework for a water quality component of an integrated report card', funded by the Marine and Tropical Sciences Research Facility (MTSRF; Project 3.7.7). The key management driver of this, and a number of other MTSRF projects concerned with indicator development, is the requirement for state and federal government authorities and other stakeholders to provide robust assessments of the present 'state' or 'health' of regional ecosystems in the Great Barrier Reef (GBR) catchments and adjacent marine waters. An integrated report card format that encompasses both biophysical and socioeconomic factors is an appropriate framework through which to deliver these assessments and meet a variety of reporting requirements. It is now well recognised that a 'report card' format for environmental reporting is very effective for community and stakeholder communication and engagement, and can be a key driver in galvanising community and political commitment and action. Although a report card needs to be understandable by all levels of the community, it also needs to be underpinned by sound, quality-assured science. In this regard, this project was to develop approaches to address the statistical issues that arise from the amalgamation or integration of sets of discrete indicators into a final score or assessment of the state of the system. In brief, the two main issues are (1) selecting, measuring and interpreting specific indicators that vary both in space and time, and (2) integrating a range of indicators in such a way as to provide a succinct but robust overview of the state of the system. Although there is considerable research on, and knowledge of, the use of indicators to inform the management of ecological, social and economic systems, methods for how best to integrate multiple disparate indicators remain poorly developed.
Therefore, the objective of this project was to (i) focus on statistical approaches aimed at ensuring that estimates of individual indicators are as robust as possible, and (ii) present methods that can be used to report on the overall state of the system by integrating estimates of individual indicators. It was agreed at the outset that this project would focus on developing methods for a water quality report card. This was driven largely by the requirements of the Reef Water Quality Protection Plan (RWQPP) and led to strong partner engagement with the Reef Water Quality Partnership.
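The second issue identified above, integrating discrete indicators into a single report-card assessment, is often implemented as a weighted aggregation followed by banding into letter grades. A minimal sketch of that general idea follows; the indicator names, weights and grade cutoffs are invented for illustration and are not the project's actual method.

```python
def aggregate_score(indicator_scores, weights=None):
    """Weighted mean of indicator scores, each pre-scaled to 0-100."""
    if weights is None:
        weights = [1.0] * len(indicator_scores)
    total_w = sum(weights)
    return sum(s * w for s, w in zip(indicator_scores, weights)) / total_w

def letter_grade(score):
    """Map an aggregate 0-100 score to a report-card band."""
    bands = [(85, "A"), (70, "B"), (55, "C"), (40, "D")]
    for cutoff, grade in bands:
        if score >= cutoff:
            return grade
    return "E"

# e.g. water clarity, chlorophyll-a, nutrients -- values are illustrative
scores = [82, 64, 51]
overall = aggregate_score(scores, weights=[2, 1, 1])
print(f"overall score {overall:.2f} -> grade {letter_grade(overall)}")
```

The statistical subtlety the project addresses sits behind this simple arithmetic: each input score is itself an estimate with spatial and temporal uncertainty, which should be propagated into the final grade.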
Abstract:
Anthropogenic elemental mercury (Hg0) emission is a serious worldwide environmental problem due to the extreme toxicity of the heavy metal to humans, plants and wildlife. Development of an accurate and cheap microsensor-based online monitoring system that can be integrated as part of Hg0 removal and control processes in industry is still a major challenge. Here, we demonstrate that forming Au nanospike structures directly onto the electrodes of a quartz crystal microbalance (QCM) using a novel electrochemical route results in a self-regenerating, highly robust, stable, sensitive and selective Hg0 vapor sensor. The data from a 127-day continuous test performed in the presence of volatile organic compounds and high humidity levels showed that the sensor with an electrodeposited sensitive layer had a 260% higher response magnitude, a 3.4 times lower detection limit (~22 μg/m3 or ~2.46 ppbv) and higher accuracy (98% vs 35%) over an unmodified Au control based QCM when exposed to a Hg0 vapor concentration of 10.55 mg/m3 at 101°C. Statistical analysis of the long-term data showed that the nano-engineered Hg0 sorption sites on the developed Au nanospike sensitive layer play a critical role in the enhanced sensitivity and selectivity of the developed sensor towards Hg0 vapor.
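The detection limit above is quoted both as a mass concentration and in ppbv; the two are related through the ideal-gas molar volume. A quick sketch of the conversion for mercury vapor (0 °C and 1 atm reference conditions assumed, which is one common convention; other reference temperatures give slightly different numbers):

```python
def ppbv_to_ug_per_m3(ppbv, molar_mass_g_mol, temp_c=0.0, pressure_kpa=101.325):
    """Convert a ppbv gas concentration to ug/m^3 via the ideal gas law.
    Molar volume Vm = R*T/P in L/mol; 1 ppbv contributes M/Vm ug/m^3."""
    r = 8.314  # J/(mol*K)
    molar_volume_l = r * (temp_c + 273.15) / pressure_kpa  # ~22.4 L/mol at 0 C
    return ppbv * molar_mass_g_mol / molar_volume_l

# Hg molar mass is about 200.59 g/mol
hg = ppbv_to_ug_per_m3(2.46, 200.59)
print(f"2.46 ppbv of Hg ~ {hg:.1f} ug/m^3 at 0 C")
```

Because mercury is heavy (about 200.59 g/mol), a few ppbv already corresponds to tens of micrograms per cubic metre.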
Abstract:
This thesis proposes three novel models that extend the statistical methodology for motor unit number estimation, a clinical neurology technique. Motor unit number estimation is important in the treatment of degenerative muscular diseases and, potentially, spinal injury. Additionally, a recent and largely untested statistic for statistical model choice is found to be a practical alternative for larger datasets. The existing methods for dose finding in dual-agent clinical trials are found to be suitable only for designs of modest dimensions. The model-choice case study is the first of its kind and contains interesting results using so-called unit-information prior distributions.
Abstract:
There is a wide range of potential study designs for intervention studies to decrease nosocomial infections in hospitals. The analysis is complex due to competing events, clustering, multiple timescales and time-dependent period and intervention variables. This review considers the popular pre-post quasi-experimental design and compares it with randomized designs. Randomization can be done in several ways: randomization of the cluster [intensive care unit (ICU) or hospital] in a parallel design; randomization of the sequence in a cross-over design; and randomization of the time of intervention in a stepped-wedge design. We introduce each design in the context of nosocomial infections and discuss the designs with respect to the following key points: bias, control for non-intervention factors, and generalizability. Statistical issues are discussed. A pre-post intervention design is often the only choice that will be informative for a retrospective analysis of an outbreak setting. It can be seen as a pilot study, with further, more rigorous designs needed to establish causality. To yield internally valid results, randomization is needed. Generally, the first choice in terms of internal validity should be a parallel cluster randomized trial. However, generalizability might be stronger in a stepped-wedge design because a wider range of ICU clinicians may be convinced to participate, especially if there are pilot studies with promising results. For analysis, the use of extended competing risk models is recommended.
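Of the randomized designs listed above, the stepped-wedge design randomizes the time at which each cluster crosses from control to intervention. A minimal sketch of generating such a schedule follows; the cluster names and the one-cluster-per-period crossover step are illustrative assumptions, not a prescription from the review.

```python
import random

def stepped_wedge_schedule(clusters, n_periods, seed=0):
    """Randomize the order in which clusters cross from control (0) to
    intervention (1); here one cluster crosses over per period."""
    rng = random.Random(seed)
    order = list(clusters)
    rng.shuffle(order)  # the randomization: order of crossover times
    schedule = {}
    for step, cluster in enumerate(order, start=1):
        # control for `step` periods, then intervention for the rest
        schedule[cluster] = [0] * step + [1] * (n_periods - step)
    return schedule

sched = stepped_wedge_schedule(["ICU-A", "ICU-B", "ICU-C"], n_periods=4)
for cluster, periods in sched.items():
    print(cluster, periods)
```

Note the defining feature: by the final period every cluster receives the intervention, which is part of why clinicians may find this design easier to accept than a parallel trial.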
Abstract:
Yield in cultivated cotton (Gossypium spp.) is affected by the number and distribution of fibres initiated on the seed surface but, apart from simple statistical summaries, little has been done to assess this phenotype quantitatively. Here we use two types of spatial statistics to describe and quantify differences in patterning of cotton ovule fibre initials (FIs). The following five species of Gossypium were analysed: G. hirsutum L., G. barbadense L., G. arboreum, G. raimondii Ulbrich. and G. trilobum (DC.) Skovsted. Scanning electron micrographs of FIs were taken on the day of anthesis. Cell centres for fibre and epidermal cells were digitised and analysed by spatial statistics methods appropriate for marked point processes and tessellations. Results were consistent with previously published reports of fibre number and spacing. However, it was shown that the spatial distributions of FIs in all of the species examined exhibit regularity, and are not completely random as previously implied. The regular arrangement indicates FIs do not appear independently of each other, and we surmise there may be some form of mutual inhibition specifying fibre-initial development. It is concluded that genetic control of FIs differs from that of stomata, another well-studied plant idioblast. Since spatial statistics show clear species differences in the distribution of FIs within this genus, they provide a useful method for phenotyping cotton. © CSIRO 2007.
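One standard spatial statistic for distinguishing regular from random or clustered point patterns, in the spirit of the analysis above, is the Clark-Evans nearest-neighbour ratio. The sketch below is a generic illustration, not the authors' exact method, and edge corrections are ignored for simplicity.

```python
import math

def clark_evans_index(points, area):
    """Clark-Evans ratio R = observed mean nearest-neighbour distance
    divided by the expectation under complete spatial randomness,
    0.5 / sqrt(density). R > 1 suggests regularity, R < 1 clustering."""
    n = len(points)
    nn_distances = []
    for i, (xi, yi) in enumerate(points):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(points) if j != i)
        nn_distances.append(d)
    observed = sum(nn_distances) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# A perfectly regular 5x5 grid on the unit square gives R well above 1
grid = [(x / 4, y / 4) for x in range(5) for y in range(5)]
r = clark_evans_index(grid, area=1.0)
print(f"Clark-Evans R for a regular grid: {r:.2f}")
```

Applied to digitised fibre-initial centres, a ratio significantly above 1 would support the regularity, and hence the mutual-inhibition hypothesis, described in the abstract.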