964 results for statistical framework


Relevance: 20.00%

Abstract:

This paper examines the need for a framework for social enterprises to measure and report on social performance. Reviewing social reporting practice and concepts central to financial reporting, the paper presents a framework for social performance reporting in the context of social enterprises. A Statement of Social Performance is developed through consideration of social reporting approaches, influences and issues in third-sector and private-sector organisations. The Statement is then applied in the context of an employment and training social enterprise, demonstrating its application in practice.

Relevance: 20.00%

Abstract:

Targets for improvements in the quality of water entering the Great Barrier Reef (GBR) have been set through the Reef Water Quality Protection Plan (Reef Plan). To measure and report on progress towards these targets, a program has been established that combines monitoring and modelling from paddock through to catchment and reef scales: the Paddock to Reef Integrated Monitoring, Modelling and Reporting Program (Paddock to Reef Program). The program aims to provide evidence of the links between land management activities, water quality and reef health. Five lines of evidence are used: the effectiveness of management practices in improving water quality; the prevalence of management practice adoption and change in catchment indicators; long-term monitoring of catchment water quality; paddock and catchment modelling to provide a relative assessment of progress towards the targets; and marine monitoring of GBR water quality and reef ecosystem health. This paper outlines the first four lines of evidence.

Relevance: 20.00%

Abstract:

We propose a conceptual model based on person–environment interaction, job performance, and motivational theories to structure a multilevel review of the employee green behavior (EGB) literature and agenda for future research. We differentiate between required EGB prescribed by the organization and voluntary EGB performed at the employees’ discretion. The review investigates institutional-, organizational-, leader-, team-, and employee-level antecedents and outcomes of EGB and factors that mediate and moderate these relationships. We offer suggestions to facilitate the development of the field, and call for future research to adopt a multilevel perspective and to investigate the outcomes of EGB.

Relevance: 20.00%

Abstract:

Tillering determines the plant size of sorghum (Sorghum bicolor), and an understanding of its regulation is important for matching genotypes to the prevalent growing conditions in target production environments. The aim of this study was to determine the physiological and environmental regulation of variability in tillering among sorghum genotypes, and to develop a framework for this regulation.
* Diverse sorghum genotypes were grown in three experiments with contrasting temperature, radiation and plant density to create variation in tillering. Data on phenology, tillering, and leaf and plant size were collected. A carbohydrate supply/demand (S/D) index incorporating environmental and genotypic parameters was developed to represent the effects of assimilate availability on tillering. Genotypic differences in tillering not explained by this index were defined as propensity to tiller (PTT) and probably represent hormonal effects.
* Genotypic variation in tillering was associated with differences in leaf width, stem diameter and PTT. The S/D index captured most of the environmental effects on tillering, and PTT most of the genotypic effects.
* A framework that captures the genetic and environmental regulation of tillering through assimilate availability and PTT was developed; it provides a basis for a model connecting the genetic control of tillering to its phenotypic consequences.
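The abstract does not reproduce the S/D index itself, but the logic of separating environmental from genotypic effects can be sketched: regress tiller number on the index across environments and treat genotype-level residuals as PTT. A minimal sketch in Python, with all data and column names invented for illustration:

```python
# Sketch: estimate propensity to tiller (PTT) as the genotype-level
# residual of tillering not explained by a supply/demand (S/D) index.
# Data and column names are hypothetical; the paper's actual S/D index
# combines environmental and genotypic parameters not reproduced here.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "genotype": ["A", "A", "B", "B", "C", "C"],
    "sd_index": [0.8, 1.2, 0.7, 1.1, 0.9, 1.3],   # assimilate supply/demand
    "tillers":  [1.5, 2.6, 1.0, 1.9, 2.0, 3.1],   # tillers per plant
})

# Environmental effect: tillering as a linear function of the S/D index.
slope, intercept = np.polyfit(df["sd_index"], df["tillers"], 1)
df["predicted"] = intercept + slope * df["sd_index"]

# PTT: mean residual per genotype (variation the S/D index cannot explain).
df["residual"] = df["tillers"] - df["predicted"]
ptt = df.groupby("genotype")["residual"].mean()
print(ptt)
```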

Relevance: 20.00%

Abstract:

Aims: We combine measurements of weak gravitational lensing from the CFHTLS-Wide survey, supernovae Ia from CFHT SNLS and CMB anisotropies from WMAP5 to obtain joint constraints on cosmological parameters, in particular the dark-energy equation-of-state parameter w. We assess the influence of systematics in the data on the results and look for possible correlations with cosmological parameters.
Methods: We implemented an MCMC algorithm to sample the parameter space of a flat ΛCDM model with a dark-energy component of constant w. Systematics in the data are parametrised and included in the analysis. We determine the influence of the photometric calibration of the SNIa data on cosmological results by calculating the response of the distance modulus to photometric zero-point variations. The weak-lensing data set is tested for anomalous field-to-field variations and for a systematic shape-measurement bias for high-redshift galaxies.
Results: Ignoring photometric uncertainties for SNLS biases cosmological parameters by at most 20% of the statistical errors when using supernovae alone; the parameter uncertainties are underestimated by 10%. The weak-lensing field-to-field variance between 1 deg² MegaCam pointings is 5–15% higher than predicted from N-body simulations. We find no bias in the lensing signal at high redshift, within the framework of a simple model and marginalising over cosmological parameters. Assuming a systematic underestimation of the lensing signal, the normalisation increases by up to 8%. Combining all three probes, we obtain -0.10 < 1 + w < 0.06 at 68% confidence (-0.18 < 1 + w < 0.12 at 95%), including systematic errors. Our results are therefore consistent with a cosmological constant (Λ). Systematics in the data increase the error bars by up to 35%; the best-fit values change by less than 0.15σ.
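As an illustration of the sampling machinery only (the toy Gaussian below stands in for the paper's actual joint lensing + SNIa + CMB likelihood), a minimal Metropolis-Hastings sampler over (Omega_m, w) might look like this:

```python
# Minimal Metropolis-Hastings sketch for sampling (Omega_m, w) of a flat
# wCDM model. The Gaussian "likelihood" is a toy stand-in; all widths and
# centres are illustrative, not the paper's constraints.
import numpy as np

rng = np.random.default_rng(0)

def log_like(omega_m, w):
    # Toy likelihood centred on (0.27, -1.0).
    return -0.5 * (((omega_m - 0.27) / 0.03) ** 2 + ((w + 1.0) / 0.1) ** 2)

theta = np.array([0.3, -0.9])          # starting point (Omega_m, w)
step = np.array([0.01, 0.03])          # random-walk proposal widths
chain = []
for _ in range(20000):
    prop = theta + step * rng.standard_normal(2)
    if 0.0 < prop[0] < 1.0:            # flat prior bounds on Omega_m
        if np.log(rng.uniform()) < log_like(*prop) - log_like(*theta):
            theta = prop
    chain.append(theta.copy())

chain = np.array(chain[5000:])         # discard burn-in
print("w = %.3f +/- %.3f" % (chain[:, 1].mean(), chain[:, 1].std()))
```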

Relevance: 20.00%

Abstract:

Snapper (Pagrus auratus) is widely distributed throughout subtropical and temperate southern oceans and forms a significant recreational and commercial fishery in Queensland, Australia. Using data from government reports, media sources, popular publications and a government fisheries survey carried out in 1910, we compiled information on individual snapper fishing trips that took place from 1871 to 1939, prior to the commencement of fishery-wide organized data collection. In addition to extracting all available quantitative data, we translated qualitative information into bounded estimates and used multiple imputation to handle missing values, forming 287 records for which catch rate (snapper fisher⁻¹ h⁻¹) could be derived. Uncertainty was handled through a parametric maximum-likelihood framework (a transformed trivariate Gaussian), which facilitated statistical comparisons between data sources. No statistically significant differences in catch rates were found between the media sources and the government fisheries survey. Catch rates remained stable throughout the time series, averaging 3.75 snapper fisher⁻¹ h⁻¹ (95% confidence interval, 3.42–4.09) as the fishery expanded into new grounds. In comparison, a contemporary (1993–2002) south-east Queensland charter fishery produced an average catch rate of 0.4 snapper fisher⁻¹ h⁻¹ (95% confidence interval, 0.31–0.58). These data illustrate the productivity of a fishery during its earliest years of development and represent the earliest catch rate data globally for this species. By adopting a formalized approach to address issues common to many historical records – missing data, a lack of quantitative information and reporting bias – our analysis demonstrates the potential of historical narratives to contribute to contemporary fisheries management.
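The two statistical ideas here, multiple imputation of records known only as bounds and maximum-likelihood estimation on a transformed scale, can be sketched as follows; the records, bounds and simplified pooling are invented for illustration and do not reproduce the paper's trivariate Gaussian model:

```python
# Sketch: (i) multiple imputation of catch-rate records known only as
# bounds, and (ii) a maximum-likelihood fit on a log scale. Simplified
# pooling only; full Rubin's rules would also add the within-imputation
# variance. All numbers are invented.
import numpy as np

rng = np.random.default_rng(1)

# Each record: (lower, upper) bound on catch rate (snapper per fisher-hour);
# exact records have lower == upper.
records = [(3.0, 3.0), (2.0, 5.0), (4.1, 4.1), (1.5, 6.0), (3.8, 3.8)]

m = 100  # number of imputations
means = []
for _ in range(m):
    draws = np.array([rng.uniform(lo, hi) if lo < hi else lo
                      for lo, hi in records])
    means.append(np.log(draws).mean())   # MLE of log-normal location

mu = np.mean(means)                      # pooled point estimate
between = np.var(means, ddof=1)          # between-imputation variance
print("median catch rate ~", np.exp(mu), "| imputation variance:", between)
```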

Relevance: 20.00%

Abstract:

Extreme vibration has been reported for small, high speed craft in the maritime sector, with performance- and health-threatening effects on boat operators and crew. Musculoskeletal injuries are an enduring problem for high speed craft passengers: spinal or joint injuries and neurological disorders may result from repetitive pounding over rough water, continued vibration and single impact events. The risk from whole-body vibration (WBV) induced through small vessels depends mainly on the time spent on the craft, which cannot be changed in a military scenario, as well as on the number of shocks and jolts and their magnitude and frequency.

In the European Union, for example, physical agents directives require all employers to control exposure to a number of physical agents, including noise and vibration. The EC Vibration Directive 2002/44/EC sets out regulations for the control of health and safety risks from the exposure of workers to hand-arm vibration (HAV) and WBV in the workplace. Australia has an exposure standard relating to WBV, AS 2670.1-2001 (Evaluation of human exposure to whole body vibration), which is identical to ISO 2631-1:1997 (Mechanical vibration and shock - Evaluation of human exposure to whole-body vibration). Currently, none of the jurisdictions in Australia has specific regulations for vibration exposure in workplaces, although vibration is mentioned to varying degrees in their general regulations, codes of practice and guidance material.

WBV on high speed craft is normally caused by continuous 'hammering' from short steep seas or wind-against-tide conditions, whereas shock on high speed craft is usually caused by random impacts. Military organisations need this knowledge to make informed decisions regarding their marine operations, compliance with legislation and potentially harmful health effects, and to develop and implement appropriate countermeasures. Marine case studies in the UK, such as published MAIB (Marine Accident Investigation Branch) reports, show injuries that have occurred in operation, and subsequent MCA (Maritime and Coastguard Agency) guidance is provided in MGN 436 (M+F), Whole-Body Vibration: Guidance on Mitigating Against the Effects of Shocks and Impacts on Small Vessels (MCA, 2011). This paper proposes a research framework to study the origin, impact and pathways for prevention of WBV in small, high speed craft in a maritime environment.
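ISO 2631-1 (and the identical AS 2670.1-2001) evaluates WBV exposure through the frequency-weighted r.m.s. acceleration, its 8-hour equivalent A(8), and, for shock-rich signals such as slamming, the fourth-power vibration dose value (VDV). A minimal sketch assuming the input samples are already frequency-weighted (the standard's Wk/Wd weighting filters are not implemented) and using an invented signal:

```python
# Sketch of ISO 2631-1 exposure metrics for whole-body vibration:
# weighted r.m.s. acceleration, its 8-h equivalent A(8), and the
# vibration dose value VDV, which emphasises shocks via a 4th power.
# Input is assumed to be ALREADY frequency-weighted acceleration (m/s^2).
import numpy as np

fs = 1000.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 3600.0, 1 / fs)  # one hour of exposure
a_w = 0.8 * np.random.default_rng(2).standard_normal(t.size)  # toy signal

rms = np.sqrt(np.mean(a_w ** 2))              # weighted r.m.s., m/s^2
T = t[-1]                                     # exposure duration, s
a8 = rms * np.sqrt(T / (8 * 3600.0))          # 8-hour equivalent A(8)
vdv = (np.sum(a_w ** 4) / fs) ** 0.25         # VDV, m/s^1.75

crest = np.max(np.abs(a_w)) / rms             # crest factor > 9 suggests
print(f"A(8)={a8:.2f} m/s^2, VDV={vdv:.2f} m/s^1.75, crest={crest:.1f}")
```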

Relevance: 20.00%

Abstract:

Digital elevation models (DEMs) have been an important topic in geography and the surveying sciences for decades, owing to their geomorphological importance as the reference surface for gravitation-driven material flow as well as their wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors in the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on decision-making based on interpretations and applications of terrain analysis, and may indirectly influence data acquisition and DEM generation. The focus of the thesis is on fine toposcale DEMs, which are typically represented on a 5–50 m grid and used at application scales of 1:10 000–1:50 000.

The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, performing analytical and simulation-based error propagation analyses, and interpreting the results. The DEM error model was built using geostatistical methods.

The results show that appropriate and exhaustive reporting of the various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and to morphological gross errors, which are detectable with the presented visualisation methods. In addition, global characterisation of DEM error is a gross generalisation of reality, because the areas within which the assumption of stationarity is not violated are small in extent. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the DEM vertical error increases the error in surface derivatives. Contrary to expectations, however, the spatial autocorrelation of the model has varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered a 'worst-case scenario', but this view is now challenged, because none of the DEM derivatives investigated in the study showed maximum variation with spatially uncorrelated random error. A significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution to generate realisations of the DEM error model. Finally, a typology of uncertainty in drainage basin delineations is presented.
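The simulation-based approach with process convolution can be sketched directly: convolve white noise with a kernel to impose spatial autocorrelation, add each realisation to the DEM, recompute a derivative, and summarise the spread. A minimal version with an invented terrain and parameters:

```python
# Monte Carlo error propagation for a DEM derivative (slope). Error
# realisations are generated by process convolution: white noise smoothed
# by a Gaussian kernel to impose spatial autocorrelation. The DEM, error
# sigma and kernel range are invented for illustration.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(3)
cell = 10.0                                   # grid resolution, m
x, y = np.meshgrid(np.arange(100), np.arange(100))
dem = 50 * np.sin(x / 30.0) + 0.2 * y         # toy terrain, m

sigma_z = 1.0        # DEM vertical error std (m)
corr_px = 5          # kernel width in pixels -> autocorrelation range

slopes = []
for _ in range(200):
    noise = gaussian_filter(rng.standard_normal(dem.shape), corr_px)
    noise *= sigma_z / noise.std()            # rescale to target sigma
    gy, gx = np.gradient(dem + noise, cell)
    slopes.append(np.degrees(np.arctan(np.hypot(gx, gy))))

slopes = np.array(slopes)
print("per-cell slope std (deg): mean %.3f" % slopes.std(axis=0).mean())
```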

Relevance: 20.00%

Abstract:

Determination of the environmental factors controlling earth surface processes and landform patterns is one of the central themes in physical geography, yet identifying the main drivers of geomorphological phenomena is often challenging. Novel spatial analysis and modelling methods could provide new insights into process-environment relationships. The objective of this research was to map and quantitatively analyse the occurrence of cryogenic phenomena in subarctic Finland. More precisely, utilising a grid-based approach, the distribution and abundance of periglacial landforms were modelled to identify important landscape-scale environmental factors. The study was performed using a comprehensive empirical data set of periglacial landforms from an area of 600 km² at a 25-ha resolution. The statistical methods utilised were generalized linear modelling (GLM) and hierarchical partitioning (HP): GLMs were used to produce distribution and abundance models, and HP to reveal independently the most likely causal variables. The GLM models were assessed utilising statistical evaluation measures, prediction maps, field observations and the results of the HP analyses. A total of 40 different landform types and subtypes were identified. Topographical, soil property and vegetation variables were the primary correlates of the occurrence and cover of active periglacial landforms on the landscape scale. In the model evaluation, most of the GLMs were shown to be robust, although the explanatory power, prediction ability and selected explanatory variables varied between the models. The study demonstrated the great potential of combining a spatial grid system, terrain data and novel statistical techniques to map the occurrence of periglacial landforms. GLM proved to be a useful modelling framework for testing the shapes of the response functions and the significance of the environmental variables, and the HP method helped in making better deductions about the important drivers of earth surface processes. Hence, the numerical approach presented in this study can be a useful addition to the range of techniques available to researchers for mapping and monitoring different geographical phenomena.
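A minimal sketch of the kind of grid-based distribution model described, a binomial GLM (logistic regression) of landform presence/absence on topographic predictors, with all variables and coefficients invented (the hierarchical partitioning step is omitted):

```python
# Sketch: binomial GLM for the presence/absence of a periglacial landform
# in grid cells, with invented topographic predictors and a synthetic
# response. The study's HP variable-ranking step is not shown.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
elevation = rng.uniform(200, 800, n)          # m a.s.l.
slope = rng.uniform(0, 30, n)                 # degrees
wetness = rng.normal(8, 2, n)                 # topographic wetness index

# Synthetic truth: occurrence more likely at high, wet, gentle sites.
logit = -8 + 0.01 * elevation - 0.08 * slope + 0.3 * wetness
presence = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([elevation, slope, wetness]))
model = sm.GLM(presence, X, family=sm.families.Binomial()).fit()
print(model.summary())
```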

Relevance: 20.00%

Abstract:

With few exceptions, the bulk of the collection pertains to the work of the Agro-Joint. Records of the Agro-Joint Director General. Agreements of the American Relief Administration (ARA) and the Joint Distribution Committee with the Soviet government, 1922-1923. Agreements between the Agro-Joint and the Soviet government, 1924, 1927, 1928. Agreements of the Agro-Joint and the American Society for Jewish Farm Settlements (ASJFS) with the Soviet government, 1929, 1930, 1933, 1938. Materials relating to relief work of the JDC within the framework of the American Relief Administration, 1922, including the appointment of J. Rosen as the JDC representative at the ARA. Statistics, reports, miscellaneous correspondence relating to JDC activities in Russia. Minutes, memos, reports, legal documents, certificate of incorporation, and general correspondence relating to the ASJFS, its formation, fund-raising activities, 1927-1939. Records of the Agro-Joint Main Office, Moscow. Annual and periodic reports of the Agro-Joint including statistics, financial estimates, financial reports, analyses of expenditures, relating to Agro-Joint work, 1924-1937. General correspondence files: incoming and outgoing letters, reports, and memoranda. Materials relating to land surveys and allocations in the Crimea: statistics, surveys, memos, correspondence, relating to the Salsk district, Chernomor district, Changar peninsula, Azov, Kuban, Odessa district, Samara district, Povolzhe, Krivoy Rog, Kherson, the Far East, Siberia. Materials relating to contacts with KOMZET. Correspondence, minutes of KOMZET meetings, statistical information, reports. By-laws of the OZET (Obshchestvo po Zemleustroystvu Trudyachtchikhsya Evreev - Association For the Settlement of Toiling Jews On Land) and AGRO-KUSTBANK (Evreysky Agrarno-Kustarny Bank - Jewish Agricultural and House Workers Bank). Register of Agro-Joint assets transferred to KOMZET. Records of the Agro-Joint Agricultural Department. Materials

Relevance: 20.00%

Abstract:

Sequential firings with fixed time delays are frequently observed in simultaneous recordings from multiple neurons. Such temporal patterns are potentially indicative of underlying microcircuits and it is important to know when a repeatedly occurring pattern is statistically significant. These sequences are typically identified through correlation counts. In this paper we present a method for assessing the significance of such correlations. We specify the null hypothesis in terms of a bound on the conditional probabilities that characterize the influence of one neuron on another. This method of testing significance is more general than the currently available methods since under our null hypothesis we do not assume that the spiking processes of different neurons are independent. The structure of our null hypothesis also allows us to rank order the detected patterns. We demonstrate our method on simulated spike trains.
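The structure of the test can be sketched as follows: if, under the null hypothesis, the conditional probability that neuron B fires in a small window at delay d after a spike of neuron A is bounded by p0, then the coincidence count is stochastically dominated by a Binomial(n, p0) variable, giving a conservative p-value without assuming independent spike trains. A toy version with simulated Poisson spike trains:

```python
# Sketch of a significance test for sequential firings at a fixed delay.
# Null hypothesis: P(B spikes in [t+d, t+d+w) | A spikes at t) <= p0.
# The coincidence count is then dominated by Binomial(n, p0), so an
# upper-tail binomial probability gives a conservative p-value.
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(5)
T, rate = 100.0, 5.0                      # seconds, Hz
a = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))   # spikes of A
b = np.sort(rng.uniform(0, T, rng.poisson(rate * T)))   # spikes of B

d, w = 0.010, 0.002                       # delay 10 ms, window 2 ms
hits = sum(np.any((b >= t + d) & (b < t + d + w)) for t in a)

p0 = 0.05                                 # bound under the null hypothesis
p_value = binom.sf(hits - 1, len(a), p0)  # P(X >= hits), X ~ Bin(n, p0)
print(f"{hits}/{len(a)} delayed coincidences, p = {p_value:.3g}")
```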

Relevance: 20.00%

Abstract:

Cesium hydrogen L-malate monohydrate, CsH(C4H4O5)·H2O, is a new chiral open-framework semi-organic crystalline material with a second-harmonic generation (SHG) efficiency one order of magnitude greater than that of KDP. Single crystals of this new material have been grown from aqueous solution by the conventional slow-cooling technique. The grown crystals display both platy and prismatic morphologies, depending on the imposed supersaturation. Hardness values measured with a Vickers indenter show considerable anisotropy. The resistivity behaviour at room temperature and above places the crystal between an ionic conductor and a dielectric. The single-crystal SHG efficiency estimated through a Maker-fringes experiment gives a deff 4.24 times that of KDP. Single- and multiple-shot experiments performed on the grown crystals at the fundamental and second harmonic of a pulsed Nd:YAG laser (1064 and 532 nm) show a high laser damage threshold, a favourable property for nonlinear optical applications.

Relevance: 20.00%

Abstract:

Glaucoma is the second leading cause of blindness worldwide. Often, glaucomatous damage to the optic nerve head (ONH) and ONH changes occur prior to visual field loss and are observable in vivo. Thus, digital image analysis is a promising choice for detecting the onset and/or progression of glaucoma. In this paper, we present a new framework for detecting glaucomatous changes in the ONH of an eye using the method of proper orthogonal decomposition (POD). A baseline topograph subspace was constructed for each eye to describe the structure of the ONH of the eye at a reference/baseline condition using POD. Any glaucomatous changes in the ONH of the eye present during a follow-up exam were estimated by comparing the follow-up ONH topography with its baseline topograph subspace representation. Image correspondence measures of L1-norm and L2-norm, correlation, and image Euclidean distance (IMED) were used to quantify the ONH changes. An ONH topographic library built from the Louisiana State University Experimental Glaucoma study was used to evaluate the performance of the proposed method. The area under the receiver operating characteristic curve (AUC) was used to compare the diagnostic performance of the POD-induced parameters with that of the parameters of the topographic change analysis (TCA) method. The IMED and L2-norm parameters in the POD framework provided the highest AUCs, 0.94 for the 10° field of imaging and 0.91 for the 15° field of imaging, compared with AUCs of 0.86 and 0.88, respectively, for the TCA parameters. The proposed POD framework captures both instrument measurement variability and inherent structural variability, and shows promise for improving our ability to detect glaucomatous change over time in glaucoma management.
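The core of a POD-based comparison can be sketched with an SVD: build a basis from an eye's baseline topographies, project the follow-up image onto that subspace, and quantify change as the distance between the follow-up and its subspace representation. A minimal sketch with random stand-ins for real topographies (the IMED and correlation measures are omitted):

```python
# Sketch of change detection via proper orthogonal decomposition (POD):
# an SVD of baseline ONH topographies gives a baseline subspace; a
# follow-up exam is compared with its projection onto that subspace.
import numpy as np

rng = np.random.default_rng(6)
h, w, n_base = 64, 64, 6
baseline = rng.normal(size=(n_base, h * w))   # flattened baseline scans

# POD basis = right singular vectors of the centred baseline stack.
mean = baseline.mean(axis=0)
U, S, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
basis = Vt                                     # rows span the subspace

followup = rng.normal(size=h * w)              # flattened follow-up scan
coeffs = basis @ (followup - mean)
reconstruction = mean + coeffs @ basis         # subspace representation

l2_change = np.linalg.norm(followup - reconstruction)
print("L2-norm change parameter:", l2_change)
```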

Relevance: 20.00%

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference on the likelihood function alone, may be a fundamental question in theory; in practice it may well matter less if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode under flat priors. However, in situations where the parametric assumptions of standard statistical models would be too rigid, more flexible model formulations, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrisation.

This work comprises five articles, all of which apply probability modelling to problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two hierarchical Bayesian modelling. Because maximum likelihood can be presented as a special case of Bayesian inference, but not the other way round, the introductory part of this work presents a framework for probability-based inference using only Bayesian concepts. Some results from the original articles are re-derived using the tools developed herein, to show that they are also justifiable under this more general framework. The assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions, and it is argued that the same reasoning applies under sampling from a finite population.

The main emphasis is on probability-based inference under incomplete observation due to study design, illustrated using a generic two-phase cohort sampling design as an example. The alternative approaches presented for the analysis of such a design are full likelihood, which utilises all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. Finally, a non-parametric monotonic regression model for one or more covariates is formulated together with a Bayesian estimation procedure, and applied in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is feasible in this case as well.
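As a rough sketch of the monotonic regression idea (not the thesis's actual model), a curve can be parametrised through positive increments, sampled with a random-walk Metropolis step, and used to generate posterior predictive draws. The prior, grid and data below are all invented, and the Jacobian of the log-increment parametrisation is glossed over:

```python
# Sketch of Bayesian monotonic regression: f is a non-decreasing step
# function on a grid, built from positive increments (sampled on the log
# scale), with Gaussian noise. A random-walk Metropolis sampler yields
# posterior draws, from which posterior predictive draws follow.
import numpy as np

rng = np.random.default_rng(7)
x = np.sort(rng.uniform(0, 1, 40))
y = 2 * x ** 2 + 0.1 * rng.standard_normal(40)   # monotone truth + noise

grid = np.linspace(0, 1, 11)                     # 10 intervals

def f(log_inc, xs):
    levels = np.cumsum(np.exp(log_inc))          # non-decreasing by design
    return levels[np.searchsorted(grid[1:], xs)]

def log_post(log_inc):
    resid = y - f(log_inc, x)
    prior = -np.sum(np.exp(log_inc))             # penalise large increments
    return -0.5 * np.sum(resid ** 2) / 0.1 ** 2 + prior

theta = np.full(len(grid) - 1, np.log(0.1))      # one increment per interval
pred = []
for i in range(20000):
    prop = theta + 0.05 * rng.standard_normal(theta.size)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    if i >= 10000 and i % 20 == 0:               # posterior predictive draw
        pred.append(f(theta, np.array([0.5]))[0]
                    + 0.1 * rng.standard_normal())

print("posterior predictive mean at x = 0.5:", np.mean(pred))
```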