70 results for diffractive methodology

Relevance: 20.00%

Publisher:

Abstract:

Contrails, and especially their evolution into cirrus-like clouds, are thought to have very important effects on local and global radiation budgets, though they are generally not well represented in global climate models. The lack of contrail parameterisations is due to the limited availability of in situ contrail measurements, which are difficult to obtain. Here we present a methodology for the successful sampling and interpretation of contrail microphysical and radiative data using both in situ and remote sensing instrumentation on board the FAAM BAe146 UK research aircraft as part of the COntrails Spreading Into Cirrus (COSIC) study.


The Wetland and Wetland CH4 Intercomparison of Models Project (WETCHIMP) was created to evaluate our present ability to simulate large-scale wetland characteristics and corresponding methane (CH4) emissions. A multi-model comparison is essential to evaluate the key uncertainties in the mechanisms and parameters leading to methane emissions. Ten modelling groups joined WETCHIMP to run eight global and two regional models with a common experimental protocol using the same climate and atmospheric carbon dioxide (CO2) forcing datasets. We reported the main conclusions from the intercomparison effort in a companion paper (Melton et al., 2013). Here we provide technical details for the six experiments, which included an equilibrium, a transient, and an optimized run plus three sensitivity experiments (temperature, precipitation, and atmospheric CO2 concentration). The diversity of approaches used by the models is summarized through a series of conceptual figures and is used to evaluate the wide range of wetland extent and CH4 fluxes predicted by the models in the equilibrium run. We discuss relationships among the various approaches and patterns of consistency among these model predictions. Within this group of models, there are three broad classes of methods used to estimate wetland extent: prescribed (based on wetland distribution maps), prognostic (relationships between hydrological states based on satellite observations), and explicit hydrological mass balances. A larger variety of approaches was used to estimate the net CH4 fluxes from wetland systems. Even though modelling of wetland extent and CH4 emissions has progressed significantly over recent decades, large uncertainties still exist when estimating CH4 emissions: there is little consensus on model structure or complexity due to knowledge gaps, the different aims of the models, and the range of temporal and spatial resolutions of the models.


Many urban surface energy balance models now exist. These vary in complexity from simple schemes that represent the city as a concrete slab to those that incorporate detailed representations of momentum and energy fluxes distributed within the atmospheric boundary layer. While many of these schemes have been evaluated against observations, with some models even compared against the same data sets, such evaluations have not been undertaken in a controlled manner that enables direct comparison. For other types of climate model, for instance in the Project for Intercomparison of Land-Surface Parameterization Schemes (PILPS) experiments (Henderson-Sellers et al., 1993), such controlled comparisons have been shown to provide important insights into both the mechanics of the models and the physics of the real world. This paper describes the progress made to date on a systematic and controlled comparison of urban surface schemes. The models to be considered, and their key attributes, are described, along with the methodology to be used for the evaluation.


We analyse by simulation the impact of model-selection strategies (sometimes called pre-testing) on forecast performance in both constant- and non-constant-parameter processes. Restricted, unrestricted and selected models are compared when either of the first two might generate the data. We find little evidence that strategies such as general-to-specific induce significant over-fitting, or thereby cause forecast-failure rejection rates to greatly exceed nominal sizes. Parameter non-constancies put a premium on correct specification but, in general, model-selection effects appear to be relatively small, and progressive research is able to detect the mis-specifications.
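The pre-testing setup can be illustrated with a toy Monte Carlo; the single-regressor data-generating process, sample size, and |t| > 2 selection rule below are illustrative assumptions, not the experimental design used in the paper.

```python
import random
import statistics

def simulate(n=60, beta=0.0, reps=2000, seed=1):
    """One-regressor pre-testing experiment. The DGP is y = beta*x + e with
    standard normal x and e. Compare squared one-step forecast errors of the
    restricted model (coefficient fixed at 0), the unrestricted OLS model,
    and a 'selected' model that keeps the regressor only if |t| > 2."""
    rng = random.Random(seed)
    errs = {"restricted": [], "unrestricted": [], "selected": []}
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(n + 1)]
        y = [beta * xi + rng.gauss(0, 1) for xi in x]
        sxx = sum(xi * xi for xi in x[:n])
        bhat = sum(xi * yi for xi, yi in zip(x[:n], y[:n])) / sxx
        resid = [yi - bhat * xi for xi, yi in zip(x[:n], y[:n])]
        s2 = sum(e * e for e in resid) / (n - 1)
        t = bhat / (s2 / sxx) ** 0.5
        b_sel = bhat if abs(t) > 2 else 0.0
        for name, b in (("restricted", 0.0),
                        ("unrestricted", bhat),
                        ("selected", b_sel)):
            errs[name].append((y[n] - b * x[n]) ** 2)
    return {k: statistics.mean(v) for k, v in errs.items()}

mse = simulate(beta=0.0)  # the restricted model is the DGP here
```

When beta = 0 the restricted model is correctly specified, so the cost of selection comes only from the roughly 5% of replications in which the |t| > 2 rule retains the spurious regressor.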


Anticoagulant rodenticides have been known for over half a century as an effective and safe method of rodent control. However, anticoagulant resistance, first discovered in 1958, poses a very important problem for their long-term use. Laboratory tests provide the main method for identifying the different types of anticoagulant resistance, quantifying the magnitude of their effect, and helping us to choose the best pest control strategy. The most important tests are the lethal feeding period (LFP) and blood clotting response (BCR) tests. These tests can now be used to quantify the likely effect of resistance on treatment outcome by providing an estimate of the 'resistance factor'. In 2004 the gene responsible for anticoagulant resistance (VKORC1) was identified and sequenced. As a result, a new molecular resistance-testing methodology has been developed, and a number of resistance mutations have been identified, particularly in Norway rats and house mice. Three mutations of the VKORC1 gene in Norway rats have been identified to date that confer a degree of resistance to bromadiolone and difenacoum sufficient to affect treatment outcome in the field.
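The 'resistance factor' can be illustrated as a ratio of doses producing the same response in resistant and susceptible animals; the BCR-style dose-response numbers below are hypothetical, and real analyses fit probit or logistic models rather than the simple linear interpolation used here.

```python
def ed50(doses, responses):
    """Linearly interpolate the dose giving a 50% response.
    Assumes doses ascending and responses non-decreasing fractions."""
    pairs = list(zip(doses, responses))
    for (d0, r0), (d1, r1) in zip(pairs, pairs[1:]):
        if r0 <= 0.5 <= r1:
            return d0 + (0.5 - r0) * (d1 - d0) / (r1 - r0)
    raise ValueError("50% response not bracketed by the data")

# Hypothetical blood clotting response (BCR) data: dose in mg/kg versus
# fraction of animals responding, for susceptible and resistant strains
susceptible_ed50 = ed50([0.5, 1.0, 2.0], [0.1, 0.5, 0.9])
resistant_ed50   = ed50([2.0, 4.0, 8.0], [0.2, 0.4, 0.8])
resistance_factor = resistant_ed50 / susceptible_ed50  # > 1 indicates resistance
print(resistance_factor)
```

A resistance factor well above 1 suggests that field treatments with the tested compound are likely to fail against the resistant strain.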


This paper makes a theoretical case for using these two systems approaches together. The theoretical and methodological assumptions of system dynamics (SD) and soft systems methodology (SSM) are briefly described and a partial critique is presented. SSM generates and represents diverse perspectives on a problem situation and addresses the socio-political elements of an intervention. However, it is weak in ensuring 'dynamic coherence': consistency between the intuitive behaviour resulting from proposed changes and the behaviour deduced from ideas on causal structure. Conversely, SD examines causal structures and dynamic behaviours. However, whilst emphasising the need for a clear issue focus, it has little theory for generating and representing diverse issues. Also, there is no theory for facilitating sensitivity to socio-political elements. A synthesis of the two, called 'Holon Dynamics', is proposed. After an SSM intervention, a second stage continues the socio-political analysis and also operates within a new perspective which values dynamic coherence of the mental construct (the holon) that is capable of expressing the proposed changes. A model of this holon is constructed using SD, and the changes are thus rendered 'systemically desirable' in the additional sense that dynamic consistency has been confirmed. The paper closes with reflections on the proposal, and the need for theoretical consistency when mixing tools is emphasised.


This article reviews the experiences of a practising business consultancy division. It discusses the reasons for the failure of the traditional, expert consultancy approach and states the requirements for a more suitable consultancy methodology. An approach called ‘Modelling as Learning’ is introduced, its three defining aspects being: client ownership of all analytical work performed, consultant acting as facilitator and sensitivity to soft issues within and surrounding a problem. The goal of such an approach is set as the acceleration of the client's learning about the business. The tools that are used within this methodological framework are discussed and some case studies of the methodology are presented. It is argued that a learning experience was necessary before arriving at the new methodology but that it is now a valuable and significant component of the division's work.


The WFDEI meteorological forcing data set has been generated using the same methodology as the widely used WATCH Forcing Data (WFD) by making use of the ERA-Interim reanalysis data. We discuss the specifics of how changes in the reanalysis and processing have led to improvement over the WFD. We attribute improvements in precipitation and wind speed to the latest reanalysis basis data and improved downward shortwave fluxes to the changes in the aerosol corrections. Covering 1979–2012, the WFDEI will allow more thorough comparisons of hydrological and Earth System model outputs with hydrologically and phenologically relevant satellite products than using the WFD.


There is an on-going debate on the environmental effects of genetically modified crops, to which this paper aims to contribute. First, data on the environmental impacts of genetically modified (GM) and conventional crops are collected from peer-reviewed journals; second, an analysis is conducted to examine which crop type is less harmful for the environment. Published data on environmental impacts are measured using an array of indicators, and their analysis requires normalisation and aggregation. Drawing on the composite-indicators literature, this paper builds composite indicators to measure the impact of GM and conventional crops in three dimensions: (1) non-target key species richness, (2) pesticide use, and (3) aggregated environmental impact. The comparison between the three composite indicators for both crop types allows us to establish not only a ranking of which crop type is preferable for the environment but also the probability that one crop type outperforms the other from an environmental perspective. Results show that GM crops tend to cause lower environmental impacts than conventional crops for the analysed indicators.
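The normalisation-and-aggregation step described above can be sketched as follows; the indicator names, raw values, and equal weights are illustrative assumptions rather than data from the study.

```python
def minmax(values):
    """Min-max normalisation: rescale raw indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical raw indicator values for [GM, conventional] crops
# (higher value = more environmental harm)
pesticide_use   = [1.2, 3.4]
richness_loss   = [0.1, 0.5]
aggregate_score = [2.0, 2.8]

# Normalise each indicator, then aggregate with equal weights into one
# composite score per crop type (lower composite = lower impact)
norm = [minmax(ind) for ind in (pesticide_use, richness_loss, aggregate_score)]
composites = [sum(col) / len(col) for col in zip(*norm)]
print(composites)  # first entry is GM, second is conventional
```

With only two alternatives, min-max normalisation is degenerate (each indicator becomes 0 and 1); in practice each indicator aggregates many published impact measurements before the crop types are compared.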


The sustainable intelligent building is a building that achieves the best combination of environmental, social, economic and technical values, and its sustainability assessment draws on systems engineering methods and multi-criteria decision-making. Therefore, firstly, a wireless monitoring system for the sustainability parameters of intelligent buildings is implemented; secondly, indicators and key issues for the sustainability of intelligent buildings, based on the whole life cycle, are investigated; thirdly, a sustainability assessment model based on structure entropy and the fuzzy analytic hierarchy process is proposed.
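A minimal sketch of the entropy-weighting idea behind the assessment model, assuming a small hypothetical indicator matrix; the full model in the paper couples this with a fuzzy analytic hierarchy process that is not shown here.

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: indicators whose values vary more across the
    alternatives carry more information and so receive larger weights.
    matrix[i][j] is the (positive) score of indicator j for alternative i."""
    n = len(matrix)
    m = len(matrix[0])
    divergence = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Shannon entropy normalised to [0, 1]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergence.append(1 - e)
    return [d / sum(divergence) for d in divergence]

# Hypothetical scores of three buildings on three sustainability indicators
scores = [
    [0.9, 0.5, 0.7],  # building A
    [0.8, 0.5, 0.2],  # building B
    [0.7, 0.5, 0.9],  # building C
]
w = entropy_weights(scores)
print(w)  # the constant second indicator receives near-zero weight
```

An indicator on which all buildings score identically carries no discriminating information, so the entropy method assigns it essentially no weight.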


A Canopy Height Profile (CHP) procedure presented in Harding et al. (2001) for large-footprint LiDAR data was tested in a closed canopy environment as a way of extracting vertical foliage profiles from raw-waveform LiDAR. In this study, an adaptation of this method to small-footprint data is presented, tested and validated in an Australian sparse canopy forest at plot- and site-level. Further, the methodology itself has been enhanced by implementing a dataset-adjusted reflectance ratio calculation according to Armston et al. (2013) in the processing chain, and tested against a fixed ratio of 0.5 estimated for the laser wavelength of 1550 nm. As a by-product of the methodology, effective leaf area index (LAIe) estimates were derived and compared to hemispherical photography-derived values. To assess the influence of LiDAR aggregation area size on the estimates in a sparse canopy environment, LiDAR CHPs and LAIes were generated by aggregating waveforms to plot- and site-level footprints (plot/site-aggregated) as well as in 5 m grids (grid-processed). LiDAR profiles were then compared to leaf biomass field profiles generated from field tree measurements. The correlation between field and LiDAR profiles was very high, with a mean R2 of 0.75 at plot-level and 0.86 at site-level for 55 plots and the corresponding 11 sites. Gridding had almost no impact on the correlation between LiDAR and field profiles (only a marginal improvement), nor did the dataset-adjusted reflectance ratio. However, gridding and the dataset-adjusted reflectance ratio were found to improve the correlation between raw-waveform LiDAR and hemispherical photography LAIe estimates, yielding the highest correlations of 0.61 at plot-level and 0.83 at site-level. This proves the validity of the approach and the superiority of the dataset-adjusted reflectance ratio of Armston et al. (2013) over a fixed ratio of 0.5 for LAIe estimation, and shows the adequacy of small-footprint LiDAR data for LAIe estimation in discontinuous canopy forests.
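The step from accumulated waveform energy to a vertical foliage profile can be illustrated with a MacArthur-Horn-style gap-fraction transform; this is a simplified sketch rather than the Harding et al. (2001) CHP processing chain, the bin energies are invented, and the reflectance-ratio correction of Armston et al. (2013) is omitted.

```python
import math

def foliage_profile(counts, ground_energy):
    """Convert per-height-bin return energy (canopy top first) into a
    relative foliage density profile via a log gap-fraction transform."""
    total = sum(counts) + ground_energy
    profile = []
    cum = 0.0
    for c in counts:
        gap_above = 1.0 - cum / total   # energy fraction not yet intercepted
        cum += c
        gap_below = 1.0 - cum / total
        profile.append(math.log(gap_above / gap_below))  # ~ foliage area in bin
    return profile

# Hypothetical waveform energy in 5 m bins from canopy top down
bins = [5.0, 20.0, 30.0, 15.0]
profile = foliage_profile(bins, ground_energy=30.0)
lai_e = sum(profile)  # effective LAI is the vertical integral of the profile
```

Because the log terms telescope, the summed profile equals the negative log of the total canopy gap fraction, which is why LAIe falls out of the CHP as a by-product.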


Incorporating an emerging therapy as a new randomisation arm in a clinical trial that is open to recruitment would be desirable to researchers, regulators and patients, ensuring that the trial remains current, that new treatments are evaluated as quickly as possible, and that the time and cost of determining optimal therapies are minimised. It may take many years to run a clinical trial from concept to reporting within a rapidly changing drug development environment; hence, for trials to be most useful in informing policy and practice, it is advantageous for them to be able to adapt to emerging therapeutic developments. This paper reports a comprehensive literature review on methodologies for, and practical examples of, amending an ongoing clinical trial by adding a new treatment arm. Relevant methodological literature describing the statistical considerations required when making this specific type of amendment is identified, and the key statistical concepts when planning the addition of a new treatment arm are extracted, assessed and summarised. For completeness, this includes an assessment of statistical recommendations within general adaptive design guidance documents. Examples of ongoing confirmatory trials designed within the frequentist framework that have added an arm in practice are reported, and the details of each amendment are reviewed. An assessment is made of how well the relevant statistical considerations were addressed in practice, and of the related implications. The literature review confirmed that there is currently no clear methodological guidance on this topic, but that such guidance would help this efficient design amendment to be used more frequently and appropriately in practice. Eight confirmatory trials were identified as having added a treatment arm, suggesting that trials can benefit from this amendment and that it can be practically feasible; however, the trials were not always able to address the key statistical considerations, often leading to uninterpretable or invalid outcomes. If the statistical concepts identified within this review are considered and addressed during the design of a trial amendment, it is possible to effectively assess a new treatment arm within an ongoing trial without compromising the original trial outcomes.
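One practical statistical consideration when adding an arm is rebalancing randomisation so the late-joining arm can catch up; a toy deficit-weighted allocation rule sketches the idea. The numbers and the rule itself are illustrative, not taken from any of the reviewed trials.

```python
def allocation_probs(recruited, target_per_arm):
    """Toy deficit-weighted randomisation: after a new arm joins an ongoing
    trial, weight allocation toward arms furthest below their recruitment
    target so the late-joining arm can catch up."""
    deficits = [max(target_per_arm - r, 0) for r in recruited]
    total = sum(deficits)
    if total == 0:  # all targets met: fall back to equal allocation
        return [1 / len(recruited)] * len(recruited)
    return [d / total for d in deficits]

# Two original arms with 150 patients each when a third arm is added
probs = allocation_probs([150, 150, 0], target_per_arm=300)
print(probs)
```

Any such change to the allocation ratio has knock-on effects on power, type I error control, and the validity of comparisons against concurrent versus non-concurrent controls, which is exactly the kind of consideration the reviewed trials did not always address.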