816 results for Systemic Methodology
Abstract:
This paper makes a theoretical case for using these two systems approaches together. The theoretical and methodological assumptions of system dynamics (SD) and soft systems methodology (SSM) are briefly described and a partial critique is presented. SSM generates and represents diverse perspectives on a problem situation and addresses the socio-political elements of an intervention. However, it is weak in ensuring ‘dynamic coherence’: consistency between the intuitive behaviour resulting from proposed changes and the behaviour deduced from ideas on causal structure. Conversely, SD examines causal structures and dynamic behaviours. However, whilst emphasising the need for a clear issue focus, it has little theory for generating and representing diverse issues. Also, there is no theory for facilitating sensitivity to socio-political elements. A synthesis of the two, called ‘Holon Dynamics’, is proposed. After an SSM intervention, a second stage continues the socio-political analysis and also operates within a new perspective which values the dynamic coherence of the mental construct, the holon, which is capable of expressing the proposed changes. A model of this holon is constructed using SD and the changes are thus rendered ‘systemically desirable’ in the additional sense that dynamic consistency has been confirmed. The paper closes with reflections on the proposal, emphasising the need for theoretical consistency when mixing tools.
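To illustrate the kind of ‘dynamic coherence’ check that SD contributes, a minimal sketch follows. The single stock, its goal-seeking feedback loop, and all parameter values are illustrative assumptions, not the model from the paper.

```python
# Minimal system dynamics sketch: simulate one stock governed by a
# balancing feedback loop and compare the simulated trajectory against
# the behaviour one would intuitively expect of the proposed changes.
# All names and parameter values are hypothetical.

def simulate_stock(initial=100.0, goal=150.0, adjustment_time=5.0,
                   dt=0.25, horizon=40.0):
    """Euler integration of d(stock)/dt = (goal - stock) / adjustment_time."""
    stock, t, trajectory = initial, 0.0, []
    while t <= horizon:
        trajectory.append((t, stock))
        inflow = (goal - stock) / adjustment_time  # balancing feedback
        stock += inflow * dt
        t += dt
    return trajectory

# Informal coherence check: this causal structure implies smooth
# exponential goal-seeking, so overshoot or oscillation in the output
# would signal a mismatch with the stated causal story.
for t, s in simulate_stock()[::40]:
    print(f"t={t:5.1f}  stock={s:7.2f}")
```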
Abstract:
This article reviews the experiences of a practising business consultancy division. It discusses the reasons for the failure of the traditional, expert consultancy approach and states the requirements for a more suitable consultancy methodology. An approach called ‘Modelling as Learning’ is introduced, its three defining aspects being: client ownership of all analytical work performed, the consultant acting as facilitator, and sensitivity to soft issues within and surrounding a problem. The goal of the approach is to accelerate the client's learning about the business. The tools used within this methodological framework are discussed and some case studies of the methodology are presented. It is argued that a learning experience was necessary before arriving at the new methodology, but that it is now a valuable and significant component of the division's work.
Abstract:
The WFDEI meteorological forcing data set has been generated using the same methodology as the widely used WATCH Forcing Data (WFD), making use of the ERA-Interim reanalysis data. We discuss how changes in the reanalysis and processing have led to improvements over the WFD. We attribute improvements in precipitation and wind speed to the latest reanalysis basis data, and improved downward shortwave fluxes to the changes in the aerosol corrections. Covering 1979–2012, the WFDEI will allow more thorough comparisons of hydrological and Earth System model outputs with hydrologically and phenologically relevant satellite products than was possible with the WFD.
Abstract:
There is an ongoing debate on the environmental effects of genetically modified crops, to which this paper aims to contribute. First, data on the environmental impacts of genetically modified (GM) and conventional crops are collected from peer-reviewed journals; secondly, an analysis is conducted to examine which crop type is less harmful to the environment. Published data on environmental impacts are measured using an array of indicators, and their analysis requires normalisation and aggregation. Drawing on the composite indicators literature, this paper builds composite indicators to measure the impact of GM and conventional crops along three dimensions: (1) non-target key species richness, (2) pesticide use, and (3) aggregated environmental impact. The comparison between the three composite indicators for both crop types allows us to establish not only a ranking of which crop type is preferable for the environment but also the probability that one crop type outperforms the other from an environmental perspective. Results show that GM crops tend to cause lower environmental impacts than conventional crops for the analysed indicators.
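A minimal sketch of the normalise-weight-aggregate pattern behind composite indicators, plus a bootstrap estimate of the probability that one crop type outperforms the other. The data, equal weights, and indicator orientation (higher = greater impact) are hypothetical assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical study-level data; rows are studies, columns are
# impact-oriented indicators (higher = greater environmental harm).
gm = rng.normal([0.8, 0.3, 0.5], 0.1, size=(30, 3))
conv = rng.normal([0.7, 0.6, 0.6], 0.1, size=(30, 3))

def composite(x, lo, hi, weights):
    """Min-max normalise each indicator, then aggregate with weights."""
    norm = (x - lo) / (hi - lo)
    return norm @ weights

lo = np.minimum(gm.min(0), conv.min(0))
hi = np.maximum(gm.max(0), conv.max(0))
w = np.array([1/3, 1/3, 1/3])   # equal weighting as an assumption

# Bootstrap the probability that GM scores lower (less impact).
boot = [composite(rng.choice(gm, 30), lo, hi, w).mean() <
        composite(rng.choice(conv, 30), lo, hi, w).mean()
        for _ in range(2000)]
print(f"P(GM composite impact < conventional) ≈ {np.mean(boot):.2f}")
```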
Abstract:
The sustainable intelligent building is a building that has the best combination of environmental, social, economic and technical values. Its sustainability assessment draws on systems engineering methods and multi-criteria decision-making. Therefore, firstly, a wireless monitoring system for the sustainability parameters of intelligent buildings is developed; secondly, the indicators and key issues for the sustainability of intelligent buildings, based on the whole life cycle, are investigated; thirdly, a sustainability assessment model based on structure entropy and the fuzzy analytic hierarchy process is proposed.
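A sketch of the crisp core of the analytic hierarchy process (AHP): deriving criteria weights from a pairwise comparison matrix by the geometric-mean method. The matrix below is a hypothetical comparison of the four value dimensions named in the abstract, not data from the paper; the fuzzy extension would replace each entry with a fuzzy number.

```python
import numpy as np

# Hypothetical pairwise comparison matrix over four criteria
# (environmental, social, economic, technical); A[i, j] states how
# strongly criterion i is preferred over criterion j.
A = np.array([
    [1.0, 3.0, 2.0, 4.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/4, 1/2, 1/3, 1.0],
])

gm = A.prod(axis=1) ** (1 / A.shape[1])   # row geometric means
weights = gm / gm.sum()                   # normalised criterion weights
for name, w in zip(["environmental", "social", "economic", "technical"],
                   weights):
    print(f"{name:13s} weight ≈ {w:.3f}")
```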
Abstract:
A Canopy Height Profile (CHP) procedure presented in Harding et al. (2001) for large-footprint LiDAR data was tested in a closed-canopy environment as a way of extracting vertical foliage profiles from the LiDAR raw waveform. In this study, an adaptation of this method to small-footprint data is shown, tested and validated in an Australian sparse-canopy forest at plot and site level. Further, the methodology itself has been enhanced by implementing a dataset-adjusted reflectance ratio calculation according to Armston et al. (2013) in the processing chain, tested against a fixed ratio of 0.5 estimated for the laser wavelength of 1550 nm. As a by-product of the methodology, effective leaf area index (LAIe) estimates were derived and compared to hemispherical photography-derived values. To assess the influence of LiDAR aggregation area size on the estimates in a sparse-canopy environment, LiDAR CHPs and LAIes were generated by aggregating waveforms to plot- and site-level footprints (plot/site-aggregated) as well as in 5 m grids (grid-processed). LiDAR profiles were then compared to leaf biomass field profiles generated from field tree measurements. The correlation between field and LiDAR profiles was very high, with a mean R2 of 0.75 at plot level and 0.86 at site level for 55 plots and the corresponding 11 sites. Gridding had almost no impact on the correlation between LiDAR and field profiles (only a marginal improvement), nor did the dataset-adjusted reflectance ratio. However, gridding and the dataset-adjusted reflectance ratio were found to improve the correlation between raw-waveform LiDAR and hemispherical photography LAIe estimates, yielding the highest correlations of 0.61 at plot level and 0.83 at site level. This demonstrates the validity of the approach and the superiority of the dataset-adjusted reflectance ratio of Armston et al. (2013) over a fixed ratio of 0.5 for LAIe estimation, and shows the adequacy of small-footprint LiDAR data for LAIe estimation in discontinuous-canopy forests.
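For orientation, the standard gap-fraction route to effective leaf area index, the quantity compared against hemispherical photography above, is the Beer-Lambert relation LAIe = -ln(P_gap) / k. The gap fractions and the extinction coefficient k = 0.5 in this sketch are illustrative assumptions, not values from the study.

```python
import math

def lai_effective(gap_fraction: float, k: float = 0.5) -> float:
    """Effective LAI from canopy gap fraction via Beer-Lambert."""
    return -math.log(gap_fraction) / k

# Hypothetical per-plot gap fractions; denser canopies gap less.
for p_gap in (0.6, 0.4, 0.2):
    print(f"P_gap={p_gap:.1f} -> LAIe={lai_effective(p_gap):.2f}")
```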
Abstract:
Incorporating an emerging therapy as a new randomisation arm in a clinical trial that is open to recruitment would be desirable to researchers, regulators and patients to ensure that the trial remains current, new treatments are evaluated as quickly as possible, and the time and cost of determining optimal therapies are minimised. It may take many years to run a clinical trial from concept to reporting within a rapidly changing drug development environment; hence, for trials to be most useful in informing policy and practice, it is advantageous for them to be able to adapt to emerging therapeutic developments. This paper reports a comprehensive literature review on methodologies for, and practical examples of, amending an ongoing clinical trial by adding a new treatment arm. Relevant methodological literature describing the statistical considerations required when making this specific type of amendment is identified, and the key statistical concepts when planning the addition of a new treatment arm are extracted, assessed and summarised. For completeness, this includes an assessment of statistical recommendations within general adaptive design guidance documents. Examples of confirmatory ongoing trials designed within the frequentist framework that have added an arm in practice are reported, and the details of the amendments are reviewed. An assessment is made of how well the relevant statistical considerations were addressed in practice, and of the related implications. The literature review confirmed that there is currently no clear methodological guidance on this topic, but that such guidance would help this efficient design amendment to be used more frequently and appropriately in practice. Eight confirmatory trials were identified to have added a treatment arm, suggesting that trials can benefit from this amendment and that it is practically feasible; however, the trials were not always able to address the key statistical considerations, often leading to uninterpretable or invalid outcomes. If the statistical concepts identified within this review are considered and addressed during the design of a trial amendment, it is possible to effectively assess a new treatment arm within an ongoing trial without compromising the original trial outcomes.
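One of the statistical considerations in play is controlling the family-wise error rate once two experimental arms share a control. A deliberately simple (and conservative) Bonferroni split is sketched below purely for illustration; the review itself notes that no single adjustment method is mandated, and the alpha level here is an assumption.

```python
from scipy.stats import norm

alpha_family = 0.05
n_comparisons = 2                  # original arm + newly added arm vs control
alpha_each = alpha_family / n_comparisons

z_single = norm.ppf(1 - alpha_family / 2)   # two-sided, one comparison
z_adjusted = norm.ppf(1 - alpha_each / 2)   # two-sided, Bonferroni-adjusted

print(f"Critical z without adjustment: {z_single:.3f}")
print(f"Critical z with two comparisons: {z_adjusted:.3f}")
```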
Abstract:
This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial-stages transfer of morphosyntax, one goal of this program is to show how initial-stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for, and demonstrate how, this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement claims currently made on the basis of exclusively behavioral experiments.
Abstract:
A new methodology was created to measure the energy consumption and related greenhouse gas (GHG) emissions of a computer operating system (OS) across different device platforms. The methodology involved the direct power measurement of devices under different activity states. In order to cover all aspects of an OS, the methodology included measurements in various OS modes whilst, uniquely, also incorporating measurements when running an array of defined software activities, so as to include OS application management features. The methodology was demonstrated on a laptop and a phone that could each run multiple OSs; the results confirmed that the OS can significantly affect the energy consumption of devices. In particular, new versions of the Microsoft Windows OS were tested, highlighting significant differences between the OS versions on the same hardware. The developed methodology could enable a greater awareness of energy consumption during both the software development and software marketing processes.
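The arithmetic behind direct power measurement is simple: integrate sampled power over the duration of each activity state to get energy, then apply a grid emission factor to get GHG emissions. The power trace and the 0.4 kgCO2e/kWh factor below are hypothetical, included only to show the conversion.

```python
import numpy as np

power_w = np.array([8.1, 8.3, 12.7, 12.9, 9.0, 8.2])  # sampled device power (W)
dt_s = 10.0                                            # sampling interval (s)

# Trapezoidal integration of power over time yields energy in joules.
energy_j = float(np.sum((power_w[:-1] + power_w[1:]) / 2.0) * dt_s)
energy_kwh = energy_j / 3.6e6                          # 1 kWh = 3.6e6 J
ghg_g = energy_kwh * 0.4 * 1000                        # assumed grid factor

print(f"Energy: {energy_kwh * 1000:.3f} Wh, emissions: {ghg_g:.4f} gCO2e")
```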
Abstract:
The components of many signaling pathways have been identified, and there is now a need to conduct quantitative, data-rich temporal experiments for systems biology and modeling approaches to better understand pathway dynamics and regulation. Here we present a modified Western blotting method that allows the rapid and reproducible quantification and analysis of hundreds of data points per day on proteins and their phosphorylation state at individual sites. The approach is of particular use where samples show a high degree of sample-to-sample variability, such as primary cells from multiple donors. We present a case study on the analysis of >800 phosphorylation data points from three phosphorylation sites in three signaling proteins over multiple time points from platelets isolated from ten donors, demonstrating the technique's potential to determine kinetic and regulatory information from limited cell numbers and to investigate signaling variation within a population. We envisage the approach being of use in the analysis of many cellular processes, such as signaling pathway dynamics (to identify regulatory feedback loops) and potential drug/inhibitor responses, using primary cells and tissues to generate information about how a cell's physiological state changes over time.
Abstract:
Collocations between two satellite sensors are occasions where both sensors observe the same place at roughly the same time. We study collocations between the Microwave Humidity Sounder (MHS) on board NOAA-18 and the Cloud Profiling Radar (CPR) on board CloudSat. First, a simple method is presented to obtain those collocations, and this method is compared with a more complicated approach found in the literature. We present the statistical properties of the collocations, with particular attention to the effects of the differences in footprint size. For 2007, we find approximately two and a half million MHS measurements with CPR pixels close to their centrepoints. Most of those collocations contain at least ten CloudSat pixels and image relatively homogeneous scenes. In the second part, we present three possible applications for the collocations. Firstly, we use the collocations to validate an operational Ice Water Path (IWP) product from MHS measurements, produced by the National Environment Satellite, Data and Information System (NESDIS) in the Microwave Surface and Precipitation Products System (MSPPS). IWP values from the CloudSat CPR are found to be significantly larger than those from the MSPPS. Secondly, we compare the relation between IWP and MHS channel 5 (190.311 GHz) brightness temperature for two datasets: the collocated dataset and an artificial dataset. We find a larger variability in the collocated dataset. Finally, we use the collocations to train an artificial neural network and describe how we can use it to develop a new MHS-based IWP product. We also study the effect of adding measurements from the High Resolution Infrared Radiation Sounder (HIRS), channels 8 (11.11 μm) and 11 (8.33 μm). This shows a small improvement in retrieval quality. The collocations described in the article are available for public use.
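A simple collocation criterion of the kind described above can be sketched as a distance-plus-time threshold test: a CPR pixel is collocated with an MHS footprint when their great-circle separation and observation-time difference are both small enough. The thresholds and toy observations here are illustrative assumptions, not the values used in the study.

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance between points given in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * r * np.arcsin(np.sqrt(a))

# Toy observations: (lat, lon, time in seconds since some epoch).
mhs = (60.0, 10.0, 1000.0)
cpr = np.array([[60.05, 10.02, 1120.0],
                [61.00, 12.00, 1130.0]])

max_km, max_s = 7.5, 900.0   # hypothetical footprint and time windows
dist = haversine_km(mhs[0], mhs[1], cpr[:, 0], cpr[:, 1])
hit = (dist < max_km) & (np.abs(cpr[:, 2] - mhs[2]) < max_s)
print("Collocated CPR pixels:", np.nonzero(hit)[0])
```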
Abstract:
Genome-wide association studies (GWAS) have been widely used in the genetic dissection of complex traits. However, common methods are all based on a fixed-SNP-effect mixed linear model (MLM) and single-marker analysis, such as efficient mixed model analysis (EMMA). These methods require Bonferroni correction for multiple tests, which is often too conservative when the number of markers is extremely large. To address this concern, we proposed a random-SNP-effect MLM (RMLM) and a multi-locus RMLM (MRMLM) for GWAS. The RMLM simply treats the SNP effect as random, but it allows a modified Bonferroni correction to be used to calculate the threshold p-value for significance tests. The MRMLM is a multi-locus model including markers selected from the RMLM method with a less stringent selection criterion. Due to its multi-locus nature, no multiple-test correction is needed. Simulation studies show that the MRMLM is more powerful in QTN detection and more accurate in QTN effect estimation than the RMLM, which in turn is more powerful and accurate than the EMMA. To demonstrate the new methods, we analyzed six flowering-time-related traits in Arabidopsis thaliana and detected more genes than previously reported using the EMMA. Therefore, the MRMLM provides an alternative for multi-locus GWAS.
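The conservativeness problem is easy to see numerically: the standard Bonferroni threshold shrinks linearly with the number of markers. The sketch below shows only this baseline calculation; the paper's modified correction for the random-effect model is not reproduced here.

```python
# Standard Bonferroni thresholds at increasing marker counts, showing
# why a fixed family-wise alpha becomes punishing in large GWAS.
alpha = 0.05
for n_markers in (1_000, 100_000, 1_000_000):
    threshold = alpha / n_markers
    print(f"{n_markers:>9,} markers: per-test threshold p < {threshold:.2e}")
```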
Abstract:
Quantitative palaeoclimate reconstructions are widely used to evaluate climate model performance. Here, as part of an effort to provide such a data set for Australia, we examine the impact of analytical decisions and sampling assumptions on modern-analogue reconstructions using a continent-wide pollen data set. There is a high degree of correlation between temperature variables in the modern climate of Australia, but there is sufficient orthogonality in the variations of precipitation, summer and winter temperature and plant-available moisture to allow independent reconstructions of these four variables to be made. The method of analogue selection does not affect the reconstructions, although bootstrap resampling provides a more reliable technique for obtaining robust measures of uncertainty. The number of analogues used affects the quality of the reconstructions: the most robust reconstructions are obtained using five analogues. The quality of reconstructions based on post-1850 CE pollen samples differs little from those using samples from between 1450 and 1849 CE, showing that post-European-settlement modification of vegetation has no impact on the fidelity of the reconstructions, although it substantially increases the availability of potential analogues. Reconstructions based on core-top samples are more realistic than those using surface samples, but using only core-top samples would substantially reduce the number of available analogues and therefore increase the uncertainty of the reconstructions. Spatial and/or temporal averaging of pollen assemblages prior to analysis negatively affects the subsequent reconstructions for some variables and increases the associated uncertainties. In addition, the quality of the reconstructions is affected by the degree of spatial smoothing of the original climate data, with the best reconstructions obtained using climate data from a 0.5° resolution grid, which corresponds to the typical size of the pollen catchment. This study provides a methodology that can be used to produce reliable palaeoclimate reconstructions for Australia, filling a major gap in the data sets used to evaluate climate models.
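A minimal sketch of the modern-analogue technique evaluated above: find the k modern pollen samples closest to a fossil assemblage (squared-chord distance is the conventional metric) and average their associated climates. The assemblages and climate values are synthetic; k = 5 follows the paper's finding that five analogues give the most robust reconstructions.

```python
import numpy as np

rng = np.random.default_rng(1)
modern = rng.dirichlet(np.ones(12), size=200)   # 200 modern pollen spectra
climate = rng.uniform(5, 25, size=200)          # e.g. mean annual T (degC)
fossil = rng.dirichlet(np.ones(12))             # one fossil assemblage

def squared_chord(a, b):
    """Squared-chord distance between pollen proportion vectors."""
    return np.sum((np.sqrt(a) - np.sqrt(b)) ** 2, axis=-1)

d = squared_chord(modern, fossil)
best = np.argsort(d)[:5]                        # the 5 closest analogues
print(f"Reconstructed value: {climate[best].mean():.1f} "
      f"(analogue distances {np.round(d[best], 3)})")
```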