992 results for Calculation methodology
Abstract:
There is an ongoing debate on the environmental effects of genetically modified crops, to which this paper aims to contribute. First, data on the environmental impacts of genetically modified (GM) and conventional crops are collected from peer-reviewed journals; second, an analysis is conducted to examine which crop type is less harmful to the environment. Published data on environmental impacts are measured using an array of indicators, and their analysis requires normalisation and aggregation. Drawing on the composite indicators literature, this paper builds composite indicators to measure the impact of GM and conventional crops in three dimensions: (1) non-target key species richness, (2) pesticide use, and (3) aggregated environmental impact. The comparison between the three composite indicators for both crop types allows us to establish not only a ranking showing which crop type is preferable for the environment but also the probability that one crop type outperforms the other from an environmental perspective. Results show that GM crops tend to cause lower environmental impacts than conventional crops for the analysed indicators.
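As a rough illustration of the composite-indicator construction described above, the sketch below min-max normalises a set of impact indicators, aggregates them with a weighted mean, and bootstraps the probability that one crop type outperforms the other. The indicator values, equal weights, and normalisation scheme are illustrative assumptions, not the paper's actual data or method details.

```python
import numpy as np

rng = np.random.default_rng(0)

def composite(indicators, weights):
    """Min-max normalise each indicator column, then aggregate by weighted mean."""
    x = np.asarray(indicators, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    normalised = (x - lo) / (hi - lo)        # each column scaled to [0, 1]
    return normalised @ weights              # one composite score per study

# Hypothetical per-study impact indicators (rows: studies; columns: indicators;
# lower values = lower environmental impact).
gm   = rng.normal(0.4, 0.1, size=(30, 3))
conv = rng.normal(0.6, 0.1, size=(30, 3))
weights = np.full(3, 1 / 3)                  # equal weights (an assumption)

scores = composite(np.vstack([gm, conv]), weights)
gm_scores, conv_scores = scores[:30], scores[30:]

# Bootstrap estimate of P(GM composite impact < conventional composite impact).
draws = 10_000
p = np.mean(rng.choice(gm_scores, draws) < rng.choice(conv_scores, draws))
print(f"P(GM outperforms conventional) ≈ {p:.2f}")
```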
Abstract:
The sustainable intelligent building is a building that achieves the best combination of environmental, social, economic and technical values, and its sustainability assessment draws on systems engineering methods and multi-criteria decision-making. Accordingly, first, a wireless monitoring system for the sustainability parameters of intelligent buildings is implemented; second, the indicators and key issues for the sustainability of intelligent buildings, based on the “whole life cycle”, are investigated; third, a sustainability assessment model based on structure entropy and the fuzzy analytic hierarchy process is proposed.
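A minimal sketch of one component of such an assessment model: deriving criterion weights with a fuzzy analytic hierarchy process (Buckley's geometric-mean method). The triangular fuzzy pairwise comparisons below are invented for illustration, and the structure-entropy step is not reproduced.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for three criteria,
# e.g. environmental vs. social vs. economic value (illustrative values).
M = np.array([
    [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
])

# Fuzzy geometric mean of each row.
g = np.prod(M, axis=1) ** (1 / M.shape[0])   # one (l, m, u) triple per criterion
# Fuzzy weights: divide by the totals, with bounds flipped for the division.
total = g.sum(axis=0)                        # (l_tot, m_tot, u_tot)
w_fuzzy = g / total[::-1]                    # l/u_tot, m/m_tot, u/l_tot
# Defuzzify by centroid and renormalise to crisp weights.
w = w_fuzzy.mean(axis=1)
w /= w.sum()
print(np.round(w, 3))                        # ≈ [0.637 0.233 0.129]
```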
Abstract:
Incorporating an emerging therapy as a new randomisation arm in a clinical trial that is open to recruitment would be desirable to researchers, regulators and patients to ensure that the trial remains current, new treatments are evaluated as quickly as possible, and the time and cost of determining optimal therapies are minimised. It may take many years to run a clinical trial from concept to reporting within a rapidly changing drug development environment; hence, for trials to be most useful in informing policy and practice, it is advantageous for them to be able to adapt to emerging therapeutic developments. This paper reports a comprehensive literature review on methodologies for, and practical examples of, amending an ongoing clinical trial by adding a new treatment arm. Relevant methodological literature describing the statistical considerations required when making this specific type of amendment is identified, and the key statistical concepts when planning the addition of a new treatment arm are extracted, assessed and summarised. For completeness, this includes an assessment of statistical recommendations within general adaptive design guidance documents. Examples of ongoing confirmatory trials designed within the frequentist framework that have added an arm in practice are reported, and the details of the amendments are reviewed. An assessment is made of how well the relevant statistical considerations were addressed in practice, and of the related implications. The literature review confirmed that there is currently no clear methodological guidance on this topic, but that guidance would be advantageous in helping this efficient design amendment to be used more frequently and appropriately in practice. Eight confirmatory trials were identified to have added a treatment arm, suggesting that trials can benefit from this amendment and that it can be practically feasible; however, the trials were not always able to address the key statistical considerations, often leading to uninterpretable or invalid outcomes. If the statistical concepts identified within this review are considered and addressed during the design of a trial amendment, it is possible to effectively assess a new treatment arm within an ongoing trial without compromising the original trial outcomes.
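One of the key statistical considerations the review identifies, multiplicity arising when several experimental arms are compared against a shared control, can be seen in a short simulation. The sample sizes, two-sample t-test and Bonferroni adjustment below are illustrative choices for the sketch, not recommendations extracted from the review.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, sims, alpha, k = 100, 10_000, 0.05, 2
hits_raw = hits_adj = 0

for _ in range(sims):
    control = rng.normal(0.0, 1.0, n)
    # k experimental arms with NO true effect, each tested against the
    # same control group (as when an arm is added to an ongoing trial).
    pvals = [stats.ttest_ind(rng.normal(0.0, 1.0, n), control).pvalue
             for _ in range(k)]
    hits_raw += any(p < alpha for p in pvals)       # unadjusted
    hits_adj += any(p < alpha / k for p in pvals)   # Bonferroni-adjusted

print(f"familywise error, unadjusted: {hits_raw / sims:.3f}")   # ≈ 0.09
print(f"familywise error, adjusted:   {hits_adj / sims:.3f}")   # ≤ 0.05
```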
Abstract:
COCO-2 is a model for assessing the potential economic costs likely to arise off-site following an accident at a nuclear reactor. It builds on the COCO-1 model, developed in 1991, by considering economic effects in more detail and by including more sources of loss. Of particular note are the consideration of the directly affected local economy, indirect losses that stem from the directly affected businesses, losses due to changes in tourism consumption, integration with the large body of work on recovery after an accident, and a more systematic approach to health costs. The work is based, where possible, on official data sources for reasons of traceability, maintenance and ease of future development. This report describes the methodology and discusses the results of an example calculation. Guidance on how the base economic data can be updated in the future is also provided.
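A minimal sketch of the kind of off-site cost aggregation such a model performs, using the loss categories named in the abstract; the figures and the flat summation are illustrative assumptions, not COCO-2's actual formulation, which treats each category in far more detail.

```python
from dataclasses import dataclass, astuple

@dataclass
class OffSiteCosts:
    """Hypothetical off-site cost categories, in pounds sterling."""
    direct_local_economy: float   # losses in directly affected businesses
    indirect_losses: float        # knock-on losses from affected businesses
    tourism_losses: float         # changes in tourism consumption
    recovery_costs: float         # countermeasures and remediation
    health_costs: float           # monetised health detriment

    def total(self) -> float:
        return sum(astuple(self))

example = OffSiteCosts(120e6, 45e6, 30e6, 210e6, 80e6)   # invented figures
print(f"total off-site cost: £{example.total():,.0f}")
```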
Abstract:
This article has two main objectives. First, we offer an introduction to the subfield of generative third language (L3) acquisition. Concerned primarily with modeling initial-stages transfer of morphosyntax, one goal of this program is to show how initial-stages L3 data make significant contributions toward a better understanding of how the mind represents language and how (cognitive) economy constrains acquisition processes more generally. Our second objective is to argue for, and demonstrate how, this subfield will benefit from a neuro/psycholinguistic methodological approach, such as event-related potential experiments, to complement the claims currently made on the basis of exclusively behavioral experiments.
Abstract:
A new methodology was created to measure the energy consumption and related greenhouse gas (GHG) emissions of a computer operating system (OS) across different device platforms. The methodology involved the direct power measurement of devices under different activity states. In order to include all aspects of an OS, the methodology included measurements in various OS modes whilst, uniquely, also incorporating measurements taken while running an array of defined software activities, so as to include OS application-management features. The methodology was demonstrated on a laptop and a phone that could each run multiple OSs; the results confirmed that the OS can significantly impact the energy consumption of devices. In particular, testing successive versions of the Microsoft Windows OS highlighted significant differences between OS versions on the same hardware. The developed methodology could enable a greater awareness of energy consumption during both the software development and software marketing processes.
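The underlying arithmetic, integrating measured power over time in each activity state and converting the energy to emissions with a grid carbon-intensity factor, can be sketched as follows. The states, power draws, durations and the 0.4 kgCO2e/kWh factor are invented for illustration, not figures from the study.

```python
# Mean measured power per OS activity state (watts) and a hypothetical
# daily usage profile (hours per state).
power_w = {"idle": 4.2, "video_playback": 9.8, "office_suite": 7.1, "sleep": 0.6}
hours_per_day = {"idle": 3.0, "video_playback": 1.5, "office_suite": 4.0,
                 "sleep": 15.5}
carbon_intensity = 0.4   # kgCO2e per kWh, grid-dependent (assumption)

daily_kwh = sum(power_w[s] * hours_per_day[s] for s in power_w) / 1000
annual_kwh = daily_kwh * 365
print(f"annual energy: {annual_kwh:.1f} kWh, "
      f"GHG: {annual_kwh * carbon_intensity:.1f} kgCO2e")
```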
Abstract:
The components of many signaling pathways have been identified, and there is now a need to conduct quantitative, data-rich temporal experiments for systems biology and modeling approaches to better understand pathway dynamics and regulation. Here we present a modified Western blotting method that allows the rapid and reproducible quantification and analysis of hundreds of data points per day on proteins and their phosphorylation state at individual sites. The approach is of particular use where samples show a high degree of sample-to-sample variability, such as primary cells from multiple donors. We present a case study on the analysis of >800 phosphorylation data points from three phosphorylation sites in three signaling proteins over multiple time points from platelets isolated from ten donors, demonstrating the technique's potential to determine kinetic and regulatory information from limited cell numbers and to investigate signaling variation within a population. We envisage the approach being of use in the analysis of many cellular processes, such as signaling pathway dynamics, to identify regulatory feedback loops, and in the investigation of potential drug/inhibitor responses, using primary cells and tissues to generate information about how a cell's physiological state changes over time.
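A minimal sketch of the quantification step in a workflow like this one: each phospho-band intensity is normalised to a loading control and expressed as fold-change over the baseline time point, then summarised across donors. The simulated arrays stand in for real densitometry values, and the paper's exact normalisation scheme may differ.

```python
import numpy as np

rng = np.random.default_rng(2)
donors, timepoints = 10, 8
raw = rng.lognormal(0.0, 0.4, size=(donors, timepoints))      # phospho-band signal
loading = rng.lognormal(0.0, 0.1, size=(donors, timepoints))  # loading control

norm = raw / loading          # correct for lane-to-lane loading differences
norm /= norm[:, [0]]          # fold-change relative to each donor's t=0 sample
mean = norm.mean(axis=0)
sem = norm.std(axis=0, ddof=1) / np.sqrt(donors)
for t, (m, s) in enumerate(zip(mean, sem)):
    print(f"t{t}: {m:.2f} ± {s:.2f} (fold-change, mean ± SEM, n={donors})")
```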
Abstract:
Collocations between two satellite sensors are occasions where both sensors observe the same place at roughly the same time. We study collocations between the Microwave Humidity Sounder (MHS) on board NOAA-18 and the Cloud Profiling Radar (CPR) on board CloudSat. First, a simple method is presented to obtain those collocations, and this method is compared with a more complicated approach found in the literature. We present the statistical properties of the collocations, with particular attention to the effects of the differences in footprint size. For 2007, we find approximately two and a half million MHS measurements with CPR pixels close to their centre points. Most of those collocations contain at least ten CloudSat pixels and image relatively homogeneous scenes. In the second part, we present three possible applications for the collocations. Firstly, we use the collocations to validate an operational Ice Water Path (IWP) product from MHS measurements, produced by the National Environmental Satellite, Data, and Information Service (NESDIS) in the Microwave Surface and Precipitation Products System (MSPPS). IWP values from the CloudSat CPR are found to be significantly larger than those from the MSPPS. Secondly, we compare the relation between IWP and MHS channel 5 (190.311 GHz) brightness temperature for two datasets: the collocated dataset and an artificial dataset. We find a larger variability in the collocated dataset. Finally, we use the collocations to train an artificial neural network and describe how we can use it to develop a new MHS-based IWP product. We also study the effect of adding measurements from the High Resolution Infrared Radiation Sounder (HIRS), channels 8 (11.11 μm) and 11 (8.33 μm); this shows a small improvement in the retrieval quality. The collocations described in the article are available for public use.
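A minimal sketch of the collocation criterion itself: a CPR pixel is matched to an MHS footprint when it lies within the footprint radius and within a time window of the MHS observation. The radius, window and coordinates below are illustrative, not the values used in the study.

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * np.arcsin(np.sqrt(a))

def collocate(mhs, cpr, radius_km=7.5, max_dt_s=900.0):
    """For each (lat, lon, t) MHS footprint, return indices of CPR pixels inside it."""
    matches = []
    for lat, lon, t in mhs:
        d = haversine_km(lat, lon, cpr[:, 0], cpr[:, 1])
        ok = (d <= radius_km) & (np.abs(cpr[:, 2] - t) <= max_dt_s)
        matches.append(np.flatnonzero(ok))
    return matches

# Hypothetical footprints/pixels as (lat, lon, time-in-seconds) rows.
mhs = np.array([[60.0, 10.0, 0.0]])
cpr = np.array([[60.01, 10.02, 120.0], [60.5, 10.0, 60.0]])
print(collocate(mhs, cpr))   # first pixel matches; second is too far away
```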
Abstract:
Genome-wide association studies (GWAS) have been widely used in the genetic dissection of complex traits. However, common methods are all based on a fixed-SNP-effect mixed linear model (MLM) and single-marker analysis, such as efficient mixed-model association (EMMA). These methods require Bonferroni correction for multiple tests, which is often too conservative when the number of markers is extremely large. To address this concern, we propose a random-SNP-effect MLM (RMLM) and a multi-locus RMLM (MRMLM) for GWAS. The RMLM simply treats the SNP effect as random, but it allows a modified Bonferroni correction to be used to calculate the threshold p-value for significance tests. The MRMLM is a multi-locus model including markers selected from the RMLM method with a less stringent selection criterion. Owing to its multi-locus nature, no multiple-test correction is needed. Simulation studies show that the MRMLM is more powerful in QTN detection and more accurate in QTN effect estimation than the RMLM, which in turn is more powerful and accurate than the EMMA. To demonstrate the new methods, we analyzed six flowering-time-related traits in Arabidopsis thaliana and detected more genes than previously reported using the EMMA. Therefore, the MRMLM provides an alternative for multi-locus GWAS.
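For context, the baseline these methods improve on, a single-marker scan thresholded by the standard Bonferroni correction, can be sketched as follows. Kinship and random effects are omitted for brevity, and the simulated genotypes, phenotype model and causal SNP are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, m = 200, 5000
geno = rng.integers(0, 3, size=(n, m)).astype(float)   # 0/1/2 allele counts
beta = np.zeros(m)
beta[42] = 0.8                                          # one causal SNP
pheno = geno @ beta + rng.normal(0.0, 1.0, n)

# Single-marker test per SNP (equivalent to simple-regression significance).
pvals = np.array([stats.pearsonr(geno[:, j], pheno)[1] for j in range(m)])

threshold = 0.05 / m                                    # standard Bonferroni
print("significant SNPs:", np.flatnonzero(pvals < threshold))   # -> [42]
```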
Abstract:
Quantitative palaeoclimate reconstructions are widely used to evaluate climate model performance. Here, as part of an effort to provide such a data set for Australia, we examine the impact of analytical decisions and sampling assumptions on modern-analogue reconstructions using a continent-wide pollen data set. There is a high degree of correlation between temperature variables in the modern climate of Australia, but there is sufficient orthogonality in the variations of precipitation, summer and winter temperature, and plant-available moisture to allow independent reconstructions of these four variables to be made. The method of analogue selection does not affect the reconstructions, although bootstrap resampling provides a more reliable technique for obtaining robust measures of uncertainty. The number of analogues used affects the quality of the reconstructions: the most robust reconstructions are obtained using five analogues. The quality of reconstructions based on post-1850 CE pollen samples differs little from that of reconstructions using samples from between 1450 and 1849 CE, showing that post-settlement European modification of vegetation has no impact on the fidelity of the reconstructions, although it substantially increases the availability of potential analogues. Reconstructions based on core-top samples are more realistic than those using surface samples, but using only core-top samples would substantially reduce the number of available analogues and therefore increase the uncertainty of the reconstructions. Spatial and/or temporal averaging of pollen assemblages prior to analysis negatively affects the subsequent reconstructions for some variables and increases the associated uncertainties. In addition, the quality of the reconstructions is affected by the degree of spatial smoothing of the original climate data, with the best reconstructions obtained using climate data from a 0.5° resolution grid, which corresponds to the typical size of the pollen catchment. This study provides a methodology that can be used to produce reliable palaeoclimate reconstructions for Australia, which will fill a major gap in the data sets used to evaluate climate models.
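A minimal sketch of the modern-analogue step: the k closest modern pollen samples to a fossil assemblage are selected by squared-chord distance, and their observed climates are averaged, with a bootstrap spread as the uncertainty estimate. The simulated assemblages and climate values are illustrative, and the study's full screening and averaging protocol is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(4)

def squared_chord(fossil, modern):
    """Dissimilarity between one fossil spectrum and each modern spectrum."""
    return ((np.sqrt(fossil) - np.sqrt(modern)) ** 2).sum(axis=1)

n_modern, n_taxa, k = 500, 20, 5              # k = 5 analogues, as in the study
modern = rng.dirichlet(np.ones(n_taxa), n_modern)   # modern pollen proportions
temp = rng.uniform(5.0, 30.0, n_modern)             # climate at each modern site
fossil = rng.dirichlet(np.ones(n_taxa))             # fossil assemblage

d = squared_chord(fossil, modern)
analogues = np.argsort(d)[:k]                       # k closest modern samples
estimate = temp[analogues].mean()

boot = [temp[rng.choice(analogues, k)].mean() for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"reconstructed temperature: {estimate:.1f} °C ({lo:.1f}–{hi:.1f})")
```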
Abstract:
This paper argues that the intellectual contribution of Alan Rugman reflects his distinctive research methodology. Alan Rugman trained as an economist, and relied heavily on economic principles throughout his work. He believed that one good theory was sufficient for IB studies, and that theory, he maintained, was internalisation theory. He rejected theoretical pluralism, and believed that IB suffered from a surfeit of theories. Alan was a positivist. The test of a good theory was that it led to clear predictions which were corroborated by empirical evidence. Many IB theories, Alan believed, were weak; their proliferation sowed confusion and they needed to be refuted. Alan’s interpretation of internalisation was, however, unconventional in some respects. He played down the trade-offs presented in Coase’s original work, and substituted heuristics in their place. Instead of analysing internalisation as a context-specific choice between alternative contractual arrangements, he presented it as a strategic imperative for firms possessing strong knowledge advantages. His heuristics did not apply to every possible case, but in Alan’s view they applied in the great majority of cases and were therefore a basis for management action.
Abstract:
Solvent-free desymmetrisation of a meso-dialdehyde with chiral alcohols led to the preparation of 4-silyloxy-6-alkyloxytetrahydro-2H-pyran-2-one derivatives with 96% de. This methodology, which yields the corresponding methyl nor-mevaldates with 99% ee, has been applied to the enantioselective synthesis of the (-)-(R)- and (+)-(S)-nor-mevalonic acid lactones.
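For reference, the diastereomeric excess (de) and enantiomeric excess (ee) figures quoted above are the standard ratios of isomer abundances:

```latex
\mathrm{de} = \frac{[\text{major diastereomer}] - [\text{minor diastereomer}]}
                   {[\text{major diastereomer}] + [\text{minor diastereomer}]} \times 100\%
\qquad
\mathrm{ee} = \frac{[R] - [S]}{[R] + [S]} \times 100\%
```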
Abstract:
A reorientation is needed in the methodological debate about the role of intuitions in philosophy. Methodological debate has lost sight of the reason why it makes sense to focus on questions about intuitions when thinking about the methods or epistemology of philosophy. The problem is an approach to methodology that gives a near-exclusive focus to questions about some evidential role that intuitions may or may not play in philosophers' arguments. A new approach is needed. Approaching methodological questions about the role of intuitions in philosophy with an abductive model of philosophical enquiry in mind will help ensure that the debate doesn't lose sight of what motivates it.