819 results for Observational methodology
Abstract:
A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. To counteract the use of high-level, top-down modeling efforts, and to increase result accuracy, the model focuses on device details and data routes. To compare ESD with a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides both ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO2e of server and networking devices was apportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model, which revealed potential CO2e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods.
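As a rough illustration of the bottom-up accounting this kind of model performs, the minimal Python sketch below tallies operational energy across server, network-hop and client stages and converts it to CO2e. Every function name and parameter value here is a hypothetical placeholder, not a figure from the paper.

GRID_CO2E_PER_KWH = 0.5  # kg CO2e per kWh of electricity (assumed grid intensity)

def esd_co2e(file_gb, server_kwh_per_gb, hops, kwh_per_gb_per_hop,
             client_kwh, embedded_kg):
    """Total CO2e (kg) for one electronic software distribution event."""
    # Operational energy: server delivery plus per-hop network transfer.
    transfer_kwh = file_gb * (server_kwh_per_gb + hops * kwh_per_gb_per_hop)
    operational = (transfer_kwh + client_kwh) * GRID_CO2E_PER_KWH
    # Embedded CO2e of the hardware, apportioned to this single event.
    return operational + embedded_kg

# Example: a 4 GB download routed over 12 network hops (all values illustrative).
print(esd_co2e(file_gb=4.0, server_kwh_per_gb=0.01, hops=12,
               kwh_per_gb_per_hop=0.002, client_kwh=0.05, embedded_kg=0.02))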
Abstract:
A new approach to the study of the local organization in amorphous polymer materials is presented. The method couples neutron diffraction experiments that explore the structure on the spatial scale 1–20 Å with the reverse Monte Carlo fitting procedure to predict structures that accurately represent the experimental scattering results over the whole momentum transfer range explored. Molecular mechanics and molecular dynamics techniques are also used to produce atomistic models independently from any experimental input, thereby providing a test of the viability of the reverse Monte Carlo method in generating realistic models for amorphous polymeric systems. An analysis of the obtained models in terms of single-chain properties and of orientational correlations between chain segments is presented. We show the viability of the method with data from molten polyethylene. The analysis yields a model with average C-C and C-H bond lengths of 1.55 Å and 1.1 Å, respectively, an average backbone valence angle of 112°, a torsional angle distribution characterized by a fraction of trans conformers of 0.67 and, finally, a weak interchain orientational correlation at around 4 Å.
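The accept/reject step at the heart of a reverse Monte Carlo fit can be sketched as follows. This is a toy Python version, assuming a Debye-sum "structure factor" and random coordinates as stand-ins for the neutron data and the atomistic model; it is not the authors' code.

import numpy as np

rng = np.random.default_rng(0)

def toy_sq(positions, q=np.linspace(0.5, 10, 50)):
    """Toy isotropic structure factor from pairwise distances (Debye sum)."""
    d = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    d = d[np.triu_indices_from(d, k=1)]
    return 1 + (np.sin(q[:, None] * d) / (q[:, None] * d)).mean(axis=1)

def chi_squared(s_model, s_exp, sigma=0.05):
    """Goodness of fit between model and 'experimental' structure factors."""
    return np.sum((s_model - s_exp) ** 2) / sigma**2

def rmc_step(positions, s_exp, chi2_old, step=0.1):
    """One RMC move: displace a random atom; accept if agreement improves,
    or with a Metropolis-style probability if it worsens."""
    trial = positions.copy()
    i = rng.integers(len(trial))
    trial[i] += rng.normal(scale=step, size=3)
    chi2_new = chi_squared(toy_sq(trial), s_exp)
    if chi2_new < chi2_old or rng.random() < np.exp(-(chi2_new - chi2_old) / 2):
        return trial, chi2_new
    return positions, chi2_old

pos = rng.uniform(0, 10, (20, 3))
target = toy_sq(rng.uniform(0, 10, (20, 3)))  # stand-in for measured data
chi2 = chi_squared(toy_sq(pos), target)
for _ in range(200):
    pos, chi2 = rmc_step(pos, target, chi2)
print(f"final chi^2: {chi2:.2f}")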
Abstract:
A statistical methodology is proposed and tested for the analysis of extreme values of atmospheric wave activity at mid-latitudes. The adopted methods are the classical block-maximum and peak-over-threshold approaches, respectively based on the generalized extreme value (GEV) distribution and the generalized Pareto distribution (GPD). Time series of the ‘Wave Activity Index’ (WAI) and the ‘Baroclinic Activity Index’ (BAI) are computed from simulations of the General Circulation Model ECHAM4.6, which is run under perpetual January conditions. Both the GEV and the GPD analyses indicate that the extremes of WAI and BAI are Weibull distributed, which corresponds to distributions with an upper bound. However, a remarkably large variability is found in the tails of such distributions; distinct simulations carried out under the same experimental setup provide appreciably different estimates of the 200-yr WAI return level. The consequences of this phenomenon in applications of the methodology to climate change studies are discussed. The atmospheric configurations characteristic of the maxima and minima of WAI and BAI are also examined.
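A minimal Python sketch of the two fitting approaches, using scipy and synthetic Gumbel data as a stand-in for the WAI series (the real indices come from the ECHAM4.6 runs):

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in daily wave-activity index: 47 model years of 360 days each.
wai = rng.gumbel(loc=10.0, scale=2.0, size=47 * 360)

# Block maxima: one maximum per model year, fitted with the GEV distribution.
annual_max = wai.reshape(47, 360).max(axis=1)
shape, loc, scale = stats.genextreme.fit(annual_max)

# 200-yr return level: the quantile exceeded on average once in 200 years.
rl_200 = stats.genextreme.ppf(1 - 1 / 200, shape, loc=loc, scale=scale)

# Peaks over threshold: exceedances above a high quantile, fitted with the GPD.
u = np.quantile(wai, 0.98)
excess = wai[wai > u] - u
c, gp_loc, gp_scale = stats.genpareto.fit(excess, floc=0.0)

print(f"GEV shape {shape:.2f}, 200-yr return level {rl_200:.1f}, GPD shape {c:.2f}")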
Abstract:
This paper examines the changes in the length of commercial property leases over the last decade and presents an analysis of the consequent investment and occupational pricing implications for commercial property investments. It is argued that the pricing implications of a short lease to an investor are contingent upon the expected costs of the letting termination to the investor, the probability that the letting will be terminated and the volatility of rental values. The paper examines the key factors influencing these variables and presents a framework for incorporating their effects into pricing models. Approaches to their valuation derived from option pricing are critically assessed. It is argued that such models also tend to neglect the price effects of specific risk factors such as tenant circumstances and the terms of the break clause. Specific risk factors have a significant bearing on the probability of letting termination and on the level of the resultant financial losses. The merits of a simulation methodology are examined for rental and capital valuations of short leases and properties with break clauses. It is concluded that, in addition to the rigour of its internal logic, the success of any methodology is predicated upon the accuracy of the inputs. The lack of reliable data on patterns in, and incidence of, lease termination and the lack of reliable time series of historic property performance limit the efficacy of financial models.
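A toy Python sketch of the kind of simulation methodology discussed, valuing a lease with a break clause by Monte Carlo. The rent level, volatility, break probability and void length are all hypothetical inputs, not figures from the paper.

import numpy as np

rng = np.random.default_rng(1)

def simulate_lease_pv(rent=100.0, years=10, break_year=5, p_break=0.3,
                      rent_vol=0.1, void_years=1, disc=0.07, n_sims=5000):
    """Monte Carlo present value of a lease with a break clause: market rent
    follows a lognormal random walk; with probability p_break the tenant quits
    at the break, leaving a void before reletting at the then-current rent."""
    pv = np.zeros(n_sims)
    for s in range(n_sims):
        market = rent
        broke = rng.random() < p_break
        for t in range(1, years + 1):
            market *= np.exp(rng.normal(-0.5 * rent_vol**2, rent_vol))
            if broke and break_year < t <= break_year + void_years:
                continue  # void period: no income after the break
            income = rent if (not broke or t <= break_year) else market
            pv[s] += income / (1 + disc) ** t
    return pv.mean(), pv.std()

mean_pv, sd_pv = simulate_lease_pv()
print(f"expected PV {mean_pv:.1f} (sd {sd_pv:.1f})")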
Abstract:
Blood clotting response (BCR) resistance tests are available for a number of anticoagulant rodenticides. However, during the development of these tests many of the test parameters have been changed, making meaningful comparisons between results difficult. It was recognised that a standard methodology was urgently required for future BCR resistance tests and, accordingly, this document presents a reappraisal of published tests and proposes a standard protocol for future use (see Appendix). The protocol can be used to provide information on the incidence and degree of resistance in a particular rodent population; to provide a simple comparison of resistance factors between active ingredients, thus giving clear information about cross-resistance for any given strain; and to provide comparisons of susceptibility or resistance between different populations. The methodology has a sound statistical basis, being founded on the ED50 response, and requires many fewer animals than the resistance tests in current use. Most importantly, tests can be used to give a clear indication of the likely practical impact of the resistance on field efficacy. The present study was commissioned and funded by the Rodenticide Resistance Action Committee (RRAC) of CropLife International.
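As an illustration of an ED50-based analysis (not the RRAC protocol itself), one can fit a log-dose logistic curve to dose-response data and read off the dose giving a 50% response. The doses and counts below are hypothetical.

import numpy as np
from scipy.optimize import curve_fit

# Hypothetical BCR test data: dose (mg/kg) vs. animals responding out of 20.
dose = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
responding = np.array([1, 3, 6, 12, 17, 19])
n_tested = np.full_like(responding, 20)

def logistic(d, ed50, slope):
    """Log-dose logistic response curve; ED50 is the dose giving 50% response."""
    return 1.0 / (1.0 + np.exp(-slope * (np.log(d) - np.log(ed50))))

popt, _ = curve_fit(logistic, dose, responding / n_tested, p0=[2.0, 1.0])
print(f"estimated ED50: {popt[0]:.2f} mg/kg")

# A resistance factor could then be taken as the ratio of ED50s between a
# suspect population and a known-susceptible strain.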
Abstract:
Maincrop potato yields in Scotland have increased by 30–35 t ha-1 since 1960 as a result of many changes, but has changing climate contributed anything to this? The purpose of this work was to answer this question. Daily weather data for the period 1960–2006 were analysed for five locations covering the zones of potato growing on the east coast of Scotland (between 55.213 and 57.646°N) to determine trends in temperature, rainfall and solar radiation. A physiologically based potato yield model was validated using data obtained from a long-term field trial in eastern Scotland and then employed to simulate crop development and potential yield at each of the five sites. Over the 47 years, there were significant increases in annual air and 30 cm soil temperatures (0.27 and 0.30 K decade-1, respectively), but no significant changes in annual precipitation or in the timing of the last frost in spring and the first frost of autumn. There was no evidence of any north to south gradient of warming. Simulated emergence and canopy closure became earlier at all five sites over the period, with the advance being greater in the north (3.7 and 3.6 days decade-1, respectively) than the south (0.5 and 0.8 days decade-1, respectively). Potential yield increased with time, generally reflecting the increased duration of the green canopy, at average rates of 2.8 t ha-1 decade-1 for chitted seed (sprouted prior to planting) and 2.5 t ha-1 decade-1 for unchitted seed. The measured warming could contribute potential yield increases of up to 13.2 t ha-1 for chitted potato (range 7.1–19.3 t ha-1) and 11.5 t ha-1 for unchitted potato (range 7.1–15.5 t ha-1), equivalent to 34–39% of the increased potential yield over the period or 23–26% of the increase in actual measured yields.
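Trend estimates of the kind quoted (K per decade) follow from a least-squares fit to an annual series. A minimal Python sketch with synthetic data standing in for a station record:

import numpy as np

rng = np.random.default_rng(7)
# Synthetic stand-in for an annual mean air-temperature series, 1960-2006.
years = np.arange(1960, 2007)
temp = 8.0 + 0.027 * (years - 1960) + rng.normal(0, 0.4, years.size)

# Least-squares slope in K per year, reported per decade as in the abstract.
slope, intercept = np.polyfit(years, temp, 1)
print(f"warming trend: {slope * 10:.2f} K per decade")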
Abstract:
The Solar TErrestrial RElations Observatory (STEREO) provides high cadence and high resolution images of the structure and morphology of coronal mass ejections (CMEs) in the inner heliosphere. CME directions and propagation speeds have often been estimated through the use of time-elongation maps obtained from the STEREO Heliospheric Imager (HI) data. Many of these CMEs have been identified by citizen scientists working within the SolarStormWatch project (www.solarstormwatch.com) as they work towards providing robust real-time identification of Earth-directed CMEs. The wide field of view of HI allows scientists to directly observe the two-dimensional (2D) structures, while the relative simplicity of time-elongation analysis means that it can be easily applied to many such events, thereby enabling a much deeper understanding of how CMEs evolve between the Sun and the Earth. For events with certain orientations, both the rear and front edges of the CME can be monitored at varying heliocentric distances (R) between the Sun and 1 AU. Here we take four example events with measurable position angle widths that were identified by the citizen scientists. These events were chosen for the clarity of their structure within the HI cameras and their long track lengths in the time-elongation maps. We show a linear dependence on R for the growth of the radial width (W) and the 2D aspect ratio (χ) of these CMEs, which are measured out to ≈ 0.7 AU. We estimated the radial width from a linear best fit for the average of the four CMEs, obtaining the relationships W = 0.14R + 0.04 for the width and χ = 2.5R + 0.86 for the aspect ratio (W and R in units of AU).
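The fitted relations can be applied directly; a small Python sketch evaluating the abstract's W and χ fits at a few heliocentric distances (valid only out to roughly 0.7 AU, the range of the measurements):

# W = 0.14 R + 0.04 and chi = 2.5 R + 0.86, with W and R in AU.
def cme_width(r_au):
    return 0.14 * r_au + 0.04

def cme_aspect_ratio(r_au):
    return 2.5 * r_au + 0.86

for r in (0.1, 0.4, 0.7):
    print(f"R = {r:.1f} AU: W = {cme_width(r):.3f} AU, "
          f"chi = {cme_aspect_ratio(r):.2f}")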
Abstract:
Summer rainfall over China has experienced substantial variability on longer time scales during the last century, and the question remains whether this is due to natural, internal variability or is part of the emerging signal of anthropogenic climate change. Using the best available observations over China, the decadal variability and recent trends in summer rainfall are investigated with the emphasis on changes in the seasonal evolution and on the temporal characteristics of daily rainfall. The possible relationships with global warming are reassessed. Substantial decadal variability in summer rainfall has been confirmed during the period 1958–2008; this is not unique to this period but is also seen in the earlier decades of the twentieth century. Two dominant patterns of decadal variability have been identified that contribute substantially to the recent trend of southern flooding and northern drought. Natural decadal variability appears to dominate in general, but for rainfall intensity and the frequency of rainfall days, particularly light rain days, the dominant EOFs have a rather different character, being of one sign over most of China and having principal components (PCs) that appear more trendlike. The increasing intensity of rainfall throughout China and the decrease in light rainfall days, particularly in the north, could at least partially be of anthropogenic origin, both global and regional, linked to increased greenhouse gases and increased aerosols.
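A minimal sketch of the EOF/PC decomposition such analyses rest on, via singular value decomposition of an anomaly matrix; the array sizes and random data are stand-ins for the real station records.

import numpy as np

rng = np.random.default_rng(3)
# Stand-in rainfall anomalies: 51 summers x 160 stations (real data: 1958-2008).
field = rng.normal(size=(51, 160))
anom = field - field.mean(axis=0)

# EOF analysis via SVD: rows of vt are spatial patterns (EOFs),
# u * s gives the corresponding principal-component time series.
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)
pc1 = u[:, 0] * s[0]   # leading principal component (time series)
eof1 = vt[0]           # leading spatial pattern
print(f"EOF1 explains {explained[0]:.1%} of the variance")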
Abstract:
Salmonella enterica serotypes Derby, Mbandaka, Montevideo, Livingstone, and Senftenberg were among the 10 most prevalent serotypes isolated from farm animals in England and Wales in 1999. These serotypes are of potential zoonotic relevance; however, there is currently no "gold standard" fingerprinting method for them. A collection of isolates representing the former serotypes and serotype Gold Coast was analyzed using plasmid profiling, pulsed-field gel electrophoresis (PFGE), and ribotyping. The success of the molecular methods in identifying DNA polymorphisms differed for each serotype. Plasmid profiling was particularly useful for serotype Derby isolates, and it also provided a good level of discrimination for serotype Senftenberg. For most serotypes, we observed a number of nontypeable plasmid-free strains, which represents a limitation of this technique. Fingerprinting of genomic DNA by ribotyping and PFGE produced significantly varying results, depending on the serotype of the strain. Both PstI/SphI ribotyping and XbaI-PFGE provided a similar degree of strain differentiation for serotype Derby and serotype Senftenberg, only marginally lower than that achieved by plasmid profiling. Ribotyping was less sensitive than PFGE when applied to serotype Mbandaka or serotype Montevideo. Serotype Gold Coast isolates were found to be nontypeable by XbaI-PFGE, and a significant proportion of them were found to be plasmid free. A similar situation applies to a number of serotype Livingstone isolates, which were nontypeable by plasmid profiling and/or PFGE. In summary, the serotype of the isolates has a considerable influence on the choice of the best typing strategy; a single method cannot be relied upon for discriminating between strains, and a combination of typing methods allows further discrimination.
Abstract:
Inducing rules from very large datasets is one of the most challenging areas in data mining. Several approaches exist to scaling up classification rule induction to large datasets, namely data reduction and the parallelisation of classification rule induction algorithms. In the area of parallelisation of classification rule induction algorithms, most of the work has concentrated on the Top Down Induction of Decision Trees (TDIDT), also known as the ‘divide and conquer’ approach. However, powerful alternative algorithms exist that induce modular rules. Most of these alternative algorithms follow the ‘separate and conquer’ approach of inducing rules, but very little work has been done to make the ‘separate and conquer’ approach scale better on large training data. This paper examines the potential of the recently developed blackboard-based J-PMCRI methodology for parallelising modular classification rule induction algorithms that follow the ‘separate and conquer’ approach. A concrete implementation of the methodology is evaluated empirically on very large datasets.
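A minimal serial sketch of the ‘separate and conquer’ strategy in Python (a Prism-style covering learner of the family J-PMCRI parallelises; illustrative only, not the J-PMCRI implementation, and assuming consistent, non-contradictory training rows):

def induce_rules(rows, target_class):
    """Learn rules for target_class: grow one rule term by term until it is
    pure, then remove ('separate') the rows it covers and repeat ('conquer')
    on the remainder until no target-class rows are left."""
    rows = list(rows)
    rules = []
    while any(r["class"] == target_class for r in rows):
        covered, rule = rows, []
        while any(r["class"] != target_class for r in covered):
            # Candidate attribute-value terms present in the covered rows.
            terms = {(a, r[a]) for r in covered for a in r if a != "class"}
            def precision(term):
                a, v = term
                sub = [r for r in covered if r[a] == v]
                return sum(r["class"] == target_class for r in sub) / len(sub)
            # Specialise the rule with the most precise term.
            best = max(terms, key=precision)
            rule.append(best)
            covered = [r for r in covered if r[best[0]] == best[1]]
        rules.append(rule)
        rows = [r for r in rows if not all(r[a] == v for a, v in rule)]
    return rules

data = [{"outlook": "sunny", "windy": "no", "class": "play"},
        {"outlook": "sunny", "windy": "yes", "class": "stay"},
        {"outlook": "rain", "windy": "no", "class": "stay"}]
print(induce_rules(data, "play"))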