53 results for R-Statistical computing


Relevance: 30.00%

Publisher:

Abstract:

Cloud computing is usually regarded as energy efficient, and thus as emitting less greenhouse gas (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud-based Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, however, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into account energy consumption at the three main stages of data transmission: data center, network, and end-user device. Comparable products from each suite were selected, and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the network and end-user device stages were measured directly. A new measurement and software-apportionment approach was defined and utilized, allowing the power consumption of cloud services to be measured directly at the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts: the power consumption of the cloud-based Outlook and Excel was 8% and 17% lower, respectively, than their traditional equivalents. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG emissions. Direct conversion from the standalone package to the cloud platform can now consider energy and GHG emissions at the software development and cloud service design stage using the methods described in this research.
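The software-apportionment idea can be illustrated with a minimal sketch. The function names, the even split of idle power, and the grid emission factor below are assumptions for illustration, not the paper's actual model:

```python
def apportion_power(total_watts, idle_watts, cpu_shares):
    """Split a device's measured power draw among running applications.

    Dynamic power (total minus idle) is apportioned by each application's
    share of active CPU time; idle power is split evenly. This is a
    hypothetical scheme sketching the apportionment idea, not the
    paper's exact method.
    """
    dynamic = total_watts - idle_watts
    idle_each = idle_watts / len(cpu_shares)
    return {app: idle_each + dynamic * share
            for app, share in cpu_shares.items()}

def ghg_kg(energy_kwh, grid_factor=0.5):
    """GHG emissions in kg CO2e for a given energy use.

    grid_factor is an assumed emission factor (kg CO2e per kWh);
    real factors vary by country and year.
    """
    return energy_kwh * grid_factor
```

Measuring the device at idle and while each activity runs then yields a per-application power figure, which an emission factor converts into GHG.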

Relevance: 30.00%

Publisher:

Abstract:

In this study, the authors discuss the effective use of technology to solve the problem of deciding on journey start times under recurrent traffic conditions. The developed algorithm guides vehicles onto more reliable routes that are less prone to congestion or travel delays, ensures that the start time is as late as possible so the traveller does not wait too long at the destination, and attempts to minimise travel time. Experiments show that, to be more certain of reaching the destination on time, a traveller has to leave early and correspondingly arrives early, resulting in a long waiting time. The application developed here asks the user to set this certainty factor according to the task in hand, and computes the best start time and route.
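The certainty-driven trade-off described above can be sketched with an empirical travel-time quantile. This is a simplified illustration, not the authors' algorithm; the function names and route data are hypothetical:

```python
def latest_start_time(deadline, travel_samples, certainty):
    """Latest departure time (in minutes) that still meets `deadline`
    with probability about `certainty`, given past travel times."""
    xs = sorted(travel_samples)
    # Empirical quantile: the smallest travel-time budget that covers
    # the requested fraction of observed journeys.
    k = min(len(xs) - 1, int(certainty * len(xs)))
    return deadline - xs[k]

def best_route(deadline, routes, certainty):
    """Pick the route that allows the latest start at the chosen certainty."""
    starts = {name: latest_start_time(deadline, samples, certainty)
              for name, samples in routes.items()}
    name = max(starts, key=starts.get)
    return name, starts[name]
```

For a 9:00 deadline (minute 540), a route that is usually fast but occasionally very slow wins at low certainty, while a steadier route allows a later start once the user demands high certainty, mirroring the waiting-time trade-off in the abstract.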

Relevance: 30.00%

Publisher:

Abstract:

Regional climate downscaling has arrived at an important juncture. Some in the research community favour continued refinement and evaluation of downscaling techniques within a broader framework of uncertainty characterisation and reduction. Others are calling for smarter use of downscaling tools, accepting that conventional, scenario-led strategies for adaptation planning have limited utility in practice. This paper sets out the rationale and new functionality of the Decision Centric (DC) version of the Statistical DownScaling Model (SDSM-DC). This tool enables synthesis of plausible daily weather series, exotic variables (such as tidal surge), and climate change scenarios guided, not determined, by climate model output. Two worked examples are presented. The first shows how SDSM-DC can be used to reconstruct and in-fill missing records based on calibrated predictor-predictand relationships. Daily temperature and precipitation series from sites in Africa, Asia and North America are deliberately degraded to show that SDSM-DC can reconstitute lost data. The second demonstrates the application of the new scenario generator for stress testing a specific adaptation decision. SDSM-DC is used to generate daily precipitation scenarios to simulate winter flooding in the Boyne catchment, Ireland. This sensitivity analysis reveals the conditions under which existing precautionary allowances for climate change might be insufficient. We conclude by discussing the wider implications of the proposed approach and research opportunities presented by the new tool.
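The in-filling step can be illustrated with a toy single-predictor version. SDSM itself calibrates multi-predictor regression models with stochastic components; the helper names and data here are hypothetical:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def infill(predictor, predictand):
    """Fill None gaps in `predictand` using a regression calibrated on
    the days where both series are observed."""
    observed = [(x, y) for x, y in zip(predictor, predictand) if y is not None]
    a, b = fit_line([x for x, _ in observed], [y for _, y in observed])
    return [y if y is not None else a + b * x
            for x, y in zip(predictor, predictand)]
```

Deliberately deleting values from a complete record, re-filling them this way, and comparing against the originals is the kind of degradation test the abstract describes.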

Relevance: 30.00%

Publisher:

Abstract:

A new frontier in weather forecasting is emerging, with operational forecast models now being run at convection-permitting resolutions at many national weather services. However, this is not a panacea: significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluating and improving such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in rainfall from the UK radar network, and compared them statistically with storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal size of simulated convective storms, and of the updrafts within them, is much too large at 1.5-km grid length, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid length, as does the typical storm lifetime. The 200-m grid-length simulation with the standard mixing length performs best across all diagnostics, although a greater mixing length improves the representation of deep convective storms.
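Why oversized simulated storms inflate per-updraft mass flux follows from the bulk formula M = ρwA for a single updraft: the area grows with the square of the updraft radius, so a modest size bias compounds quickly. The density value and the numbers below are illustrative assumptions, not DYMECS results:

```python
import math

def updraft_mass_flux(w, radius, rho=1.0):
    """Bulk convective mass flux of one updraft, M = rho * w * pi * r^2
    (kg/s for w in m/s and radius in m; rho = 1 kg/m^3 is a rounded
    near-cloud-base air density, assumed for illustration)."""
    return rho * w * math.pi * radius ** 2

# An updraft simulated 3x too wide carries 9x the mass flux at the same
# vertical velocity; combine that with a doubled updraft speed and the
# per-updraft flux is off by roughly an order of magnitude.
```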

Relevance: 30.00%

Publisher:

Abstract:

The chapter examines how far medieval economic crises can be identified by analysing the residuals from a simultaneous equation model of the medieval English economy. High inflation, falls in gross domestic product and large intermittent changes in wage rates are all considered as potential indicators of crisis. Potential causal factors include bad harvests, wars and political instability. The chapter suggests that crises arose when a combination of different problems overwhelmed the capacity of government to address them. It may therefore be a mistake to look for a single cause of any crisis. The coincidence of separate problems is a more plausible explanation of many crises.
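The residual-based idea, with a crisis read as the coincidence of several unusually large shocks rather than one, can be sketched as follows. The residual series below are invented for illustration; the chapter's actual model is a simultaneous equation system:

```python
def zscores(series):
    """Standardize a residual series to zero mean and unit variance."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((v - mean) ** 2 for v in series) / n) ** 0.5
    return [(v - mean) / sd for v in series]

def crisis_years(years, residuals, z_thresh=1.5, min_hits=2):
    """Flag years in which at least `min_hits` indicator series show an
    unusually large model residual - a 'coincidence of separate problems'.

    `residuals` maps an indicator name (e.g. inflation, GDP, wages) to a
    residual series aligned with `years`.
    """
    zs = {name: zscores(vals) for name, vals in residuals.items()}
    flagged = []
    for i, year in enumerate(years):
        hits = sum(1 for z in zs.values() if abs(z[i]) > z_thresh)
        if hits >= min_hits:
            flagged.append(year)
    return flagged
```

Raising `min_hits` from 1 to 2 is the operational version of the chapter's claim: single-indicator shocks are common, but years where several indicators break down together are the plausible crises.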

Relevance: 30.00%

Publisher:

Abstract:

We present the first multi-event study of the spatial and temporal structuring of the aurora to provide statistical evidence of the near-Earth plasma instability which causes the substorm onset arc. Using data from ground-based auroral imagers, we study repeatable signatures of along-arc auroral beads, which are thought to represent the ionospheric projection of magnetospheric instability in the near-Earth plasma sheet. We show that the growth and spatial scales of these wave-like fluctuations are similar across multiple events, indicating that each sudden auroral brightening has a common explanation. We find statistically that growth rates for auroral beads peak at low wavenumber, with the most unstable spatial scales mapping to an azimuthal wavelength λ ≈ 1700–2500 km in the equatorial magnetosphere at around 9–12 RE. We compare growth rates and spatial scales with a range of theoretical predictions of magnetotail instabilities, including the cross-field current instability and the shear-flow ballooning instability. We conclude that, although the cross-field current instability can generate growth rates of similar magnitude, the range of unstable wavenumbers indicates that the shear-flow ballooning instability is the most likely explanation for our observations.
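In sketch form, estimating a growth rate per spatial scale reduces to fitting exponential growth to the amplitude of each along-arc mode over time. The helper names and the synthetic amplitudes below are hypothetical, not the study's processing chain:

```python
import math

def growth_rate(times, amplitudes):
    """Exponential growth rate gamma, assuming A(t) = A0 * exp(gamma * t),
    fitted by least squares on log-amplitude."""
    logs = [math.log(a) for a in amplitudes]
    n = len(times)
    mt, ml = sum(times) / n, sum(logs) / n
    return (sum((t - mt) * (l - ml) for t, l in zip(times, logs))
            / sum((t - mt) ** 2 for t in times))

def most_unstable_scale(times, spectra):
    """`spectra` maps a wavenumber (1/km) to its amplitude time series;
    return the wavenumber whose fitted growth rate is largest."""
    rates = {k: growth_rate(times, a) for k, a in spectra.items()}
    best = max(rates, key=rates.get)
    return best, rates[best]
```

With wavelength λ = 2π/k, the most unstable wavenumbers near k ≈ 0.0025–0.0037 per km correspond to the λ ≈ 1700–2500 km range quoted in the abstract.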

Relevance: 30.00%

Publisher:

Abstract:

Models for which the likelihood function can be evaluated only up to a parameter-dependent unknown normalizing constant, such as Markov random field models, are used widely in computer science, statistical physics, spatial statistics, and network analysis. However, Bayesian analysis of these models using standard Monte Carlo methods is not possible due to the intractability of their likelihood functions. Several methods that permit exact, or close to exact, simulation from the posterior distribution have recently been developed. However, estimating the evidence and Bayes factors for these models remains challenging in general. This paper describes new random-weight importance sampling and sequential Monte Carlo methods for estimating Bayes factors that use simulation to circumvent evaluation of the intractable likelihood, and compares them to existing methods. An initial investigation into the theoretical and empirical properties of this class of methods is presented; it offers some support for the use of biased weight estimates, which in some cases show an advantage, although we advocate caution in their use.
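The importance-sampling identity underlying such evidence estimators can be shown on a toy conjugate-normal model where the true marginal likelihood is known in closed form. This sketch uses exact weights on a tractable likelihood; the paper's contribution is replacing the weights with simulation-based estimates when the likelihood is intractable. A Bayes factor is then the ratio of two such evidence estimates:

```python
import math
import random

def log_norm_pdf(x, mean, var):
    """Log density of a normal distribution N(mean, var)."""
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

def log_evidence_is(log_joint, draw, log_q, n=20000, seed=1):
    """Plain importance-sampling estimate of the log marginal likelihood:
    Z ~= (1/n) * sum_i exp(log_joint(theta_i) - log_q(theta_i)),
    with theta_i drawn from the proposal q."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        th = draw(rng)
        total += math.exp(log_joint(th) - log_q(th))
    return math.log(total / n)

# Toy model: theta ~ N(0, 1); one observation y = 0 with y | theta ~ N(theta, 1).
# The exact evidence is N(y=0; 0, 2), i.e. log Z = log(1 / sqrt(4 * pi)).
log_joint = lambda th: log_norm_pdf(th, 0.0, 1.0) + log_norm_pdf(0.0, th, 1.0)
estimate = log_evidence_is(log_joint,
                           draw=lambda rng: rng.gauss(0.0, 1.0),
                           log_q=lambda th: log_norm_pdf(th, 0.0, 1.0))
```

For a doubly-intractable model the weight `exp(log_joint - log_q)` cannot be evaluated, which is where the random-weight (possibly biased) estimates discussed in the abstract enter.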

Relevance: 30.00%

Publisher:

Abstract:

We extend the method of Cassels for computing the Cassels-Tate pairing on the 2-Selmer group of an elliptic curve, to the case of 3-Selmer groups. This requires significant modifications to both the local and global parts of the calculation. Our method is practical in sufficiently small examples, and can be used to improve the upper bound for the rank of an elliptic curve obtained by 3-descent.