694 results for sticky-price DSGE models
Abstract:
Correlations between oil and agricultural commodities have varied over previous decades, affected by renewable fuel policy and turbulent economic conditions. We estimate smooth transition conditional correlation models for 12 agricultural commodities and WTI crude oil. While a structural change in correlations occurred concurrently with the introduction of biofuel policy, oil and food price levels are also key influences. High correlation between biofuel feedstocks and oil is more likely to occur when food and oil price levels are high. Correlation with oil returns is strong for biofuel feedstocks but not for other agricultural futures, suggesting limited contagion from energy to food markets.
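For readers unfamiliar with the method, a smooth transition conditional correlation (STCC) model of the kind estimated here can be sketched as follows (a generic textbook form; the choice of transition variable $s_t$, e.g. the oil or food price level, is an illustrative assumption):

$$
R_t = \bigl(1 - G(s_t)\bigr)\,R^{(1)} + G(s_t)\,R^{(2)}, \qquad
G(s_t) = \bigl(1 + e^{-\gamma (s_t - c)}\bigr)^{-1}, \quad \gamma > 0,
$$

where $R^{(1)}$ and $R^{(2)}$ are the correlation matrices in the two extreme regimes and the logistic function $G$ moves the conditional correlation smoothly between them as $s_t$ crosses the threshold $c$.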
Abstract:
Amid tough trading conditions and intense competition, Coles has fired the latest salvo in its ongoing supermarket war with Woolworths, announcing it will reduce the price of some fruit and vegetables by 50%. The move is the latest in a battle between the supermarket giants to wrest market share and follows previous cuts to staples such as milk, bread, beer and chicken. However, Australia's peak industry body for vegetable growers, Ausveg, is concerned about the impact the price decision will have on growers' livelihoods.
Abstract:
On January 26, 2011, grocery retailer Coles fired the first salvo in what would soon be dubbed the “supermarket price wars” by reducing the price of its own-brand milk to A$1 per litre. Woolworths immediately responded. In the three years since, grocery prices have been tumbling, with 85-cent bread being the latest “sacrificial lamb”. This period of intense competition has brought about not just lower grocery prices but also a Senate inquiry and increasing media and analyst interest.
Abstract:
Having the ability to work with complex models can be highly beneficial, but the computational cost of doing so is often large. Complex models often have intractable likelihoods, so methods that directly use the likelihood function are infeasible. In these situations, the benefits of working with likelihood-free methods become apparent. Likelihood-free methods, such as parametric Bayesian indirect likelihood, which uses the likelihood of an alternative parametric auxiliary model, have been explored throughout the literature as a good alternative when the model of interest is complex. One of these methods is the synthetic likelihood (SL), which assumes a multivariate normal approximation to the likelihood of a summary statistic of interest. This paper explores the accuracy and computational efficiency of the Bayesian version of the synthetic likelihood (BSL) approach in comparison to a competitor known as approximate Bayesian computation (ABC), as well as its sensitivity to tuning parameters and assumptions. We relate BSL to pseudo-marginal methods and propose an alternative SL that uses an unbiased estimator of the exact working normal likelihood when the summary statistic has a multivariate normal distribution. Several applications of varying complexity are considered to illustrate the findings of this paper.
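As a minimal illustration of the synthetic likelihood idea summarised above (a sketch only, not the authors' implementation; `simulate` and `summarise` are hypothetical user-supplied functions), the log synthetic likelihood at a parameter value can be estimated as:

```python
import numpy as np
from scipy.stats import multivariate_normal

def log_synthetic_likelihood(theta, s_obs, simulate, summarise, n_sim=100, seed=None):
    """Estimate log p(s_obs | theta) via a multivariate normal fit to simulated summaries."""
    rng = np.random.default_rng(seed)
    # Simulate n_sim datasets under theta and reduce each to its summary statistic.
    summaries = np.array([summarise(simulate(theta, rng)) for _ in range(n_sim)])
    mu = summaries.mean(axis=0)              # sample mean of the summaries
    sigma = np.cov(summaries, rowvar=False)  # sample covariance of the summaries
    # Evaluate the fitted normal log-density at the observed summary statistic.
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=sigma)
```

In BSL this noisy estimate stands in for the intractable likelihood inside an otherwise standard MCMC sampler, which is what connects the method to the pseudo-marginal framework mentioned above; the unbiased variant replaces the plug-in normal density with an unbiased estimator of it.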
Abstract:
Improving the availability, accessibility and affordability of healthy food equitably is fundamental to improving nutrition and health. While theoretical models abound, in real-world complex systems there are rarely opportunities to address leverage points systematically to improve the food supply. This presentation describes efforts over the last 30 years to do just that in remote Australian Aboriginal communities, where a single community store is usually the major dietary source. Areas addressed include store governance and infrastructure, wholesale supply, transport and pricing policies, including cross-subsidization. However, while there have been dramatic improvements in the availability, quality and price of fruit, vegetables and most other healthy foods over this time, the proportion of communities' energy intake from energy-dense, nutrient-poor foods and drinks has increased. One cause may be the disproportionate increase in the supply of unhealthy choices in terms of variety and shelf space, consistent with changes in the food supply in broader Australia. The impact of changing social and environmental factors, food preferences and price elasticity will also be explored briefly. Clearly much more needs to be done to reduce the high prevalence of diet-related chronic disease in some vulnerable groups. In particular, efforts to continually improve the availability and affordability of healthy food also need to address the predominance of unhealthy choices in the food supply.
Abstract:
This presentation outlines recent achievements in the development of tools, protocols and methods to monitor and benchmark food prices and affordability globally under the International Network for Food and Obesity/non-communicable diseases Research, Monitoring and Action Support (INFORMAS).
Abstract:
In a very recent study [1] the Renormalisation Group (RNG) turbulence model was used to obtain flow predictions in a strongly swirling quarl burner, and was found to perform well in predicting certain features that are not well captured by less sophisticated turbulence models. The implication is that the RNG approach should provide an economical and reliable tool for predicting swirling flows in combustor and furnace geometries commonly encountered in technological applications. To test this hypothesis, the present work considers flow in a model furnace for which experimental data are available [2]. The essential features that differentiate this flow from the previous study [1] are that the annular air-jet entry is relatively narrow and the base wall of the cylindrical furnace is at 90 degrees to the inlet pipe. For swirl numbers of order 1 the resulting flow is highly complex, with significant inner and outer recirculation regions. The RNG and standard k-epsilon models are used to model the flow for both swirling and non-swirling entry jets, and the results are compared with experimental data [2]. Near-wall viscous effects are accounted for in both models via the standard wall-function formulation [3]. For the RNG model, additional computations with grid placement extending well inside the near-wall viscous-affected sublayer are performed in order to assess the low-Reynolds-number capabilities of the model.
Abstract:
In this work we numerically model isothermal turbulent swirling flow in a cylindrical burner. Three versions of the RNG k-epsilon model are assessed against the performance of the standard k-epsilon model. Sensitivity of the numerical predictions to grid refinement, to different convective differencing schemes and to the choice of (unknown) inlet dissipation rate was closely scrutinised to ensure accuracy. Particular attention is paid to modelling the inlet conditions to within the range of uncertainty of the experimental data, as model predictions proved to be significantly sensitive to relatively small changes in upstream flow conditions. We also examine the characteristics of the swirl-induced recirculation zone (IRZ) predicted by the models over an extended range of inlet conditions. Our main findings are: (i) the standard k-epsilon model performed best compared with experiment; (ii) no one inlet specification can simultaneously optimise the performance of the models considered; (iii) the RNG models predict both single-cell and double-cell IRZ characteristics, the latter both with and without additional internal stagnation points. The first finding indicates that the examined RNG modifications to the standard k-epsilon model do not result in an improved eddy-viscosity-based model for the prediction of swirl flows. The second finding suggests that tuning established models a priori for optimal performance in swirl flows is not straightforward. The third finding indicates that the RNG-based models exhibit a greater variety of structural behaviour, despite being of the same level of complexity as the standard k-epsilon model. The plausibility of the predicted IRZ features is discussed in terms of known vortex breakdown phenomena.
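For context, both the standard and RNG k-epsilon models are eddy-viscosity closures built around the same relation (a standard textbook sketch, not the specific variants assessed here):

$$
\nu_t = C_\mu \frac{k^2}{\varepsilon},
$$

with transport equations solved for the turbulent kinetic energy $k$ and its dissipation rate $\varepsilon$. The RNG variants differ mainly through an additional strain-dependent term in the $\varepsilon$-equation and slightly different model constants, which is what produces the different recirculation-zone structures discussed above.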
Abstract:
Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
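For reference, the Heston dynamics referred to above are the coupled SDEs $dS_t = \mu S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^S$ and $dv_t = \kappa(\theta - v_t)\,dt + \xi\sqrt{v_t}\,dW_t^v$ with correlated Brownian motions. A minimal full-truncation Euler simulation of sample paths (an illustrative sketch, not the paper's OpenCL particle-filter code) looks like this:

```python
import numpy as np

def heston_paths(s0, v0, mu, kappa, theta, xi, rho, T, n_steps, n_paths, seed=None):
    """Simulate Heston price/variance paths with a full-truncation Euler scheme."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, float(s0))   # asset price for each path
    v = np.full(n_paths, float(v0))   # variance for each path
    for _ in range(n_steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)    # truncate negative variance before using it
        s *= np.exp((mu - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return s, v
```

Each trajectory evolves independently of the others, which is the property that makes this kind of particle workload map naturally onto GPUs, MIC accelerators and other massively parallel hardware.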
Abstract:
Despite positive testing in animal studies, more than 80% of novel drug candidates fail to prove their efficacy when tested in humans. This is primarily due to the use of preclinical models that are not able to recapitulate the physiological or pathological processes in humans. Hence, one of the key challenges in the field of translational medicine is to “make the model organism mouse more human.” To get answers to questions that would be prognostic of outcomes in human medicine, the mouse's genome can be altered in order to create a more permissive host that allows the engraftment of human cell systems. It has been shown in the past that these strategies can improve our understanding of tumor immunology. However, the translational benefits of these platforms have still to be proven. In the 21st century, several research groups and consortia around the world have taken up the challenge of improving our understanding of how to humanize the animal's genetic code, its cells and, based on tissue engineering principles, its extracellular microenvironment, its tissues, or entire organs, with the ultimate goal of fostering the translation of new therapeutic strategies from bench to bedside. This article provides an overview of the state of the art of humanized models of tumor immunology and highlights future developments in the field, such as the application of tissue engineering and regenerative medicine strategies to further enhance humanized murine model systems.
Multi-GNSS precise point positioning with raw single-frequency and dual-frequency measurement models
Abstract:
The emergence of multiple satellite navigation systems, including BDS, Galileo, modernized GPS, and GLONASS, brings great opportunities and challenges for precise point positioning (PPP). We study the contributions of various GNSS combinations to PPP performance based on undifferenced or raw observations, in which the signal delays and ionospheric delays must be considered. A priori ionospheric knowledge, such as regional or global corrections, strengthens the estimation of ionospheric delay parameters. The undifferenced models are generally more suitable for single-, dual-, or multi-frequency data processing for single or combined GNSS constellations. Another advantage over ionospheric-free PPP models is that undifferenced models avoid noise amplification by linear combinations. Extensive performance evaluations are conducted with multi-GNSS data sets collected from 105 MGEX stations in July 2014. Dual-frequency PPP results from each single constellation show that the convergence time of the undifferenced PPP solution is usually shorter than that of ionospheric-free PPP solutions, while the positioning accuracy of undifferenced PPP shows more improvement for the GLONASS system. In addition, the GLONASS undifferenced PPP results demonstrate performance advantages in high latitude areas, while this impact is less obvious in the GPS/GLONASS combined configuration. The results have also indicated that the BDS GEO satellites have negative impacts on the undifferenced PPP performance given the current “poor” orbit and clock knowledge of GEO satellites. More generally, the multi-GNSS undifferenced PPP results have shown improvements in convergence time of more than 60% in both the single- and dual-frequency PPP results, while the positioning accuracy after convergence shows no significant improvement for the dual-frequency PPP solutions, but an improvement of about 25% on average for the single-frequency PPP solutions.
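For orientation, the raw (undifferenced and uncombined) observation model that this kind of PPP builds on is usually written, per receiver r, satellite s and frequency j, roughly as follows (a generic sketch; symbols, bias handling and sign conventions vary between formulations):

$$
\begin{aligned}
P_{r,j}^{s} &= \rho_r^{s} + c\,(dt_r - dt^{s}) + T_r^{s} + \mu_j I_r^{s} + b_{r,j} - b_j^{s} + \varepsilon_{P},\\
L_{r,j}^{s} &= \rho_r^{s} + c\,(dt_r - dt^{s}) + T_r^{s} - \mu_j I_r^{s} + \lambda_j N_{r,j}^{s} + \varepsilon_{L},
\end{aligned}
$$

where $\rho$ is the geometric range, $dt_r$ and $dt^s$ the receiver and satellite clock offsets, $T$ the tropospheric delay, $I$ the slant ionospheric delay on the first frequency with $\mu_j = f_1^2/f_j^2$, $b$ the code biases, $\lambda_j N$ the carrier-phase ambiguity and $\varepsilon$ the measurement noise. Estimating $I$ explicitly, optionally constrained by a priori regional or global ionospheric corrections, is what lets the undifferenced approach avoid the noise amplification of the ionosphere-free combination.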
Genetic analysis of structural brain connectivity using DICCCOL models of diffusion MRI in 522 twins
Abstract:
Genetic and environmental factors affect white matter connectivity in the normal brain, and they also influence diseases in which brain connectivity is altered. Little is known about genetic influences on brain connectivity, despite wide variations in the brain's neural pathways. Here we applied the 'DICCCOL' framework to analyze structural connectivity in 261 twin pairs (522 participants, mean age 21.8 ± 2.7 years). We encoded connectivity patterns by projecting the white matter (WM) bundles of all 'DICCCOLs' as a tracemap (TM). Next we fitted an A/C/E structural equation model to estimate additive genetic (A), common environmental (C), and unique environmental/error (E) components of the observed variations in brain connectivity. We found 44 'heritable DICCCOLs' whose connectivity was genetically influenced (a² > 1%); half of them showed significant heritability (a² > 20%). Our analysis of genetic influences on WM structural connectivity suggests high heritability for some WM projection patterns, yielding new targets for genome-wide association studies.
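The A/C/E model mentioned above decomposes the (standardised) variance of each connectivity measure using the standard twin-design identities (included here for clarity, not taken from the paper):

$$
\sigma^2_P = a^2 + c^2 + e^2, \qquad r_{\mathrm{MZ}} = a^2 + c^2, \qquad r_{\mathrm{DZ}} = \tfrac{1}{2}a^2 + c^2,
$$

so the additive genetic component $a^2$ (the heritability thresholded at 1% and 20% in the abstract) is identified by the excess similarity of monozygotic over dizygotic twin pairs.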
Abstract:
In the world today there are many ways in which we measure, count and determine whether something is worth the effort or not. In Australia and many other countries, new government legislation is requiring government-funded entities to become more transparent in their practice and to develop a more cohesive narrative about their worth, or impact, for the betterment of society. This places the executives of such entities in a position of needing evaluative thinking and practice to guide how they may build the narrative that documents and demonstrates this type of impact. In thinking about where to start, executives, project and program managers may consider this workshop as a professional development opportunity to explore both the intended and unintended consequences of performance models as tools of evaluation. This workshop will offer participants an opportunity to unpack the place of performance models as an evaluative tool through the following:
· What shape does an ethical, sound and valid performance measure for an organization or personnel take?
· What role does cultural specificity play in the design and development of a performance model for an organization or for personnel?
· How are stakeholders able to identify risk during the design and development of such models?
· When and where will dissemination strategies be required?
· And so what? How can you determine that your performance model implementation has made a difference now or in the future?
Abstract:
Improved forecasting of urban rail patronage is essential for effective policy development and efficient planning of new rail infrastructure. Past modelling and forecasting of urban rail patronage has been based on legacy modelling approaches and often conducted at the general level of public transport demand, rather than being specific to urban rail. This project canvassed current Australian practice and international best practice to develop and estimate time series and cross-sectional models of rail patronage for Australian mainland state capital cities. This involved a large online survey of rail riders and non-riders in each of the state capital cities, resulting in a comprehensive database of respondent socio-economic profiles, travel experience, attitudes to rail and other modes of travel, together with stated preference responses to a wide range of urban travel scenarios. Estimation of the models demonstrated their ability to provide information on the major influences on the urban rail travel decision. Rail fares, congestion and rail service supply all have a strong influence on rail patronage, while a number of less significant factors, such as fuel price and access to a motor vehicle, are also influential. Of note, too, is the relative homogeneity of rail user profiles across the state capitals. Rail users tend to have higher incomes and education levels, and are younger and more likely to be in full-time employment than non-rail users. The project analysis reported here represents only a small proportion of what could be accomplished using the survey database. More comprehensive investigation was beyond the scope of the project and has been left for future work.
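Stated preference data of this kind are commonly analysed with a discrete mode-choice model; a generic multinomial logit sketch (an illustrative assumption, not necessarily the specification estimated in this project) is:

$$
P(\text{rail}) = \frac{e^{V_{\text{rail}}}}{\sum_{m} e^{V_{m}}}, \qquad
V_{m} = \beta_{\text{fare}}\,\text{fare}_m + \beta_{\text{time}}\,\text{time}_m + \beta_{\text{freq}}\,\text{frequency}_m + \dots,
$$

where the utilities $V_m$ of the competing modes are linear in attributes such as fare, travel time and service frequency, and the estimated coefficients quantify influences on the rail travel decision of the kind noted above.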
Abstract:
This paper addresses the problem of discovering business process models from event logs. Existing approaches to this problem strike various tradeoffs between accuracy and understandability of the discovered models. With respect to the second criterion, empirical studies have shown that block-structured process models are generally more understandable and less error-prone than unstructured ones. Accordingly, several automated process discovery methods generate block-structured models by construction. These approaches, however, intertwine the concern of producing accurate models with that of ensuring their structuredness, sometimes sacrificing the former to ensure the latter. In this paper we propose an alternative approach that separates these two concerns. Instead of directly discovering a structured process model, we first apply a well-known heuristic technique that discovers more accurate but sometimes unstructured (and even unsound) process models, and then transform the resulting model into a structured one. An experimental evaluation shows that our “discover and structure” approach outperforms traditional “discover structured” approaches with respect to a range of accuracy and complexity measures.