982 results for Applied Computing
Abstract:
A technique for subtyping Campylobacter jejuni isolates has been developed by using the restriction fragment length polymorphism (RFLP) of polymerase chain reaction (PCR) products of the flaA and flaB genes. The technique was validated by using strains representing 28 serotypes of C. jejuni, and it may also be applied to C. coli. From these strains 12 distinct RFLP profiles were observed, but there was no direct relationship between the RFLP profile and the serotype. One hundred and thirty-five campylobacter isolates from 15 geographically distinct broiler flocks were investigated. All the isolates could be subtyped by using the RFLP method. Isolates from most of the flocks had a single RFLP profile despite data indicating that several serotypes were involved. Although it is possible that further restriction analysis may have demonstrated profile variations in these strains, it is more likely that antigenic variation can occur within genotypically related campylobacters. As a result, serotyping may give conflicting information for veterinary epidemiological purposes. This RFLP typing scheme appears to provide a suitable tool for the investigation of the sources and routes of transmission of campylobacters in chickens.
Abstract:
Pocket Data Mining (PDM) is our new term describing collaborative mining of streaming data in mobile and distributed computing environments. With sheer amounts of data streams now available for subscription on our smart mobile phones, using this data for decision making with data stream mining techniques has become achievable owing to the increasing power of these handheld devices. Wireless communication among these devices using Bluetooth and WiFi technologies has opened the door wide for collaborative mining among mobile devices within the same range that are running data mining techniques targeting the same application. This paper proposes a new architecture that we have prototyped for realizing significant applications in this area. We propose using mobile software agents in this application for several reasons. Most importantly, the autonomic, intelligent behaviour of agent technology has been the driving force for using it in this application. Other efficiency reasons are discussed in detail in this paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.
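A minimal sketch of the collaborative idea described above, with invented class and function names (the paper's actual agent framework is not reproduced here): each device runs its own stream classifier, and devices currently within Bluetooth/WiFi range pool their votes on the same record.

```python
from collections import Counter

class MobileAgent:
    """One handheld device running its own stream classifier (toy sketch;
    the names here are invented, not the paper's API)."""
    def __init__(self, name, classify):
        self.name = name
        self.classify = classify            # callable: record -> label

    def local_prediction(self, record):
        return self.classify(record)

def collaborative_predict(agents_in_range, record):
    """Majority vote over all agents currently reachable over Bluetooth/WiFi,
    standing in for the collaborative mining step among co-located devices."""
    votes = Counter(agent.local_prediction(record) for agent in agents_in_range)
    return votes.most_common(1)[0][0]

# Toy usage: three phones apply different thresholds to the same stream record.
agents = [MobileAgent(f"phone{i}", lambda x, t=t: x > t)
          for i, t in enumerate((0.3, 0.5, 0.7))]
print(collaborative_predict(agents, 0.6))   # True (two of three vote True)
```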
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories, an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.
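A toy sketch of the "move the analysis to the data" pattern the abstract describes, using invented names and an in-memory stand-in for a grid job; only small summary results travel back to the caller.

```python
class Repository:
    """Stand-in for a primary P-found installation (names invented)."""
    def __init__(self, name, simulations):
        self.name = name
        self.simulations = simulations      # locally stored simulation data sets

    def run_locally(self, analysis_fn):
        # In the real system this would be a grid job executed at the site;
        # here we simply apply the function where the data "lives".
        return [analysis_fn(sim) for sim in self.simulations]

def analyse_all(repositories, analysis_fn):
    """Fan the analysis out to every distributed installation and aggregate
    only the lightweight results, so the bulk data never moves."""
    return {repo.name: repo.run_locally(analysis_fn) for repo in repositories}

# Toy usage: count "frames" per simulation at two sites.
sites = [Repository("site-A", [list(range(10)), list(range(7))]),
         Repository("site-B", [list(range(12))])]
print(analyse_all(sites, len))              # {'site-A': [10, 7], 'site-B': [12]}
```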
Abstract:
This paper examines the implications of using marketing margins in applied commodity price analysis. The marketing-margin concept has a long and distinguished history, but it has caused considerable controversy. This is particularly the case in the context of analyzing the distribution of research gains in multi-stage production systems. We derive optimal tax schemes for raising revenues to finance research and promotion in a downstream market, derive the rules for efficient allocation of the funds, and compare the rules with and without the marketing-margin assumption. Applying the methodology to quarterly time series on the Australian beef-cattle sector, we conclude, with several caveats, that during the period 1978:2 - 1988:4 the Australian Meat and Livestock Corporation optimally allocated research resources.
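For readers unfamiliar with the term, a definitional sketch of the marketing margin in absolute and proportional form (this is only the textbook definition, not the paper's optimal tax or allocation rules, which are derived in the text itself):

```latex
% Marketing margin between the retail and farm-gate stages (definitional only).
M_t = P^{\mathrm{retail}}_t - P^{\mathrm{farm}}_t,
\qquad
m_t = \frac{P^{\mathrm{retail}}_t - P^{\mathrm{farm}}_t}{P^{\mathrm{retail}}_t}.
```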
Abstract:
This study presents a model intercomparison of four regional climate models (RCMs) and one variable-resolution atmospheric general circulation model (AGCM) applied over Europe, with special focus on the hydrological cycle and the surface energy budget. The models simulated the 15 years from 1979 to 1993 using quasi-observed boundary conditions derived from ECMWF re-analyses (ERA). The model intercomparison focuses on two large catchments representing two different climate conditions and covering two areas of major research interest within Europe. The first is the Danube catchment, which represents a continental climate dominated by advection from the surrounding land areas. It is used to analyse the common model error of a too dry and too warm simulation of the summertime climate of southeastern Europe. This summer warming and drying problem is seen in many RCMs, and to a lesser extent in GCMs. The second area is the Baltic Sea catchment, which represents a maritime climate dominated by advection from the ocean and from the Baltic Sea. This catchment is a research area of many studies within Europe and is also covered by the BALTEX program. The observed data used are monthly mean surface air temperature, precipitation and river discharge. For all models, these are used to estimate mean monthly biases of all components of the hydrological cycle over land. In addition, the mean monthly deviations of the surface energy fluxes from ERA data are computed. Atmospheric moisture fluxes from ERA are compared with those of one model to provide an independent estimate of the convergence bias derived from the observed data. These help to add weight to some of the inferred estimates and explain some of the discrepancies between them. An evaluation of these biases and deviations suggests possible sources of error in each of the models. For the Danube catchment, systematic errors in the dynamics cause the prominent summer drying problem for three of the RCMs, while for the fourth RCM it is related to deficiencies in the land surface parametrization. The AGCM does not show this drying problem. For the Baltic Sea catchment, all models similarly overestimate the precipitation throughout the year except during the summer. This model deficit is probably caused by the internal model parametrizations, such as the large-scale condensation and the convection schemes.
Intercomparison of water and energy budgets simulated by regional climate models applied over Europe
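A minimal sketch of the kind of evaluation described above: the mean monthly bias (model minus observation) of a catchment-averaged daily series. Names and data are invented; this is not the study's evaluation code.

```python
import numpy as np
import pandas as pd

def mean_monthly_bias(model: pd.Series, observed: pd.Series) -> pd.Series:
    """Mean monthly bias (model minus observation) of a catchment-averaged
    variable such as precipitation, computed from daily time series."""
    bias = (model - observed).dropna()
    return bias.groupby(bias.index.month).mean()

# Toy usage with synthetic daily precipitation for one year over a catchment.
days = pd.date_range("1990-01-01", "1990-12-31", freq="D")
obs = pd.Series(2.0 + np.random.default_rng(0).gamma(2.0, 1.0, len(days)), index=days)
mod = obs * 1.15                              # a model that rains 15% too much
print(mean_monthly_bias(mod, obs).round(2))   # positive bias in every month
```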
Abstract:
Real estate depreciation continues to be a critical issue for investors and the appraisal profession in the UK in the 1990s. Depreciation-sensitive cash flow models have been developed, but there is a real need to develop further empirical methodologies to determine rental depreciation rates for input into these models. Although building quality has been found to be an important explanatory variable in depreciation it is very difficult to incorporate it into such models or to analyse it retrospectively. It is essential to examine previous depreciation research from real estate and economics in the USA and UK to understand the issues in constructing a valid and pragmatic way of calculating rental depreciation. Distinguishing between 'depreciation' and 'obsolescence' is important, and the pattern of depreciation in any study can be influenced by such factors as the type (longitudinal or cross-sectional) and timing of the study, and the market state. Longitudinal studies can analyse change more directly than cross-sectional studies. Any methodology for calculating rental depreciation rates should be formulated in the context of such issues as 'censored sample bias', 'lemons' and 'filtering', which have been highlighted in key US literature from the field of economic depreciation. Property depreciation studies in the UK have tended to overlook this literature, however. Although data limitations and constraints reduce the ability of empirical property depreciation work in the UK to consider these issues fully, 'averaging' techniques and ordinary least squares (OLS) regression can both provide a consistent way of calculating rental depreciation rates within a 'cohort' framework.
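As a hedged illustration of the OLS route mentioned at the end of the abstract, one can regress log rent on building age within a cohort and read the age coefficient as an average annual rental depreciation rate. The numbers below are made up for illustration.

```python
import numpy as np

# Hypothetical cohort sample: building age in years and rent per square metre.
age = np.array([0, 5, 10, 15, 20, 25, 30], dtype=float)
rent = np.array([100, 91, 84, 77, 70, 65, 60], dtype=float)

# OLS of log rent on age: log(rent) = a + b * age + error.
X = np.column_stack([np.ones_like(age), age])
coef, *_ = np.linalg.lstsq(X, np.log(rent), rcond=None)

# The slope translates into an average annual rental depreciation rate.
annual_depreciation = 1.0 - np.exp(coef[1])
print(f"estimated rental depreciation: {annual_depreciation:.1%} per year")
```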
Abstract:
The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions.
Abstract:
Climate is one of the main factors controlling winegrape production. Bioclimatic indices describing the suitability of a particular region for wine production are a widely used zoning tool. Seven suitable bioclimatic indices characterize regions in Europe with different viticultural suitability, and their possible geographical shifts under future climate conditions are addressed using regional climate model simulations. The indices are calculated from climatic variables (daily values of temperature and precipitation) obtained from transient ensemble simulations with the regional model COSMO-CLM. Index maps for recent decades (1960–2000) and for the 21st century (following the IPCC-SRES B1 and A1B scenarios) are compared. Results show that climate change is projected to have a significant effect on European viticultural geography. Detrimental impacts on winegrowing are predicted in southern Europe, mainly due to increased dryness and cumulative thermal effects during the growing season. These changes represent an important constraint to grapevine growth and development, making adaptation strategies crucial, such as changing varieties or introducing water supply by irrigation. Conversely, in western and central Europe, projected future changes will benefit not only wine quality, but might also demarcate new potential areas for viticulture, despite some likely threats associated with diseases. Regardless of the inherent uncertainties, this approach provides valuable information for implementing proper and diverse adaptation measures in different European regions.
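As an illustration of what such a bioclimatic index looks like, the sketch below computes the classical Winkler growing degree-day index (base 10 °C, 1 April to 31 October in the Northern Hemisphere) from daily mean temperature. Whether this particular index is among the seven used in the study is an assumption; the code is illustrative only.

```python
import numpy as np
import pandas as pd

def winkler_index(daily_mean_temp: pd.Series, base: float = 10.0) -> float:
    """Winkler (growing degree-day) index: accumulated excess of daily mean
    temperature over a 10 degC base across the April-October growing season."""
    season = daily_mean_temp[(daily_mean_temp.index.month >= 4) &
                             (daily_mean_temp.index.month <= 10)]
    return float((season - base).clip(lower=0.0).sum())

# Toy usage with a synthetic annual temperature cycle for one year.
days = pd.date_range("1990-01-01", "1990-12-31", freq="D")
t_mean = pd.Series(12 + 10 * np.sin(2 * np.pi * (days.dayofyear - 105) / 365),
                   index=days)
print(round(winkler_index(t_mean)))
```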
Abstract:
Hybrid multiprocessor architectures, which combine re-configurable computing and multiprocessors on a chip, are being proposed to transcend the performance of standard multi-core parallel systems. Both fine-grained and coarse-grained parallel algorithm implementations are feasible in such hybrid frameworks. A compositional strategy for designing fine-grained multi-phase regular processor arrays to target hybrid architectures is presented in this paper. The method is based on deriving component designs using classical regular array techniques and composing the components into a unified global design. Phase changes and data routing at run-time are characteristic of the resulting designs. In order to describe the data transfer between phases, the concept of a communication domain is introduced so that the producer–consumer relationship arising from multi-phase computation can be treated in a unified way as a data routing phase. This technique is applied to derive new designs of multi-phase regular arrays with different dataflow between phases of computation.
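A toy sketch of the multi-phase idea, with invented names: the producer-consumer relation between two computation phases is expressed as an explicit routing step driven by a "communication domain" mapping.

```python
def phase1(cells, data):
    # Each phase-1 processing element works on its own input item.
    return [cell(x) for cell, x in zip(cells, data)]

def route(produced, domain):
    # domain[j] lists which phase-1 outputs feed phase-2 cell j, i.e. the
    # producer-consumer relation treated as a data-routing phase of its own.
    return [[produced[i] for i in domain[j]] for j in range(len(domain))]

def phase2(cells, routed):
    return [cell(inputs) for cell, inputs in zip(cells, routed)]

# Toy usage: square each input in phase 1, then sum the routed groups in phase 2.
squares = phase1([lambda x: x * x] * 4, [1, 2, 3, 4])
grouped = route(squares, {0: [0, 1], 1: [2, 3]})
print(phase2([sum, sum], grouped))          # [5, 25]
```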
Abstract:
Many applications, such as intermittent data assimilation, lead to a recursive application of Bayesian inference within a Monte Carlo context. Popular data assimilation algorithms include sequential Monte Carlo methods and ensemble Kalman filters (EnKFs). These methods differ in the way Bayesian inference is implemented. Sequential Monte Carlo methods rely on importance sampling combined with a resampling step, while EnKFs utilize a linear transformation of Monte Carlo samples based on the classic Kalman filter. While EnKFs have proven to be quite robust even for small ensemble sizes, they are not consistent since their derivation relies on a linear regression ansatz. In this paper, we propose another transform method, which does not rely on any a priori assumptions on the underlying prior and posterior distributions. The new method is based on solving an optimal transportation problem for discrete random variables. © 2013, Society for Industrial and Applied Mathematics
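The transform described above can be sketched along the lines of an ensemble transform particle filter: importance weights computed from the likelihood define one marginal of a discrete optimal transport problem, and the optimal coupling is then used as a linear transformation of the prior ensemble. The implementation below assumes a quadratic transport cost and uses SciPy's LP solver; it is an illustrative sketch, not the paper's code.

```python
import numpy as np
from scipy.optimize import linprog

def etpf_update(ens, likelihood):
    """Ensemble transform via discrete optimal transport (illustrative sketch).

    ens        : (M, d) array of prior ensemble members, equal weights 1/M
    likelihood : (M,) array of likelihood values p(y | x_i)
    Returns a transformed (posterior) ensemble of shape (M, d).
    """
    M, _ = ens.shape
    w = likelihood / likelihood.sum()        # importance (posterior) weights

    # Quadratic transport cost between ensemble members.
    C = ((ens[:, None, :] - ens[None, :, :]) ** 2).sum(axis=2)

    # LP: minimise <C, T> subject to row sums = w, column sums = 1/M, T >= 0.
    A_eq, b_eq = [], []
    for i in range(M):                       # row-sum constraints
        row = np.zeros(M * M)
        row[i * M:(i + 1) * M] = 1.0
        A_eq.append(row)
        b_eq.append(w[i])
    for j in range(M):                       # column-sum constraints
        col = np.zeros(M * M)
        col[j::M] = 1.0
        A_eq.append(col)
        b_eq.append(1.0 / M)
    res = linprog(C.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    T = res.x.reshape(M, M)

    # Each transformed member is a convex combination of the prior members.
    return M * T.T @ ens

# Toy usage: scalar Gaussian prior, observation y = 1 with unit observation noise.
rng = np.random.default_rng(1)
prior = rng.normal(0.0, 1.0, size=(40, 1))
lik = np.exp(-0.5 * (1.0 - prior[:, 0]) ** 2)
posterior = etpf_update(prior, lik)
print(prior.mean(), posterior.mean())        # the mean moves towards y
```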
Abstract:
We present a Galerkin method with piecewise polynomial continuous elements for fully nonlinear elliptic equations. A key tool is the discretization proposed in Lakkis and Pryer, 2011, allowing us to work directly on the strong form of a linear PDE. An added benefit of making use of this discretization method is that a recovered (finite element) Hessian is a byproduct of the solution process. We build on the linear method and ultimately construct two different methodologies for the solution of second order fully nonlinear PDEs. Benchmark numerical results illustrate the convergence properties of the scheme for some test problems as well as the Monge–Ampère equation and the Pucci equation.
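The abstract does not spell out the two methodologies, but one standard way to build on a solver for the strong form of a linear PDE is a Newton-type iteration for F(D^2 u) = 0, in which every step is again a linear nondivergence-form equation; the following is offered only as a hedged illustration, not necessarily the paper's exact construction.

```latex
% Newton-type linearisation of a fully nonlinear problem F(D^2 u) = 0.
A_k := \frac{\partial F}{\partial (D^2 u)}\bigl(D^2 u_k\bigr),
\qquad
A_k : D^2 u_{k+1} = A_k : D^2 u_k - F(D^2 u_k).
```

Each Newton step is therefore a linear PDE in strong form, to which the linear discretization and its recovered Hessian apply directly.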