51 results for Predicting model
Abstract:
In this paper we focus on the one-year-ahead prediction of the daily trajectory of electricity peak demand during the winter season in Central England and Wales. We define a Bayesian hierarchical model for predicting the winter trajectories and present results based on past observed weather. Thanks to the flexibility of the Bayesian approach, we are able to produce the marginal posterior distributions of all the predictands of interest. This is fundamental progress with respect to classical methods. The results are encouraging in both skill and representation of uncertainty. Further extensions are straightforward, at least in principle. The two main ones consist in conditioning the weather generator model on additional information, such as knowledge of the first part of the winter and/or the seasonal weather forecast. Copyright (C) 2006 John Wiley & Sons, Ltd.
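As a rough illustration of the posterior-predictive reasoning described above, the sketch below uses a toy conjugate normal model in place of the paper's full hierarchical specification; all data, priors and thresholds are invented.

```python
# Sketch: posterior-predictive simulation of a winter peak-demand
# trajectory. A toy normal model with conjugate updates stands in for
# the paper's Bayesian hierarchical model; all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical past data: daily winter peak demand (GW), 10 winters x 90 days.
past = 45 + 5 * rng.standard_normal((10, 90))

# Conjugate normal update for each winter day (known variance for brevity).
sigma = 5.0                      # assumed observation s.d. (GW)
mu0, tau0 = 45.0, 10.0           # vague prior on the daily mean
n = past.shape[0]
ybar = past.mean(axis=0)
tau_post = 1.0 / np.sqrt(n / sigma**2 + 1 / tau0**2)
mu_post = tau_post**2 * (n * ybar / sigma**2 + mu0 / tau0**2)

# Posterior-predictive draws of next winter's 90-day trajectory.
draws = rng.normal(mu_post, np.sqrt(tau_post**2 + sigma**2), size=(5000, 90))

# Marginal posterior distribution of a derived predictand: the winter maximum.
winter_max = draws.max(axis=1)
print("P(winter max > 60 GW) =", (winter_max > 60).mean())
print("90% interval:", np.percentile(winter_max, [5, 95]))
```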
Abstract:
An important element of the developing field of proteomics is to understand protein-protein interactions and other functional links amongst genes. Across-species correlation methods for detecting functional links work on the premise that functionally linked proteins will tend to show a common pattern of presence and absence across a range of genomes. We describe a maximum likelihood statistical model for predicting functional gene linkages. The method detects independent instances of the correlated gain or loss of pairs of proteins on phylogenetic trees, reducing the high rates of false positives observed in conventional across-species methods that do not explicitly incorporate a phylogeny. We show, in a dataset of 10,551 protein pairs, that the phylogenetic method improves by up to 35% on across-species analyses at identifying known functionally linked proteins. The method shows that protein pairs with at least two to three correlated events of gain or loss are almost certainly functionally linked. Contingent evolution, in which one gene's presence or absence depends upon the presence of another, can also be detected phylogenetically, and may identify genes whose functional significance depends upon their interactions with other genes. Incorporating phylogenetic information improves the prediction of functional linkages. The improvement derives from having a lower rate of false positives and from detecting trends that across-species analyses miss. Phylogenetic methods can easily be incorporated into the screening of large-scale bioinformatics datasets to identify sets of protein links and to characterise gene networks.
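The following sketch illustrates the kind of phylogenetic likelihood comparison such a method rests on: a four-state Markov model of paired gene presence/absence evaluated by Felsenstein pruning on a toy tree, contrasting independent and correlated gain/loss rates. The tree, branch lengths and rates are invented, and the real method additionally maximises these likelihoods over the rate parameters.

```python
# Sketch: likelihood of a gene pair's presence/absence pattern on a toy
# phylogeny, under independent vs. correlated gain/loss models.
import numpy as np
from scipy.linalg import expm

STATES = ["00", "01", "10", "11"]        # (gene A, gene B): absent/present

def q_matrix(gain, loss, interaction=1.0):
    """Rate matrix over pair states; 'interaction' makes gene B's fate
    depend on gene A (interaction=1 -> independent model)."""
    Q = np.zeros((4, 4))
    moves = {(0, 2): gain, (2, 0): loss,          # gene A gain/loss, B absent
             (1, 3): gain, (3, 1): loss,          # gene A gain/loss, B present
             (0, 1): gain, (1, 0): loss,          # gene B gain/loss, A absent
             (2, 3): gain * interaction,          # B gained faster when A present
             (3, 2): loss / interaction}          # B lost more slowly when A present
    for (i, j), rate in moves.items():
        Q[i, j] = rate
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def prune(node, Q):
    """Felsenstein pruning: partial likelihoods of the subtree below 'node'."""
    if isinstance(node, str):                     # leaf: observed pair state
        L = np.zeros(4); L[STATES.index(node)] = 1.0
        return L
    (child_l, t_l), (child_r, t_r) = node
    return (expm(Q * t_l) @ prune(child_l, Q)) * (expm(Q * t_r) @ prune(child_r, Q))

# Toy 4-taxon tree ((sp1, sp2), (sp3, sp4)) with branch lengths.
tree = (((("11", 0.3), ("11", 0.3)), 0.2), ((("00", 0.3), ("00", 0.3)), 0.2))
root_prior = np.full(4, 0.25)

for label, inter in [("independent", 1.0), ("correlated", 5.0)]:
    L = prune(tree, q_matrix(gain=0.5, loss=0.5, interaction=inter))
    print(label, "log-likelihood:", np.log(root_prior @ L))
```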
Abstract:
A combined mathematical model for predicting heat penetration and microbial inactivation in a solid body heated by conduction was tested experimentally by inoculating agar cylinders with Salmonella typhimurium or Enterococcus faecium and heating them in a water bath. Regions of growth where bacteria had survived after heating were measured by image analysis and compared with model predictions. Visualisation of the regions of growth was improved by incorporating chromogenic metabolic indicators into the agar. Preliminary tests established that the model performed satisfactorily with both test organisms and with cylinders of different diameters. The model was then used in simulation studies in which the parameters D, z, inoculum size, cylinder diameter and heating temperature were systematically varied. These simulations showed that the biological variables D, z and inoculum size had a relatively small effect on the time needed to eliminate bacteria at the cylinder axis, whereas the physical variables heating temperature and cylinder diameter had a much greater effect. (c) 2005 Elsevier B.V. All rights reserved.
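A minimal sketch of the kind of combined calculation such a model performs, assuming an explicit finite-difference solution of radial conduction coupled to first-order thermal death governed by D and z values; the geometry, thermal properties and kinetic parameters below are illustrative, not the paper's.

```python
# Sketch: coupled heat-conduction / thermal-inactivation calculation for
# an agar cylinder heated in a water bath. All parameter values assumed.
import numpy as np

R, nr = 0.01, 50                     # cylinder radius (m), radial nodes
alpha = 1.4e-7                       # thermal diffusivity (m^2/s), assumed
T0, T_bath = 20.0, 60.0              # initial and water-bath temperature (C)
D_ref, z, T_ref = 2.0, 5.0, 60.0     # D (min) at T_ref, z-value (C), assumed
log_N0 = 6.0                         # inoculum, log10 CFU

r = np.linspace(0, R, nr)
dr = r[1] - r[0]
dt = 0.2 * dr**2 / alpha             # stable explicit time step (s)
T = np.full(nr, T0)
logN = np.full(nr, log_N0)

for step in range(int(1800 / dt)):   # 30 min of heating
    Tn = T.copy()
    # explicit scheme for dT/dt = alpha*(T'' + T'/r)
    Tn[1:-1] = T[1:-1] + alpha * dt * (
        (T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2
        + (T[2:] - T[:-2]) / (2 * dr * r[1:-1]))
    Tn[0] = T[0] + alpha * dt * 4 * (T[1] - T[0]) / dr**2   # symmetry at axis
    Tn[-1] = T_bath                  # bath holds the surface temperature
    T = Tn
    # first-order thermal death: dt/D(T) log reductions per step
    D = D_ref * 10 ** ((T_ref - T) / z)      # minutes
    logN -= (dt / 60.0) / D

print("axis temperature after 30 min: %.1f C" % T[0])
print("surviving log10 count at axis: %.2f" % logN[0])
```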
Abstract:
A new primary model based on a thermodynamically consistent first-order kinetic approach was constructed to describe non-log-linear inactivation kinetics of pressure-treated bacteria. The model assumes a first-order process in which the specific inactivation rate changes inversely with the square root of time. The model gave reasonable fits to experimental data over six to seven orders of magnitude. It was also tested on 138 published data sets and provided good fits in about 70% of cases, in which the shape of the curve followed the typical convex upward form. In the remainder of the published examples, curves contained additional shoulder regions or extended tail regions. Curves with shoulders could be accommodated by including an additional time delay parameter, and curves with tails could be accommodated by omitting points in the tail beyond the point at which survival levels remained more or less constant. The model parameters varied regularly with pressure, which may reflect a genuine mechanistic basis for the model. This property also allowed the calculation of (a) parameters analogous to the decimal reduction time D and to z (the temperature increase needed to change the D value by a factor of 10) in thermal processing, and hence the processing conditions needed to attain a desired level of inactivation; and (b) the apparent thermodynamic volumes of activation associated with the lethal events. The hypothesis that inactivation rates changed as a function of the square root of time would be consistent with a diffusion-limited process.
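Integrating the stated rate law, dN/dt = -(k/sqrt(t)) N, gives survival curves that are linear in the square root of time: log10(N/N0) = -b*sqrt(t) with b = 2k/ln(10). A minimal fitting sketch on synthetic data:

```python
# Sketch: the square-root-of-time first-order inactivation model,
# fitted to synthetic pressure-treatment survival data.
import numpy as np
from scipy.optimize import curve_fit

def log_survival(t, b):
    return -b * np.sqrt(t)

t = np.linspace(0, 30, 16)                      # holding time (min)
rng = np.random.default_rng(1)
obs = log_survival(t, b=1.2) + 0.15 * rng.standard_normal(t.size)

(b_hat,), cov = curve_fit(log_survival, t, obs)
print("fitted b = %.3f" % b_hat)

# Analogue of the decimal reduction time: time to the first log10
# reduction, from -b*sqrt(t) = -1  ->  t = (1/b)^2.
print("time to 1 log10 reduction: %.2f min" % (1.0 / b_hat) ** 2)
```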
Abstract:
Background: People with schizophrenia are more violent than the general population, but this increased risk is attributable to the actions of a small subgroup. Identifying those at risk has become an essential part of clinical practice. Aims: To estimate the risk factors for assault in patients with schizophrenia. Methods: Two hundred seventy-one patients with schizophrenia were interviewed using an extensive battery of instruments. Assault was measured from multiple data sources over the next 2 years and criminal records were obtained. Multiple sociodemographic and clinical variables measured at baseline were examined as possible predictors of assault during follow-up. Results: Sixty-nine patients (25%) committed assault during the 2-year follow-up. The model that best predicted assault included a history of recent assault (OR 2.33, 95% CI 1.17-4.61), a previous violent conviction (OR 2.02, 95% CI 1.04-3.87), having received special education (OR 2.76, 95% CI 1.22-6.26) and alcohol abuse (OR 3.55, 95% CI 1.24-10.2). Conclusions: Previously established risk factors, including a history of violence and alcohol abuse, are replicated in this study. Although low premorbid IQ did not predict violence, a need for special education did. (C) 2003 Published by Elsevier B.V.
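As a sketch of how such a model yields individual predictions, the reported odds ratios can be combined on the log-odds scale; the intercept below is a hypothetical baseline, since the abstract does not report one.

```python
# Sketch: logistic risk score from the reported odds ratios,
# log-odds = intercept + sum(ln(OR_i) * x_i). The intercept is an
# assumed baseline, not a published estimate.
import math

odds_ratios = {
    "recent_assault": 2.33,
    "previous_violent_conviction": 2.02,
    "special_education": 2.76,
    "alcohol_abuse": 3.55,
}
intercept = math.log(0.10 / 0.90)    # assumed 10% risk with no risk factors

def assault_risk(**present):
    """Predicted 2-year assault probability given binary risk factors."""
    logit = intercept + sum(math.log(odds_ratios[k])
                            for k, v in present.items() if v)
    return 1.0 / (1.0 + math.exp(-logit))

print("no risk factors:  %.2f" % assault_risk())
print("alcohol + recent: %.2f" % assault_risk(alcohol_abuse=True,
                                              recent_assault=True))
```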
Abstract:
This paper addresses the need for accurate predictions of fault inflow, i.e. the number of faults found in consecutive project weeks, in highly iterative processes. In such processes, in contrast to waterfall-like processes, fault repair and development of new features run almost in parallel. Given accurate predictions of fault inflow, managers could dynamically re-allocate resources between these different tasks in a more adequate way. Furthermore, managers could react with process improvements when the expected fault inflow is higher than desired. This study suggests software reliability growth models (SRGMs) for predicting fault inflow. Although these models were originally developed for traditional processes, their performance in highly iterative processes is investigated here. Additionally, a simple linear model is developed and compared to the SRGMs. The paper provides results from applying these models to fault data from three different industrial projects. One of the key findings of this study is that some SRGMs are applicable for predicting fault inflow in highly iterative processes. Moreover, the results show that the simple linear model represents a valid alternative to the SRGMs, as it provides reasonably accurate predictions and performs better in many cases.
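A sketch of the kind of comparison the study performs, using the Goel-Okumoto mean-value function as one representative SRGM against a simple linear fit; the weekly fault data are synthetic and the choice of this particular SRGM is illustrative, not the paper's selection.

```python
# Sketch: fitting one representative SRGM (Goel-Okumoto,
# m(t) = a*(1 - exp(-b*t))) and a simple linear model to cumulative
# weekly fault counts, then comparing hold-out prediction error.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 21)
rng = np.random.default_rng(2)
cum_faults = goel_okumoto(weeks, 180, 0.12) + 4 * rng.standard_normal(weeks.size)

train, test = weeks[:15], weeks[15:]
popt, _ = curve_fit(goel_okumoto, train, cum_faults[:15], p0=(150, 0.1))
lin = np.polyfit(train, cum_faults[:15], 1)

srgm_err = np.abs(goel_okumoto(test, *popt) - cum_faults[15:]).mean()
lin_err = np.abs(np.polyval(lin, test) - cum_faults[15:]).mean()
print("mean abs error, SRGM:   %.1f faults" % srgm_err)
print("mean abs error, linear: %.1f faults" % lin_err)
```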
Abstract:
The hierarchical and "bob" (or branch-on-branch) models are tube-based computational models recently developed for predicting the linear rheology of general mixtures of polydisperse branched polymers. These two models are based on a similar tube-theory framework but differ in their numerical implementation and in the details of their relaxation mechanisms. We present a detailed overview of the similarities and differences of these models and examine the effects of these differences on the predictions of the linear viscoelastic properties of a set of representative branched polymer samples, in order to give a general picture of the performance of these models. Our analysis confirms that the hierarchical and bob models quantitatively predict the linear rheology of a wide range of branched polymer melts, but it also indicates that there is still no unique solution covering all types of branched polymers without case-by-case adjustment of parameters such as the dilution exponent alpha and the factor p^2, which defines the hopping distance of a branch point relative to the tube diameter. An updated version of the hierarchical model, which shows improved computational efficiency and refined relaxation mechanisms, is introduced and used in these analyses.
Abstract:
The Water Framework Directive has caused a paradigm shift towards the integrated management of recreational water quality through the development of drainage basin-wide programmes of measures. This has increased the need for a cost-effective diagnostic tool capable of accurately predicting riverine faecal indicator organism (FIO) concentrations. This paper outlines the application of models developed to fulfil this need, which represent the first transferable generic FIO models developed for the UK that incorporate direct measures of key FIO sources (namely human and livestock population data) as predictor variables. We apply a recently developed transfer methodology, which enables the quantification of geometric mean presumptive faecal coliform and presumptive intestinal enterococci concentrations for base- and high-flow during the summer bathing season in unmonitored UK watercourses, to predict FIO concentrations in the Humber river basin district. Because the FIO models incorporate explanatory variables which allow the effects of policy measures influencing livestock stocking rates to be assessed, we carry out an empirical analysis of the differential effects of seven land use management and policy instruments (fiscal constraint, production constraint, cost intervention, area intervention, demand-side constraint, input constraint, and micro-level land use management), all of which can be used to reduce riverine FIO concentrations. This research provides insights into FIO source apportionment, explores a selection of pollution remediation strategies, and examines the spatial differentiation of land use policies which could be implemented to deliver river quality improvements. All of the policy tools we model reduce FIO concentrations in rivers, but our research suggests that the installation of streamside fencing in intensive milk-producing areas may be the single most effective land management strategy for reducing riverine microbial pollution.
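The general shape of such a transferable model can be sketched as a log10-linear regression on catchment source variables; the predictor names and coefficients below are hypothetical placeholders, not the published model.

```python
# Sketch: a log10-linear FIO model with human and livestock source
# variables as predictors, separate fits per flow condition.
# Coefficients are invented for illustration.
import numpy as np

# hypothetical high-flow coefficients: intercept, log10(human density),
# log10(livestock density), % improved pasture
beta_high = np.array([2.1, 0.45, 0.30, 0.012])

def log10_fio(human_density, livestock_density, pct_pasture, beta=beta_high):
    x = np.array([1.0, np.log10(human_density),
                  np.log10(livestock_density), pct_pasture])
    return beta @ x

print("predicted log10 PFC (cfu/100 ml): %.2f"
      % log10_fio(human_density=120, livestock_density=60, pct_pasture=35))
```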
Abstract:
Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling cyanobacterial behaviour in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of bloom events in lakes, reservoirs and rivers. A new deterministic mathematical model was developed, which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of bloom formation, vertical migration and lateral transport of colonies within river environments, taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients and temperature. A parameter sensitivity analysis using a one-at-a-time approach was carried out, with two objectives: to identify the key parameters controlling the growth and movement patterns of cyanobacteria, and to provide a means for model validation. The results of the analysis suggested that the maximum growth rate and the day length period were the most significant parameters in determining population growth and colony depth, respectively.
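A minimal sketch of the one-at-a-time procedure, applied to a toy Monod-style growth model standing in for the full river model; the parameter values and the normalised sensitivity index are illustrative.

```python
# Sketch: one-at-a-time (OAT) sensitivity analysis around a toy
# cyanobacteria growth model. The real model also tracks vertical
# migration and lateral transport.
import numpy as np

base = {"mu_max": 1.1,    # maximum growth rate (1/day)
        "I_k": 100.0,     # light half-saturation (umol m-2 s-1)
        "K_n": 0.02,      # nutrient half-saturation (mg/L)
        "day_len": 12.0}  # day length (h)

def final_biomass(p, days=20, I=250.0, N=0.05, B0=0.1):
    """Biomass after 'days' of dB/dt = mu * B with light/nutrient limits."""
    mu = (p["mu_max"] * I / (I + p["I_k"]) * N / (N + p["K_n"])
          * p["day_len"] / 24.0)
    return B0 * np.exp(mu * days)

ref = final_biomass(base)
for name in base:
    p = dict(base); p[name] *= 1.10                 # +10% perturbation
    S = ((final_biomass(p) - ref) / ref) / 0.10     # normalised sensitivity
    print("%-8s S = %+.2f" % (name, S))
```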
Abstract:
We explore the potential for making statistical decadal predictions of sea surface temperatures (SSTs) in a perfect model analysis, with a focus on the Atlantic basin. Various statistical methods (Lagged correlations, Linear Inverse Modelling and Constructed Analogue) are found to have significant skill in predicting the internal variability of Atlantic SSTs for up to a decade ahead in control integrations of two different global climate models (GCMs), namely HadCM3 and HadGEM1. Statistical methods which consider non-local information tend to perform best, but which is the most successful statistical method depends on the region considered, GCM data used and prediction lead time. However, the Constructed Analogue method tends to have the highest skill at longer lead times. Importantly, the regions of greatest prediction skill can be very different to regions identified as potentially predictable from variance explained arguments. This finding suggests that significant local decadal variability is not necessarily a prerequisite for skillful decadal predictions, and that the statistical methods are capturing some of the dynamics of low-frequency SST evolution. In particular, using data from HadGEM1, significant skill at lead times of 6–10 years is found in the tropical North Atlantic, a region with relatively little decadal variability compared to interannual variability. This skill appears to come from reconstructing the SSTs in the far north Atlantic, suggesting that the more northern latitudes are optimal for SST observations to improve predictions. We additionally explore whether adding sub-surface temperature data improves these decadal statistical predictions, and find that, again, it depends on the region, prediction lead time and GCM data used. Overall, we argue that the estimated prediction skill motivates the further development of statistical decadal predictions of SSTs as a benchmark for current and future GCM-based decadal climate predictions.
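A sketch of the constructed-analogue step, under the assumption that it amounts to a regularised least-squares combination of library states whose weights are then applied to the lead-time-shifted states; the fields below are random stand-ins for GCM output.

```python
# Sketch: the constructed-analogue idea. Express today's SST anomaly
# pattern as a least-squares combination of past states from a long
# control run, then apply the same weights to the states tau years later.
import numpy as np

rng = np.random.default_rng(3)
n_grid, n_library, tau = 200, 400, 5      # grid points, library size, lead (yrs)

library = rng.standard_normal((n_grid, n_library))   # past anomaly states
future = rng.standard_normal((n_grid, n_library))    # same states tau years on
current = rng.standard_normal(n_grid)                # today's anomaly pattern

# weights w minimising ||library @ w - current||, with a ridge term
# for numerical stability
lam = 1.0
A = library.T @ library + lam * np.eye(n_library)
w = np.linalg.solve(A, library.T @ current)

forecast = future @ w                     # constructed-analogue forecast
print("forecast field shape:", forecast.shape)
```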
Abstract:
The recent decline in the open magnetic flux of the Sun heralds the end of the Grand Solar Maximum (GSM) that has persisted throughout the space age, during which the largest‐fluence Solar Energetic Particle (SEP) events have been rare and Galactic Cosmic Ray (GCR) fluxes have been relatively low. In the absence of a predictive model of the solar dynamo, we here make analogue forecasts by studying past variations of solar activity in order to evaluate how long‐term change in space climate may influence the hazardous energetic particle environment of the Earth in the future. We predict the probable future variations in GCR flux, near‐Earth interplanetary magnetic field (IMF), sunspot number, and the probability of large SEP events, all deduced from cosmogenic isotope abundance changes following 24 GSMs in a 9300‐year record.
Abstract:
The potential of visible-near infrared spectra obtained using a light backscatter sensor, in conjunction with chemometrics, to predict curd moisture and whey fat content in a cheese vat was examined. A three-factor (renneting temperature, calcium chloride, cutting time) central composite design was carried out in triplicate. Spectra (300–1,100 nm) of the product in the cheese vat were captured during syneresis using a prototype light backscatter sensor. Stirring commenced upon cutting of the gel, and samples of curd and whey were removed at 10 min intervals and analyzed for curd moisture and whey fat content. The spectral data were used to develop models for predicting curd moisture and whey fat contents using partial least squares regression. Subjecting the spectral data set to jack-knifing improved the accuracy of the models. The whey fat models (R = 0.91, 0.95) and the curd moisture models (R = 0.86, 0.89) provided good and approximate predictions, respectively. Visible-near infrared spectroscopy was found to have potential for the prediction of important syneresis indices in stirred cheese vats.
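A minimal sketch of the chemometric step, assuming a standard PLS calibration with cross-validation; the spectra and moisture values are synthetic, and the jack-knife variable selection used in the paper is omitted.

```python
# Sketch: PLS calibration of curd moisture against backscatter spectra.
# Synthetic spectra stand in for the 300-1,100 nm sensor data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(4)
n_samples, n_wavelengths = 60, 400
X = rng.standard_normal((n_samples, n_wavelengths))
true_w = np.zeros(n_wavelengths); true_w[100:140] = 0.05
y = 70 + X @ true_w + 0.3 * rng.standard_normal(n_samples)   # % moisture

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
r = np.corrcoef(y, y_cv)[0, 1]
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print("R = %.2f, RMSECV = %.2f %% moisture" % (r, rmsecv))
```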
Abstract:
The objective of this study was to investigate the potential application of mid-infrared spectroscopy for determination of selected sensory attributes in a range of experimentally manufactured processed cheese samples. This study also evaluates mid-infrared spectroscopy against other recently proposed techniques for predicting sensory texture attributes. Processed cheeses (n = 32) of varying compositions were manufactured on a pilot scale. After 2 and 4 wk of storage at 4 degrees C, mid-infrared spectra (640 to 4,000 cm(-1)) were recorded and samples were scored on a scale of 0 to 100 for 9 attributes using descriptive sensory analysis. Models were developed by partial least squares regression using raw and pretreated spectra. The mouth-coating and mass-forming models were improved by using a reduced spectral range (930 to 1,767 cm(-1)). The remaining attributes were most successfully modeled using a combined range (930 to 1,767 cm(-1) and 2,839 to 4,000 cm(-1)). The root mean square errors of cross-validation for the models were 7.4 (firmness; range 65.3), 4.6 (rubbery; range 41.7), 7.1 (creamy; range 60.9), 5.1 (chewy; range 43.3), 5.2 (mouth-coating; range 37.4), 5.3 (fragmentable; range 51.0), 7.4 (melting; range 69.3), and 3.1 (mass-forming; range 23.6). These models had good practical utility. Model accuracy ranged from approximate quantitative predictions to excellent predictions (range error ratio = 9.6). In general, the models compared favorably with previously reported instrumental texture models and near-infrared models, although the creamy, chewy, and melting models were slightly weaker than the previously reported near-infrared models. We concluded that mid-infrared spectroscopy could be successfully used for the nondestructive and objective assessment of processed cheese sensory quality.
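Taking the range error ratio to be the attribute range divided by the RMSECV, the reported numbers can be checked directly; the best value reproduces the 9.6 quoted above (fragmentable: 51.0/5.3).

```python
# Sketch: range error ratio (RER = attribute range / RMSECV) computed
# from the RMSECV and range values reported in the abstract.
models = {"firmness": (7.4, 65.3), "rubbery": (4.6, 41.7),
          "creamy": (7.1, 60.9), "chewy": (5.1, 43.3),
          "mouth-coating": (5.2, 37.4), "fragmentable": (5.3, 51.0),
          "melting": (7.4, 69.3), "mass-forming": (3.1, 23.6)}

for attr, (rmsecv, rng_) in sorted(models.items(),
                                   key=lambda kv: kv[1][1] / kv[1][0],
                                   reverse=True):
    print("%-13s RER = %.1f" % (attr, rng_ / rmsecv))
```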
Abstract:
Pollination is one of the most important ecosystem services in agroecosystems and supports food production. Pollinators are potentially at risk of exposure to pesticides; the main route of exposure is direct contact with, and in some cases ingestion of, contaminated materials such as pollen, nectar, flowers and foliage. To date there are no suitable methods for predicting pesticide exposure for pollinators; official procedures to assess pesticide risk are therefore based on a Hazard Quotient. Here we develop a procedure to assess exposure and risk for pollinators based on the foraging behaviour of honeybees (Apis mellifera), using this species as an indicator representative of pollinating insects. The method was applied in 13 European field sites with different climatic, landscape and land use characteristics. The level of risk during the crop growing season was evaluated as a function of the active ingredients used and the application regime. Risk levels were primarily determined by the agronomic practices employed (i.e. crop type, pest control method, pesticide use), and there was a clear temporal partitioning of risks through time. Generally the risk was higher in sites cultivated with permanent crops, such as vineyards and olives, than in annual crops, such as cereals and oilseed rape. The greatest level of risk was generally found at the beginning of the growing season for annual crops and later, in June–July, for permanent crops.
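A sketch of the Hazard Quotient on which the official procedures mentioned above rely, assuming the conventional definition (field application rate in g a.i./ha divided by contact LD50 in ug/bee, with a value around 50 as a common higher-tier trigger); the products and values below are invented, not the study's data.

```python
# Sketch: bee Hazard Quotient, conventionally
# HQ = application rate (g a.i./ha) / contact LD50 (ug a.i./bee).
def hazard_quotient(application_rate_g_ha, ld50_ug_per_bee):
    return application_rate_g_ha / ld50_ug_per_bee

products = {"insecticide A": (100.0, 0.05),   # (g a.i./ha, ug/bee), hypothetical
            "fungicide B": (1500.0, 100.0)}

for name, (rate, ld50) in products.items():
    hq = hazard_quotient(rate, ld50)
    flag = "(trigger exceeded)" if hq > 50 else ""
    print("%-13s HQ = %8.1f %s" % (name, hq, flag))
```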
Abstract:
The IPD Annual Index is the largest and most comprehensive real estate market index available in the UK. Such coverage, however, inevitably leads to delays in publication. In contrast, there are a number of quarterly and monthly indices which are published within days of the year end but which lack comparable coverage in terms of size and number of properties. This paper analyses these smaller but more timely indices to see whether they can be used to predict the performance of the IPD Annual Index. Using a number of measures of forecasting accuracy, it is shown that the smaller indices provide unbiased and efficient predictions of the IPD Annual Index, and that they significantly outperform a naive no-change model, although no one index performs significantly better than the others. The more timely indices do not, however, perfectly track the IPD Annual Index; as a result, any short-run predictions of performance will be subject to a degree of error. Nevertheless, the more timely indices, although lacking authoritative coverage, provide a valuable service to investors, giving good estimates of real estate performance well before the publication of the IPD Annual Index.
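A sketch of the style of evaluation described, assuming standard accuracy measures and Theil's U against the naive no-change benchmark; the return series below are synthetic.

```python
# Sketch: comparing a timely index's predictions of annual returns
# against a naive no-change model with standard accuracy measures.
import numpy as np

rng = np.random.default_rng(5)
annual = rng.normal(8.0, 6.0, 20)                    # annual returns (%), synthetic
timely = annual + rng.normal(0.0, 1.5, 20)           # timely-index predictions
naive = np.concatenate(([annual[0]], annual[:-1]))   # no-change: last year's value

def report(name, pred):
    err = pred - annual
    print("%-12s MAE = %.2f  RMSE = %.2f"
          % (name, np.abs(err).mean(), np.sqrt((err ** 2).mean())))

report("timely index", timely)
report("naive", naive)

# Theil's U < 1 means the timely index beats the no-change benchmark.
U = np.sqrt(((timely - annual) ** 2).mean() / ((naive - annual) ** 2).mean())
print("Theil's U = %.2f" % U)
```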