916 results for continuous-resource model


Relevance:

30.00%

Publisher:

Abstract:

A continuous version of the hierarchical spherical model in dimension d = 4 is investigated. Two limit distributions of the block spin variable X(gamma), normalized with exponents gamma = d + 2 and gamma = d at and above the critical temperature, are established. These results are proven by solving certain evolution equations corresponding to the renormalization group (RG) transformation of the O(N) hierarchical spin model of block size L^d in the limit L ↓ 1 and N -> infinity. Starting far from the stationary Gaussian fixed point, the trajectories of this dynamical system pass through two different regimes with distinguishable crossover behavior. An interpretation of these trajectories is given by the geometric theory of functions, which describes precisely the motion of the Lee-Yang zeroes. The large-N limit of the RG transformation with L^d fixed equal to 2, at criticality, has recently been investigated in both the weak and strong coupling regimes by Watanabe (J. Stat. Phys. 115:1669-1713, 2004). Although our analysis deals only with the N = infinity case, it complements various aspects of that work.


The nonequilibrium phase transition of the one-dimensional triplet-creation model is investigated using the n-site approximation scheme. We find that the phase diagram in the space of parameters (gamma, D), where gamma is the particle decay probability and D is the diffusion probability, exhibits a tricritical point for n >= 4. However, fitting the tricritical coordinates (gamma_t, D_t) using data for 4 <= n <= 13 predicts that gamma_t becomes negative for n >= 26, thus indicating that the phase transition is always continuous in the limit n -> infinity. However, the large discrepancies between the critical parameters obtained in this limit and those obtained by Monte Carlo simulations, as well as a puzzling non-monotonic dependence of these parameters on the order of the approximation n, argue for the inadequacy of the n-site approximation for studying the triplet-creation model at computationally feasible values of n.
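The extrapolation step described above (fitting the tricritical coordinate against the order n of the approximation) can be sketched as a least-squares fit of gamma_t(n) = a + b/n. The data below are illustrative values, not the paper's, chosen only to show how a gamma_t that is positive at feasible n can extrapolate to a negative limit as n -> infinity:

```python
import numpy as np

def extrapolate_gamma_t(n_values, gamma_t_values):
    """Fit gamma_t(n) = a + b/n by least squares.
    a is the n -> infinity limit, b the leading finite-size correction."""
    A = np.column_stack([np.ones_like(n_values, dtype=float), 1.0 / n_values])
    coef, *_ = np.linalg.lstsq(A, np.asarray(gamma_t_values, dtype=float), rcond=None)
    return coef

# Illustrative (invented) tricritical decay probabilities shrinking with n
n = np.array([4, 6, 8, 10, 13])
g = np.array([0.12, 0.07, 0.045, 0.03, 0.015])
a, b = extrapolate_gamma_t(n, g)  # a < 0: transition continuous in the limit
```

With these invented points the fitted intercept is negative, mirroring the qualitative conclusion quoted in the abstract.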


We consider the problem of dichotomizing a continuous covariate when performing a regression analysis based on a generalized estimation approach. The problem involves estimation of the cutpoint for the covariate and testing the hypothesis that the binary covariate constructed from the continuous covariate has a significant impact on the outcome. Due to the multiple testing used to find the optimal cutpoint, we need to adjust the usual significance test to preserve the type-I error rate. We illustrate the techniques on a data set of patients given unrelated hematopoietic stem cell transplantation. Here the question is whether the CD34 cell dose given to the patient affects the outcome of the transplant, and what is the smallest cell dose needed for good outcomes.
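A maximally selected statistic with a permutation-based adjustment is one standard way to preserve the type-I error rate when the cutpoint is optimized over many candidates. The sketch below is not the paper's estimation procedure, and the dose/outcome data are hypothetical; it only illustrates the idea of re-optimizing the cutpoint inside each permutation:

```python
import numpy as np

rng = np.random.default_rng(1)

def best_cutpoint_stat(x, y):
    """Scan candidate cutpoints (10th-90th percentile grid) and return
    (best cutpoint, max |two-sample t statistic|)."""
    best_c, best_t = None, 0.0
    for c in np.quantile(x, np.linspace(0.1, 0.9, 17)):
        lo, hi = y[x <= c], y[x > c]
        if len(lo) < 5 or len(hi) < 5:
            continue
        se = np.sqrt(lo.var(ddof=1) / len(lo) + hi.var(ddof=1) / len(hi))
        t = abs(lo.mean() - hi.mean()) / se
        if t > best_t:
            best_c, best_t = c, t
    return best_c, best_t

def adjusted_p(x, y, n_perm=200):
    """Permutation null of the maximally selected statistic: reshuffling y and
    re-optimizing the cutpoint each time accounts for the multiple looks."""
    _, t_obs = best_cutpoint_stat(x, y)
    null = [best_cutpoint_stat(x, rng.permutation(y))[1] for _ in range(n_perm)]
    return (1 + sum(t >= t_obs for t in null)) / (1 + n_perm)

# Hypothetical example: outcome jumps once "dose" exceeds 0.5
dose = rng.uniform(0, 1, 120)
outcome = (dose > 0.5) * 2.0 + rng.normal(0, 1, 120)
cut, _ = best_cutpoint_stat(dose, outcome)
p = adjusted_p(dose, outcome)
```

The adjusted p-value is valid because the cutpoint search is repeated under every permutation, exactly the optimism that a naive single-cutpoint test would ignore.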


We consider a Bayesian approach to the nonlinear regression model in which the normal distribution for the error term is replaced by skewed distributions that account for both skewness and heavy tails, or for skewness alone. The data considered in this paper are repeated measurements taken over time on a set of individuals. Such multiple observations on the same individual generally produce serially correlated outcomes; our model therefore also allows for correlation between observations from the same individual. We illustrate the procedure using a data set on the growth curves of a clinical measurement in a group of pregnant women from an obstetrics clinic in Santiago, Chile. Parameter estimation and prediction were carried out using appropriate posterior simulation schemes based on Markov chain Monte Carlo methods. Besides the deviance information criterion (DIC) and the conditional predictive ordinate (CPO), we suggest the use of proper scoring rules based on the posterior predictive distribution for comparing models. For our data set, all of these criteria chose the skew-t model as the best model for the errors. The DIC and CPO criteria are also validated, for the model proposed here, through a simulation study. A conclusion of this study is that the DIC criterion is not trustworthy for this kind of complex model.
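The CPO criterion mentioned above has a standard Monte Carlo estimator: CPO_i is the harmonic mean, over posterior draws, of the likelihood of observation i. A minimal, numerically stable sketch (the function names are ours, and the input is a matrix of pointwise log-likelihoods as produced by any MCMC run):

```python
import numpy as np

def cpo(loglik):
    """CPO_i from an (S draws x n obs) matrix of pointwise log-likelihoods:
    the harmonic mean, over posterior draws, of the likelihood of y_i,
    computed on the log scale to avoid under/overflow."""
    S = loglik.shape[0]
    m = (-loglik).max(axis=0)                     # shift for stability
    log_cpo = np.log(S) - (m + np.log(np.exp(-loglik - m).sum(axis=0)))
    return np.exp(log_cpo)

def lpml(loglik):
    """Log pseudo-marginal likelihood, sum_i log CPO_i; higher is better,
    so models can be ranked by it alongside DIC."""
    return np.log(cpo(loglik)).sum()
```

A quick sanity check: if every draw assigns likelihood 0.5 to an observation, its CPO is exactly 0.5.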


Using the method of forcing, we construct a model for ZFC where CH does not hold and where there exists a connected compact topological space K of weight omega_1 < 2^omega such that every operator on the Banach space of continuous functions on K is multiplication by a continuous function plus a weakly compact operator. In particular, the Banach space of continuous functions on K is indecomposable.


Setup time reduction facilitates the flexibility needed for just-in-time production. An integrated steel mill with a meltshop, continuous caster and hot rolling mill is often operated as a set of decoupled processes. Setup time reduction provides the flexibility needed to reduce buffering, shorten lead times and create an integrated process flow. The interdependency of setup times, process flexibility and integration was analysed through system dynamics simulation. The results showed significant reductions in energy consumption and tied-up capital. It was concluded that setup time reduction in the hot strip mill can aid process integration and hence improve production economy while reducing environmental impact.


Background. Continuous subcutaneous insulin infusion (CSII) treatment among children with type 1 diabetes is increasing in Sweden. However, studies evaluating glycaemic control in children using CSII show inconsistent results. Omitting bolus insulin doses when using CSII may reduce glycaemic control among adolescents. The distribution of responsibility for diabetes self-management between children and parents is often unclear and needs clarification. There is much published support for continued parental involvement and shared diabetes management during adolescence. Guided Self-Determination (GSD) is an empowerment-based, person-centred reflection and problem-solving method intended to guide the patient to become self-sufficient and develop life skills for managing difficulties in diabetes self-management. This method has been adapted for adolescents and parents as Guided Self-Determination-Young (GSD-Y). This study aims to evaluate the effect of a GSD-Y intervention, delivered in groups to adolescents starting on insulin pumps and their parents, on diabetes-related family conflicts, perceived health and quality of life (QoL), and metabolic control. Here, we describe the protocol and plans for study enrolment. Methods. This study is designed as a randomized, controlled, prospective, multicentre study. Eighty patients between 12 and 18 years of age who are planning to start CSII will be included. All adolescents and their parents will receive standard insulin pump training. The education intervention will be conducted when CSII is started and at four appointments in the first 4 months after starting CSII. The primary outcome is haemoglobin A1c level. Secondary outcomes are perceived health and QoL, frequency of blood glucose self-monitoring and bolus doses, and use of carbohydrate counting.
The following instruments will be used to evaluate perceived health and QoL: Disabkids, 'Check your health', the Diabetes Family Conflict Scale and the Swedish Diabetes Empowerment Scale. Outcomes will be evaluated within and between groups by comparing data at baseline, and at 6 and 12 months after starting treatment. Results and discussion. In this study, we will assess the effect of starting an insulin pump together with the model of Guided Self-Determination to determine whether this approach leads to retention of improved glycaemic control, QoL, responsibility distribution and reduced diabetes-related conflicts in the family. Trial registration: Current controlled trials: ISRCTN22444034


A description of a data item's provenance can be provided in different forms, and which form is best depends on the intended use of that description. Because of this, different communities have made quite distinct underlying assumptions in their models for electronically representing provenance. Approaches deriving from the library and archiving communities emphasise agreed vocabulary by which resources can be described and, in particular, assert their attribution (who created the resource, who modified it, where it was stored, etc.). The primary purpose here is to provide intuitive metadata by which users can search for and index resources. In comparison, models for representing the results of scientific workflows have been developed with the assumption that each event or piece of intermediary data in a process' execution can and should be documented, to give a full account of the experiment undertaken. These occurrences are connected together by stating where one derived from, triggered, or otherwise caused another, and so form a causal graph. Mapping between the two approaches would be beneficial in integrating systems and exploiting the strengths of each. In this paper, we specify such a mapping between Dublin Core and the Open Provenance Model. We further explain the technical issues to overcome and the rationale behind the approach, to allow the same method to apply in mapping similar schemes.
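A mapping of this kind can be represented, at its simplest, as a lookup from Dublin Core terms to Open Provenance Model graph patterns. The fragment below is purely illustrative and is not the mapping specified in the paper; the edge names (wasGeneratedBy, wasControlledBy, wasDerivedFrom) are the standard OPM causal vocabulary, and the dcterms names are the standard DCMI terms:

```python
# Illustrative fragment: each Dublin Core term becomes an OPM pattern relating
# the described resource (an OPM artifact) to agents and processes.
DC_TO_OPM = {
    "dcterms:creator":     "process wasControlledBy agent; artifact wasGeneratedBy process",
    "dcterms:contributor": "process wasControlledBy agent; artifact wasGeneratedBy process",
    "dcterms:source":      "artifact wasDerivedFrom source-artifact",
    "dcterms:created":     "time annotation on the wasGeneratedBy edge",
}

def opm_pattern(dc_term):
    """Return the OPM pattern for a Dublin Core term, if one is defined."""
    return DC_TO_OPM.get(dc_term, "no direct OPM counterpart")
```

The interesting design point, as the abstract notes, is that attribution metadata (a flat property of a resource) must be expanded into a small causal subgraph, not translated term-for-term.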


The reliable evaluation of flood forecasts is a crucial problem for assessing flood risk and the consequent damages. Different hydrological models (distributed, semi-distributed or lumped) have been proposed to deal with this issue. The choice of the proper model structure has been investigated by many authors and is one of the main sources of uncertainty in a correct evaluation of the outflow hydrograph. In addition, the recent increase in data availability makes it possible to update hydrological models in response to real-time observations. For these reasons, the aim of this work is to evaluate the effect of different structures of a semi-distributed hydrological model on the assimilation of distributed, uncertain discharge observations. The study was applied to the Bacchiglione catchment, located in Italy. The first methodological step was to divide the basin into sub-basins according to topographic characteristics. Secondly, two different structures of the semi-distributed hydrological model were implemented to estimate the outflow hydrograph. Then, synthetic observations of uncertain discharge were generated as a function of the observed and simulated flow at the basin outlet and assimilated into the semi-distributed models using a Kalman filter. Finally, different spatial patterns of sensor locations were assumed to update the model state in response to the uncertain discharge observations. The results of this work point out that, overall, the assimilation of uncertain observations can improve hydrologic model performance. In particular, it was found that the model structure is an important factor, difficult to characterize, since it can induce different forecasts in terms of outflow discharge. This study is partly supported by the FP7 EU Project WeSenseIt.
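The assimilation step rests on the standard Kalman filter update: the model state is corrected toward an uncertain observation in proportion to their relative variances. A minimal scalar sketch (the study's actual state vector and observation operators are more elaborate; the numbers here are invented):

```python
def kalman_update(x_prior, P_prior, z, R, H=1.0):
    """Scalar Kalman filter update of a model state (e.g. simulated discharge)
    with one uncertain observation z of variance R."""
    K = P_prior * H / (H * P_prior * H + R)   # Kalman gain in [0, 1]
    x_post = x_prior + K * (z - H * x_prior)  # state pulled toward observation
    P_post = (1.0 - K * H) * P_prior          # uncertainty always shrinks
    return x_post, P_post

# Equal prior and observation variance -> posterior exactly halfway:
x, P = kalman_update(100.0, 25.0, 110.0, 25.0)  # x = 105.0, P = 12.5
```

The gain formula makes the abstract's finding intuitive: the noisier the synthetic observation (larger R), the smaller K, and the less the model state is corrected.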


Renewable energy production is a basic supplement for stabilizing rapidly increasing global energy demand and skyrocketing energy prices, as well as for balancing fluctuations in supply from non-renewable energy sources at electrical grid hubs. European energy traders, government and private energy providers, and other stakeholders have recently become major beneficiaries, customers and clients of hydropower simulation solutions. The relationship between rainfall-runoff model outputs and the energy production of hydropower plants has not been clearly studied. In this research, the association of rainfall, catchment characteristics, river network and runoff with the energy production of a particular hydropower station is examined. The essence of this study is to establish the correspondence between runoff extracted from a calibrated catchment and the energy production of a hydropower plant located at the catchment outlet; to employ a technique for converting runoff to energy based on statistical and graphical trend analysis of the two; and to provide an environment for energy forecasting. For rainfall-runoff model setup and calibration, the MIKE 11 NAM model is applied, while the MIKE 11 SO model is used to track, adopt and set a control strategy at the hydropower location for the runoff-energy correlation. The model is tested at two selected micro run-of-river hydropower plants located in southern Germany. Two consecutive calibrations are performed to test the model: one for the rainfall-runoff model and the other for the energy simulation. Calibration results and supporting verification plots for the two case studies indicated that simulated discharge and energy production are comparable with the measured discharge and energy production, respectively.
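Independently of the statistical runoff-energy relation developed in the study, the usual physical starting point for converting discharge to power is P = rho * g * Q * H * eta. A small sketch; the head and efficiency values are assumptions, not the plants' actual parameters:

```python
RHO, G = 1000.0, 9.81  # water density (kg/m^3), gravitational acceleration (m/s^2)

def hydro_power_kw(q_m3s, head_m, efficiency=0.85):
    """Instantaneous hydropower output in kW for discharge q (m^3/s),
    net head (m) and an overall turbine-generator efficiency."""
    return RHO * G * q_m3s * head_m * efficiency / 1000.0

def energy_mwh(q_series_m3s, head_m, dt_hours=1.0, efficiency=0.85):
    """Integrate power over a discharge time series to energy in MWh."""
    return sum(hydro_power_kw(q, head_m, efficiency)
               for q in q_series_m3s) * dt_hours / 1000.0
```

For a run-of-river plant the head is roughly constant, so simulated energy tracks the runoff hydrograph almost linearly, which is why a calibrated NAM discharge series can be compared directly with measured production.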


Model Predictive Control (MPC) is a control method that solves, in real time, an optimal control problem over a finite horizon. The finiteness of the horizon is both the reason for MPC's success and its main limitation. In operational water resources management, MPC has in fact been successfully employed for controlling systems with a relatively short memory, such as canals, where the horizon length is not an issue. For reservoirs, which generally have a longer memory, MPC applications are presently limited to short-term management only. Short-term reservoir management can effectively deal with fast processes, such as floods, but it is not capable of looking sufficiently far ahead to handle long-term issues, such as drought. To overcome this limitation, we propose an Infinite Horizon MPC (IH-MPC) solution that is particularly suitable for reservoir management. We propose to structure the input signal using orthogonal basis functions, thereby reducing the optimization argument to a finite number of variables and making the control problem solvable in a reasonable time. We applied this solution to the management of the Manantali Reservoir. Manantali is a reservoir with yearly regulation located in Mali, on the Senegal River, affecting the water systems of Mali, Senegal and Mauritania. The long-term horizon offered by IH-MPC is necessary to deal with the strongly seasonal climate of the region.
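The basis-function idea can be illustrated with an orthogonal cosine (DCT-style) basis: a year-long release trajectory is described by a handful of coefficients, and only those coefficients are optimized. The basis choice and sizes below are illustrative assumptions, not necessarily those used for Manantali:

```python
import numpy as np

def cosine_basis(horizon, n_terms):
    """Orthogonal cosine basis on a long horizon; column k is
    cos(pi * k * (t + 0.5) / horizon), the DCT-II basis function."""
    t = np.arange(horizon)
    return np.column_stack([np.cos(np.pi * k * (t + 0.5) / horizon)
                            for k in range(n_terms)])

horizon, n_terms = 365, 5                        # a year described by 5 numbers
B = cosine_basis(horizon, n_terms)
coeffs = np.array([1.0, 0.3, 0.0, 0.1, 0.0])     # the actual decision variables
u = B @ coeffs                                   # full 365-step release trajectory
```

The optimizer thus searches a 5-dimensional space instead of a 365-dimensional one, which is what makes a long (effectively infinite) horizon tractable in real time.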


Digital elevation models (DEMs) play a substantial role in hydrological studies, from understanding catchment characteristics and setting up a hydrological model to mapping flood risk in a region. Depending on the nature of the study and its objectives, a high-resolution and reliable DEM is often desired to set up a sound hydrological model. However, such a good DEM is not always available, and it is generally high-priced. Obtained through radar-based remote sensing, the Shuttle Radar Topography Mission (SRTM) dataset is a publicly available DEM with a resolution of 92 m outside the US. It is a great source of elevation data where no surveyed DEM is available. However, apart from its coarse resolution, SRTM suffers from inaccuracy, especially in areas with dense vegetation cover, because radar signals do not fully penetrate the canopy. This can lead to improper model setup as well as erroneous flood risk mapping. This paper attempts to improve the SRTM dataset using the Normalised Difference Vegetation Index (NDVI), derived from the visible red and near-infrared bands of Landsat imagery at 30 m resolution, together with Artificial Neural Networks (ANNs). The assessment of the improvement and the applicability of this method in hydrology are highlighted and discussed.
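NDVI itself is a fixed formula on the red and near-infrared reflectances; the ANN correction trained on it is not reproduced here. A minimal sketch:

```python
import numpy as np

def ndvi(red, nir):
    """Normalised Difference Vegetation Index from red and near-infrared
    reflectances: (NIR - Red) / (NIR + Red), in [-1, 1]. Dense vegetation
    gives high values, which flags where SRTM likely reads canopy height."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + 1e-12)  # small epsilon avoids 0/0
```

The workflow implied by the abstract is then: compute NDVI per 30 m Landsat pixel, and let an ANN learn the relation between NDVI (plus the raw SRTM height) and the vegetation-induced elevation bias.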


Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open space is limited. Another solution is more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, these rules achieve only suboptimal results. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin, while the GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation in terms of flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm reduces the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
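The MPC-GA loop can be caricatured in a few lines: rank candidate gate settings by a water-level cost, keep the best (elitism), and mutate parents semi-randomly. The toy river model and every parameter value below are invented for illustration and stand in for the conceptual model of the Demer basin:

```python
import random
random.seed(7)

def flood_cost(gates, simulate):
    """Damage proxy: total predicted water level above a 1.0 m flood threshold."""
    return sum(max(0.0, h - 1.0) for h in simulate(gates))

def ga_generation(population, simulate, elite=2):
    """One semi-random GA step: rank by cost, keep the elite, mutate parents
    drawn from the better half, clipping gate openings to [0, 1]."""
    ranked = sorted(population, key=lambda g: flood_cost(g, simulate))
    children = [g[:] for g in ranked[:elite]]          # elitism: best survive
    while len(children) < len(population):
        parent = random.choice(ranked[:len(ranked) // 2])
        children.append([min(1.0, max(0.0, p + random.gauss(0.0, 0.1)))
                         for p in parent])
    return children

# Toy stand-in for the conceptual river model: opening gates lowers levels
def toy_model(gates):
    return [2.0 - sum(gates) / len(gates)] * 3

pop = [[random.random() for _ in range(4)] for _ in range(10)]
best0 = min(flood_cost(g, toy_model) for g in pop)
for _ in range(15):
    pop = ga_generation(pop, toy_model, elite=2)
best = min(flood_cost(g, toy_model) for g in pop)
```

Because the elite individuals are carried over unchanged, the best cost can never increase between generations, which is the property that makes such a semi-random search usable inside a real-time MPC loop.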


As the service life of a water supply network (WSN) grows, the problem of an aging pipe network becomes increasingly serious. Because an urban water supply network is a hidden underground asset, it is difficult for monitoring staff to classify pipe network faults directly by means of modern detection technology. In this paper, based on basic property data (e.g. diameter, material, pressure, distance to pump, distance to tank, load) of a water supply network, the C4.5 decision tree algorithm is used to classify the condition of water supply pipelines. Part of the historical data was used to build a decision tree classification model, and the remaining historical data was used to validate the established model. Statistical methods were used to assess the decision tree model, including basic statistics, Receiver Operating Characteristic (ROC) curves and Recall-Precision Curves (RPC). These methods were successfully used to assess the accuracy of the established classification model of the water pipe network. The purpose of the classification model is to classify the specific condition of the water pipe network, so that pipelines can be maintained according to the classification results: asset unserviceable (AU), near perfect condition (NPC) and serious deterioration (SD). Finally, this research focuses on pipe classification, which plays a significant role in maintaining water supply networks in the future.
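At the core of C4.5 is choosing splits by information gain over the class entropy (C4.5 proper normalizes this to a gain ratio). A minimal sketch of the split criterion, with invented pipe records labelled by the condition classes named above:

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index, threshold):
    """Gain of splitting numeric attribute attr_index at threshold: parent
    entropy minus the size-weighted entropy of the two child nodes."""
    left = [l for r, l in zip(rows, labels) if r[attr_index] <= threshold]
    right = [l for r, l in zip(rows, labels) if r[attr_index] > threshold]
    if not left or not right:
        return 0.0
    n = len(labels)
    remainder = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(labels) - remainder

# Hypothetical pipe records: (diameter mm, age yr) with condition labels
pipes = [(100, 40), (150, 35), (200, 10), (300, 5)]
labels = ["SD", "SD", "NPC", "NPC"]
gain = information_gain(pipes, labels, 1, 20)  # split on age at 20 years
```

Here splitting on age at 20 years separates the classes perfectly, so the gain equals the full parent entropy of 1 bit; a real tree repeats this search over all attributes and thresholds at every node.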


We discuss the development and performance of a low-power sensor node (hardware, software and algorithms) that autonomously controls the sampling interval of a suite of sensors based on local state estimates and future predictions of water flow. The problem is motivated by the need to accurately reconstruct abrupt state changes in urban watersheds and stormwater systems. Presently, the detection of these events is limited by the temporal resolution of sensor data. It is often infeasible, however, to increase measurement frequency due to energy and sampling constraints. This is particularly true for real-time water quality measurements, where sampling frequency is limited by reagent availability, sensor power consumption, and, in the case of automated samplers, the number of available sample containers. These constraints pose a significant barrier to the ubiquitous and cost effective instrumentation of large hydraulic and hydrologic systems. Each of our sensor nodes is equipped with a low-power microcontroller and a wireless module to take advantage of urban cellular coverage. The node persistently updates a local, embedded model of flow conditions while IP-connectivity permits each node to continually query public weather servers for hourly precipitation forecasts. The sampling frequency is then adjusted to increase the likelihood of capturing abrupt changes in a sensor signal, such as the rise in the hydrograph – an event that is often difficult to capture through traditional sampling techniques. Our architecture forms an embedded processing chain, leveraging local computational resources to assess uncertainty by analyzing data as it is collected. A network is presently being deployed in an urban watershed in Michigan and initial results indicate that the system accurately reconstructs signals of interest while significantly reducing energy consumption and the use of sampling resources. 
We also expand our analysis by discussing the role of this approach for the efficient real-time measurement of stormwater systems.
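The adaptive-sampling policy described above amounts to shortening the sampling interval when rain is forecast or the hydrograph is rising, and backing off to a base interval to conserve power and reagents. A deliberately simplified sketch of such a controller; the thresholds and intervals are invented, not the deployed node's parameters:

```python
def next_interval(forecast_rain_mm, flow_trend, base_s=900, min_s=60):
    """Choose the next sampling interval in seconds from an hourly rain
    forecast (mm) and the local flow trend estimate (m^3/s per minute)."""
    interval = base_s
    if forecast_rain_mm > 0.0:
        interval = max(min_s, base_s // 4)   # storm expected: sample 4x faster
    if flow_trend > 0.05:
        interval = max(min_s, interval // 4)  # hydrograph rising: faster still
    return interval
```

Keeping the decision this cheap is the point: it can run on the node's low-power microcontroller between cellular queries to the weather server, so the energy cost of adaptivity is negligible next to the sampling it saves.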