890 results for continuous model theory


Relevance:

30.00%

Publisher:

Abstract:

This paper extends the build-operate-transfer (BOT) concession model (BOTCcM) into a new method for identifying a concession period by using bargaining-game theory. The concession period is one of the most important decision variables in arranging a BOT-type contract, yet few methodologies are available to help determine its value. The BOTCcM offers an alternative method by which a group of concession-period solutions is produced. Nevertheless, a typical weakness of BOTCcM is that the model cannot recommend a specific concession time span. This paper introduces a new method, the BOT bargaining concession model (BOTBaC), to enable the identification of a specific concession period; it takes into account the bargaining behaviour of the two parties engaged in a BOT contract, namely the investor and the government. The application of BOTBaC is demonstrated using an example case.
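The abstract does not spell out BOTBaC's bargaining mechanics, but the flavour of a bargained concession period can be sketched with a classical Rubinstein alternating-offers split over a feasible period interval. All names, parameter values and the interpolation rule below are illustrative assumptions, not the paper's actual method:

```python
# Hypothetical sketch: pick a concession period inside a feasible interval
# [t_min, t_max] via the Rubinstein alternating-offers solution. delta_g and
# delta_i are assumed discount factors for the government and the investor.

def bargained_concession_period(t_min, t_max, delta_g, delta_i):
    # Rubinstein solution: the first mover (here, the investor) obtains a
    # share (1 - delta_g) / (1 - delta_i * delta_g) of the contested surplus.
    share_investor = (1 - delta_g) / (1 - delta_i * delta_g)
    # A longer period favours the investor, so interpolate upward from t_min.
    return t_min + share_investor * (t_max - t_min)

period = bargained_concession_period(10.0, 20.0, delta_g=0.9, delta_i=0.95)
print(round(period, 2))  # -> 16.9 (years, in this toy example)
```

A more patient investor (larger delta_i) pushes the agreed period toward t_max, which matches the intuition that bargaining power shapes the concession length.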

Relevance:

30.00%

Publisher:

Abstract:

This paper presents in detail a theoretical adaptive model of thermal comfort based on “Black Box” theory, taking into account adaptive factors such as culture, climate, and social, psychological and behavioural adaptation, which affect the senses used to detect thermal comfort. The model is called the Adaptive Predicted Mean Vote (aPMV) model. Applying the concept of cybernetics, the aPMV model explains the phenomenon, revealed by many researchers in field studies, that the Predicted Mean Vote (PMV) is greater than the Actual Mean Vote (AMV) in free-running buildings. An adaptive coefficient (λ) representing the adaptive factors that affect the sense of thermal comfort is proposed. The empirical coefficients for warm and cool conditions in the Chongqing area of China are derived by applying the least-squares method to monitored on-site environmental data and thermal comfort survey results.
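The aPMV relation is usually written as aPMV = PMV / (1 + λ·PMV); the abstract itself does not state the formula, so treat this form as an assumption. A minimal sketch:

```python
def adaptive_pmv(pmv, lam):
    """Adaptive Predicted Mean Vote: aPMV = PMV / (1 + lambda * PMV).

    lam is the adaptive coefficient. A positive lam in warm conditions pulls
    aPMV below PMV, matching the field observation that the actual vote is
    lower than PMV in free-running buildings. (Formula assumed, not quoted
    from this abstract.)
    """
    return pmv / (1.0 + lam * pmv)

# In a warm condition (PMV = 1.5) with an assumed lambda of 0.3:
print(round(adaptive_pmv(1.5, 0.3), 3))  # -> 1.034, i.e. below the PMV of 1.5
```

With λ = 0 the model reduces to the classical PMV, so λ cleanly encodes the adaptive correction.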

Relevance:

30.00%

Publisher:

Abstract:

Individuals with elevated levels of plasma low-density lipoprotein (LDL) cholesterol (LDL-C) are considered to be at risk of developing coronary heart disease. LDL particles are removed from the blood by a process known as receptor-mediated endocytosis, which occurs mainly in the liver. A series of classical experiments delineated the major steps in the endocytotic process: apolipoprotein B-100, present on LDL particles, binds to a specific receptor (the LDL receptor, LDL-R) in specialized areas of the cell surface called clathrin-coated pits. The pit containing the LDL-LDL-R complex is internalized, forming a cytoplasmic endosome. Fusion of the endosome with a lysosome leads to degradation of the LDL into its constituent parts (that is, cholesterol, fatty acids, and amino acids), which are released for reuse by the cell or are excreted. In this paper, we formulate a mathematical model of LDL endocytosis consisting of a system of ordinary differential equations. We validate our model against existing in vitro experimental data, and we use it to explore differences in system behavior when a single bolus of extracellular LDL is supplied to cells compared to when a continuous supply of LDL particles is available. Whereas the former situation is common in in vitro experimental systems, the latter better reflects the in vivo situation. We use asymptotic analysis and numerical simulations to study the long-time behavior of model solutions. The implications of model-derived insights for experimental design are discussed.
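The paper's actual equations are not given in the abstract, so the sketch below is a generic three-compartment caricature (extracellular, receptor-bound, internalised LDL) integrated with forward Euler; all rate constants and the bolus-versus-continuous-feed switch are illustrative assumptions:

```python
# Minimal compartment sketch (not the paper's equations): extracellular LDL
# binds free receptors, the complex is internalised, and the contents
# accumulate in the internalised pool. Parameter values are illustrative.

def simulate(l0, r_tot, k_on=1.0, k_int=0.5,
             dt=0.001, steps=10000, continuous_feed=0.0):
    L, B, I = l0, 0.0, 0.0           # extracellular, bound, internalised
    for _ in range(steps):
        free_r = r_tot - B           # receptors not currently occupied
        bind = k_on * L * free_r
        internalise = k_int * B
        L += dt * (continuous_feed - bind)
        B += dt * (bind - internalise)
        I += dt * internalise
    return L, B, I

# Single bolus (no feed): mass is conserved across the three compartments,
# and extracellular LDL is progressively depleted.
L, B, I = simulate(l0=1.0, r_tot=0.5)
print(round(L + B + I, 6))  # -> 1.0
```

Setting continuous_feed > 0 instead mimics the in vivo situation described in the abstract, where extracellular LDL settles toward a sustained level rather than being exhausted.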

Relevance:

30.00%

Publisher:

Abstract:

Truly continuous solid-state fermentations with operating times of 2-3 weeks were conducted in a prototype bioreactor for the production of fungal (Penicillium glabrum) tannase from a tannin-containing model substrate. Substantial quantities of the enzyme were synthesized throughout the operating periods, and (imperfect) steady-state conditions appeared to be reached soon after start-up of the fermentations. This demonstrated for the first time the possibility of conducting solid-state fermentations in continuous mode with a constant, non-inoculated feed. The operating variables and fermentation conditions in the bioreactor were predicted sufficiently well for the basic reinoculation concept to succeed. However, an incomplete understanding of the microbial mechanisms, the experimental system, and their interaction indicated the need for more research in this novel area of solid-state fermentation.

Relevance:

30.00%

Publisher:

Abstract:

In vitro fermentations were carried out using a model of the human colon to simulate the microbial activities of lower-gut bacteria. Bacterial populations (and their metabolic products) were evaluated under the effects of various fermentable substrates. The carbohydrates tested were polydextrose, lactitol, and fructo-oligosaccharide (FOS). Bacterial groups of interest were evaluated by fluorescence in situ hybridization as well as by species-specific PCR to determine bifidobacterial species, and by percent-G+C profiling of the bacterial communities present. Short-chain fatty acids (SCFA) produced during the fermentations were also evaluated. Polydextrose had a stimulatory effect upon colonic bifidobacteria at concentrations of 1 and 2% (using a single and a pooled human fecal inoculum, respectively). The bifidogenic effect was sustained throughout all three vessels of the in vitro system (P = 0.01 in vessel 3), as corroborated by the bacterial community profile revealed by %G+C analysis. This substrate supported a wide variety of bifidobacteria and was the only substrate on which Bifidobacterium infantis was detected. The fermentation of lactitol had a deleterious effect on both bifidobacterial and bacteroides populations (P = 0.01) and decreased total cell numbers. SCFA production was nevertheless stimulated, particularly butyrate (beneficial for host colonocytes). FOS also had a stimulatory effect upon bifidobacterial and lactobacilli populations when a single inoculum was used (P = 0.01 for all vessels), as well as a bifidogenic effect in vessels 2 and 3 (P = 0.01) when a pooled inoculum was used. A decrease in bifidobacteria throughout the model was reflected in the percent-G+C profiles.

Relevance:

30.00%

Publisher:

Abstract:

Aims: Certain milk factors may promote the growth of a gastrointestinal microflora predominated by bifidobacteria and may aid in overcoming enteric infections. This may explain why breast-fed infants experience fewer intestinal infections than their formula-fed counterparts. The effect of formula supplementation with two such factors was investigated in this study. Methods and Results: Infant faecal specimens were used to ferment formulae supplemented with glycomacropeptide (GMP) and alpha-lactalbumin (alpha-la) in a two-stage compound continuous culture model. At steady state, all fermenter vessels were inoculated with 5 ml of 0.1 M phosphate-buffered saline (pH 7.2) containing 10^8 CFU ml^-1 of either enteropathogenic Escherichia coli 2348/69 (O127:H6) or Salmonella serotype Typhimurium (DSMZ 5569). Bacteriology was determined by independent fluorescence in situ hybridization. Vessels that contained breast milk (BM), as well as those with alpha-la- and GMP-supplemented formula, had stable total counts of bifidobacteria, while lactobacilli increased significantly only in vessels with breast milk. Bacteroides, clostridia and E. coli decreased significantly in all three groups prior to pathogen addition. Escherichia coli counts decreased in vessels containing BM and alpha-la, while Salmonella decreased significantly in all vessels containing BM, alpha-la and GMP. Acetate was the predominant acid. Significance and Impact of the Study: Supplementation of infant formulae with appropriate milk proteins may be useful in mimicking the beneficial bacteriological effects of breast milk.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE. To investigate the nature of early ocular misalignments in human infants, to determine whether they can provide insight into the etiology of esotropia, and, in particular, to examine the correlates of misalignments. METHODS. A remote haploscopic photorefraction system was used to measure accommodation and vergence in 146 infants between 0 and 12 months of age. Infants underwent photorefraction immediately after watching a target moving between two of five viewing distances (25, 33, 50, 100, and 200 cm). In some instances, infants were tested in two conditions: both eyes open and one eye occluded. The resultant data were screened for instances of large misalignments and assessed to determine whether accommodative, retinal disparity, or other cues were associated with the occurrence of misalignments. RESULTS. The results showed no correlation between accommodative behavior and misalignments. Infants were more likely to show misalignments when retinal disparity cues were removed through occlusion. They were also more likely to show misalignments immediately after the target moved from a near to a far position than after far-to-near target movement. DISCUSSION. The data suggest that the prevalence of misalignments in infants of 2 to 3 months of age is decreased by the addition of retinal disparity cues to the stimulus. In addition, target movement away from the infant increases the prevalence of misalignments. These data are compatible with the notion that misalignments are caused by poor sensitivity to targets moving away from the infant, and they support the theory that some forms of strabismus could be related to a failure in a system sensitive to the direction of motion.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present an on-line estimation algorithm for an uncertain time delay in a continuous system, based on observational input-output data subject to observational noise. The first-order Pade approximation is used to approximate the time delay. At each time step, the algorithm combines the well-known Kalman filter algorithm and the recursive instrumental variable least squares (RIVLS) algorithm in cascade form. The instrumental variable least squares algorithm is used to achieve consistency of the delay parameter estimate, since an errors-in-variables model is involved. An illustrative example demonstrates the efficacy of the proposed approach.
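The first-order Pade approximation replaces the pure delay exp(-tau*s) with the rational function (1 - tau*s/2) / (1 + tau*s/2), which makes the delay parameter enter the model rationally so it can be estimated. A minimal sketch of just this approximation (the cascaded Kalman/RIVLS estimator itself is not reproduced here):

```python
# First-order Pade approximation of a pure delay:
#   exp(-tau*s) ~ (1 - tau*s/2) / (1 + tau*s/2)

def pade1_delay(tau):
    """Return (numerator, denominator) coefficients in descending powers of s."""
    return [-tau / 2.0, 1.0], [tau / 2.0, 1.0]

def evaluate(coeffs, s):
    # Horner evaluation of a polynomial given in descending powers of s.
    result = 0.0 + 0.0j
    for c in coeffs:
        result = result * s + c
    return result

num, den = pade1_delay(tau=2.0)
# At s = 0 the approximation equals exp(0) = 1 exactly.
print(abs(evaluate(num, 0) / evaluate(den, 0)))        # -> 1.0
# The approximation is all-pass: unit gain at every frequency s = j*omega,
# just like a true delay.
print(round(abs(evaluate(num, 1j) / evaluate(den, 1j)), 6))  # -> 1.0
```

The approximation only matches the delay's phase accurately at low frequencies, which is why it suits slowly varying delays estimated on-line.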

Relevance:

30.00%

Publisher:

Abstract:

The emergent requirements for effective e-learning call for a paradigm shift in instructional design. Constructivist theory and semiotics offer a sound underpinning for such revolutionary change through the concept of Learning Objects. E-learning guidelines adopted by industry have led successfully to the development of training materials. This paper identifies the inadequacies of those methods for Higher Education. Based on best practice in industry and our empirical research, we present an instructional design model with practical templates for constructivist learning.

Relevance:

30.00%

Publisher:

Abstract:

Large scientific applications are usually developed, tested and used by a group of geographically dispersed scientists. The problems associated with remote development and data sharing can be tackled by using collaborative working environments. Various tools and software exist to create collaborative working environments, and some currently available software frameworks use them to enable remote job submission and file transfer on top of existing grid infrastructures. However, for many large scientific applications, further effort is needed to prepare a framework that offers application-centric facilities. The Unified Air Pollution Model (UNI-DEM), developed by the Danish Environmental Research Institute, is an example of a large scientific application under continuous development and experimentation by different institutes in Europe. This paper designs a collaborative distributed computing environment for UNI-DEM in particular, but the proposed framework may fit many other large scientific applications as well.

Relevance:

30.00%

Publisher:

Abstract:

An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises the model's approximation ability, sparsity and robustness. The model parameters derived in each forward regression step are initially estimated via orthogonal least squares (OLS) and then tuned with a new gradient-descent learning algorithm, based on basis pursuit, that minimises the l(1) norm of the parameter estimate vector. The model subset selection cost function includes a D-optimality design criterion that maximises the determinant of the design matrix of the subset, to ensure model robustness and to enable the model selection procedure to terminate automatically at a sparse model. The proposed approach is based on the forward OLS algorithm using the modified Gram-Schmidt procedure. Both the parameter tuning procedure, based on basis pursuit, and the model selection criterion, based on D-optimality, are integrated with the forward regression, so the inherent computational efficiency of the conventional forward OLS approach is maintained. Examples demonstrate the effectiveness of the new approach.
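A bare-bones sketch of the forward OLS subset-selection core (error-reduction ranking plus modified Gram-Schmidt deflation); the D-optimality term and the basis-pursuit parameter tuning described in the abstract are omitted, and the data here are synthetic:

```python
import numpy as np

def forward_ols(X, y, n_select):
    """Greedy forward selection of n_select columns of X to explain y."""
    X = X.astype(float).copy()
    y = y.astype(float)
    selected = []
    for _ in range(n_select):
        # Error-reduction ratio of each remaining candidate column.
        norms = np.sum(X**2, axis=0)
        norms[norms == 0] = np.inf          # guard against exhausted columns
        err = (X.T @ y) ** 2 / norms
        err[selected] = -np.inf             # never reselect a chosen column
        k = int(np.argmax(err))
        selected.append(k)
        q = X[:, k] / np.linalg.norm(X[:, k])
        # Modified Gram-Schmidt: orthogonalise all columns against the winner.
        X -= np.outer(q, q @ X)
    return selected

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
y = 3 * A[:, 1] + 0.01 * rng.standard_normal(100)   # column 1 drives y
print(forward_ols(A, y, 1))  # -> [1]
```

The deflation step is what keeps each newly computed error-reduction ratio an honest measure of additional explanatory power, which is the property the full algorithm builds on.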

Relevance:

30.00%

Publisher:

Abstract:

A new identification algorithm is introduced for the Hammerstein model, which consists of a nonlinear static function followed by a linear dynamical model. The nonlinear static function is characterised using the Bezier-Bernstein approximation. The identification method is based on a hybrid scheme combining the inverse of de Casteljau's algorithm, the least squares algorithm, and the Gauss-Newton algorithm subject to constraints. Related work and the extension of the proposed algorithm to multi-input multi-output systems are discussed. Numerical examples, including systems with some hard nonlinearities, illustrate the efficacy of the proposed approach through comparisons with other approaches.
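The forward de Casteljau recursion that underlies the Bezier-Bernstein parameterisation can be sketched as follows; the paper applies its inverse inside the identification scheme, so this is only the building block:

```python
def de_casteljau(control_points, t):
    """Evaluate the Bezier curve with the given scalar control points at t in [0, 1]."""
    points = list(control_points)
    # Repeated linear interpolation between neighbours collapses the control
    # polygon down to a single point: the curve value at t.
    while len(points) > 1:
        points = [(1 - t) * a + t * b for a, b in zip(points, points[1:])]
    return points[0]

# A Bezier curve interpolates its end control points...
print(de_casteljau([0.0, 2.0, 1.0], 0.0), de_casteljau([0.0, 2.0, 1.0], 1.0))
# -> 0.0 1.0
# ...and at t = 0.5 the quadratic Bernstein form gives
# 0.25*0 + 0.5*2 + 0.25*1 = 1.25:
print(de_casteljau([0.0, 2.0, 1.0], 0.5))  # -> 1.25
```

Because the recursion uses only convex combinations, the evaluated nonlinearity always stays inside the convex hull of its control points, a useful property when fitting a static nonlinearity under constraints.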

Relevance:

30.00%

Publisher:

Abstract:

An automatic nonlinear predictive model-construction algorithm is introduced based on forward regression and the predicted-residual-sums-of-squares (PRESS) statistic. The proposed algorithm rests on the fundamental concept of evaluating a model's generalisation capability through cross-validation, achieved by using the PRESS statistic as a cost function to optimise model structure. In particular, the algorithm is developed with the aim of computational efficiency, such that the computational effort, usually extensive in the computation of the PRESS statistic, is reduced or minimised. The computation of PRESS is simplified by avoiding a matrix inversion through the orthogonalisation procedure inherent in forward regression, and is reduced further by the introduction of a forward-recursive formula. Owing to the properties of the PRESS statistic, the proposed algorithm achieves a fully automated procedure without resort to any separate validation data set for iterative model evaluation. Numerical examples demonstrate the efficacy of the algorithm.
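For ordinary least squares, the PRESS statistic can be computed without refitting via the identity PRESS = sum_i (e_i / (1 - h_ii))^2, where e is the residual vector and h_ii are the leverages (diagonal of the hat matrix). The paper's forward-recursive formula computes the same quantity more efficiently within forward regression; the sketch below only illustrates the identity on synthetic data:

```python
import numpy as np

def press_statistic(X, y):
    """Leave-one-out PRESS for linear least squares, without refitting."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    # Leverages via the hat matrix H = X (X^T X)^{-1} X^T.
    H = X @ np.linalg.solve(X.T @ X, X.T)
    h = np.diag(H)
    return float(np.sum((residuals / (1 - h)) ** 2))

def press_brute_force(X, y):
    """Explicit leave-one-out refits, for comparison."""
    total = 0.0
    n = len(y)
    for i in range(n):
        mask = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
        total += float(y[i] - X[i] @ beta) ** 2
    return total

rng = np.random.default_rng(1)
X = rng.standard_normal((30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(30)
print(np.isclose(press_statistic(X, y), press_brute_force(X, y)))  # -> True
```

The closed form turns n refits into one, which is the kind of saving the abstract's forward-recursive formula then pushes further inside the orthogonalised regression.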

Relevance:

30.00%

Publisher:

Abstract:

The ability of four operational weather forecast models [ECMWF, Action de Recherche Petite Echelle Grande Echelle model (ARPEGE), Regional Atmospheric Climate Model (RACMO), and Met Office] to generate a cloud at the right location and time (the cloud frequency of occurrence) is assessed in the present paper using a two-year time series of observations collected by profiling ground-based active remote sensors (cloud radar and lidar) located at three different sites in western Europe (Cabauw, Netherlands; Chilbolton, United Kingdom; and Palaiseau, France). Particular attention is given to potential biases that may arise from instrumentation differences (especially sensitivity) from one site to another and intermittent sampling. In a second step, the statistical properties of the cloud variables involved in most advanced cloud schemes of numerical weather forecast models (ice water content and cloud fraction) are characterized and compared with their counterparts in the models. The two years of observations are first considered as a whole in order to evaluate the accuracy of the statistical representation of the cloud variables in each model. It is shown that all models tend to produce too many high-level clouds, with too-high cloud fraction and ice water content. The midlevel and low-level cloud occurrence is also generally overestimated, with too-low cloud fraction but a correct ice water content. The dataset is then divided into seasons to evaluate the potential of the models to generate different cloud situations in response to different large-scale forcings. Strong variations in cloud occurrence are found in the observations from one season to the same season the following year as well as in the seasonal cycle. Overall, the model biases observed using the whole dataset are still found at seasonal scale, but the models generally manage to reproduce well the observed seasonal variations in cloud occurrence.
Overall, models do not generate the same cloud fraction distributions, and these distributions do not agree with the observations. Another general conclusion is that the use of continuous ground-based radar and lidar observations is a powerful tool for evaluating model cloud schemes and for a responsive assessment of the benefit achieved by changing or tuning a model cloud scheme.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – While Freeman's stakeholder management approach has attracted much attention from both scholars and practitioners, little empirical work has considered the interconnectedness of organisational and stakeholder perspectives. The purpose of this paper is to respond to this gap by developing and empirically testing a bi-directional model of organisation/stakeholder relationships. Design/methodology/approach – A conceptual framework is developed that integrates how stakeholders are affected by organisations with how they affect organisations. Quantitative data relating to both sides of the relationship were obtained from 700 customers of a European service organisation and analysed using the partial least squares structural equation modelling technique. Findings – The findings provide empirical support for the notion of mutual dependency between organisations and stakeholders advocated by stakeholder theorists. The results suggest that the way stakeholders relate to organisations depends on how organisations relate to stakeholders. Originality/value – The study is original on two fronts: first, it provides a framework and process that researchers can use to model bi-directional research with other stakeholder groups and in different contexts; second, it presents an example application of bi-directional research by empirically linking organisational and stakeholder expectations in the case of customers of a UK service organisation.