975 results for "conventional model"


Relevance:

30.00%

Publisher:

Abstract:

Cooperatives, as a type of firm, are considered by many scholars to be a remarkable alternative for overcoming the economic crisis that started in 2008. In addition, other scholars have pointed out the important role these firms play in regional economic development. Nevertheless, an examination of the economic literature on cooperatives shows that these firms are mainly studied from the point of view of their own characteristics and their particular features of participation and solidarity. Following a different analytical framework, this article proposes a theoretical model that explains the behavior of cooperatives on the basis of entrepreneurship theory, with the aim of increasing knowledge about these firms and, more specifically, their contribution to regional economic development.

This paper exposes the strengths and weaknesses of the recently proposed velocity-based local model (LM) network. The global dynamics of the velocity-based blended representation are directly related to the dynamics of the underlying local models, an important property in the design of local controller networks. Furthermore, the sub-models are continuous-time and linear, providing continuity with established linear theory and methods. This is not true for the conventional LM framework, where the global dynamics are only weakly related to the affine sub-models. In this paper, a velocity-based multiple model network is identified for a highly nonlinear dynamical system. The results show excellent dynamical modelling performance, highlighting the value of the velocity-based approach for the design and analysis of LM based control. Three important practical issues are also addressed. These relate to the blending of the velocity-based local models, the use of normalised Gaussian basis functions and the requirement of an input derivative.
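The basis-weighted blending described in this abstract can be sketched in a few lines; the scalar local models, centres and widths below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def gaussian(x, centre, width):
    # Unnormalised Gaussian basis function
    return np.exp(-0.5 * ((x - centre) / width) ** 2)

def blended_dynamics(x, u, centres, widths, a_locals, b_locals):
    """Global state derivative as a normalised-basis blend of local
    linear models x' = a*x + b*u (scalar case for clarity)."""
    weights = np.array([gaussian(x, c, w) for c, w in zip(centres, widths)])
    weights = weights / weights.sum()          # normalised Gaussian basis
    return sum(wi * (a * x + b * u)
               for wi, a, b in zip(weights, a_locals, b_locals))

# Two local models valid near x = -1 and x = +1 (illustrative values)
centres, widths = [-1.0, 1.0], [0.8, 0.8]
a_locals, b_locals = [-2.0, -0.5], [1.0, 1.5]
xdot = blended_dynamics(0.0, 1.0, centres, widths, a_locals, b_locals)
```

At a point midway between the two centres the basis weights are equal and the global derivative is the average of the two local predictions.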

Extending the work presented in Prasad et al. (IEEE Proceedings on Control Theory and Applications, 147, 523-37, 2000), this paper reports a hierarchical nonlinear physical model-based control strategy to account for the problems arising from the complex dynamics of the drum level and governor valve, and demonstrates its effectiveness in plant-wide disturbance handling. The strategy incorporates a two-level control structure consisting of lower-level conventional PI regulators and a higher-level nonlinear physical model predictive controller (NPMPC), mainly for set-point manoeuvring. The lower-level PI loops help stabilise the unstable drum-boiler dynamics and allow faster governor valve action for power and grid-frequency regulation. The higher-level NPMPC provides an optimal load demand (or set-point) transition by effective handling of plant-wide interactions and system disturbances. The strategy has been tested in a simulation of a 200-MW oil-fired power plant at Ballylumford in Northern Ireland. A novel approach is devised to test the disturbance rejection capability in severe operating conditions. Low-frequency disturbances were created by making random changes in radiation heat flow on the boiler side, while condenser vacuum was fluctuated randomly on the turbine side. To simulate high-frequency disturbances, pulse-type load disturbances were made to strike at instants that are not an integer multiple of the NPMPC sampling period. Impressive results have been obtained during both types of system disturbance and extremely high rates of load change, right across the operating range. These results compared favourably with those from a conventional state-space generalized predictive control (GPC) method designed under similar conditions.
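As a minimal, self-contained illustration of the lower-level PI regulation in such a two-level structure, the sketch below closes a discrete PI loop around a first-order plant. The gains and plant model are made-up assumptions for the example, not the Ballylumford plant dynamics:

```python
# Hypothetical sketch: a discrete PI regulator driving a first-order plant
# y' = -y + u towards its set-point, in the spirit of the lower-level PI
# loops described above. Gains and plant are illustrative assumptions.
kp, ki, dt = 2.0, 1.0, 0.1
setpoint, y, integral = 1.0, 0.0, 0.0

for _ in range(500):                   # 50 s of closed-loop simulation
    error = setpoint - y
    integral += error * dt             # integral action removes offset
    u = kp * error + ki * integral     # PI control law
    y += dt * (-y + u)                 # forward-Euler step of the plant
```

With these gains the loop is well damped and y settles at the set-point; in the structure described above, such loops would sit beneath the NPMPC, which manoeuvres their set-points.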

This paper presents a practical algorithm for the simulation of interactive deformation in a 3D polygonal mesh model. The algorithm combines the conventional simulation of deformation using a spring-mass-damper model, solved by explicit numerical integration, with a set of heuristics describing certain features of the transient behaviour, to increase the speed and stability of the solution. In particular, this algorithm was designed for the simulation of synthetic environments where it is necessary to model realistically, in real time, the behaviour of non-rigid surfaces being touched, pushed, pulled or squashed. Such objects can be solid or hollow, and have plastic, elastic or fabric-like properties. The algorithm is presented in an integrated form including collision detection and adaptive refinement, so that it may be used in a self-contained way as part of a simulation loop that includes human interface devices capturing data and rendering a realistic stereoscopic image in real time. The algorithm is designed to be used with polygonal mesh models representing complex topology, such as the human anatomy in a virtual-surgery training simulator. The paper evaluates the model behaviour qualitatively and then concludes with some examples of the use of the algorithm.
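The core update of a spring-mass-damper simulation with explicit integration can be sketched as follows; the single-element system and all parameter values are illustrative assumptions, not the paper's mesh model:

```python
# Hypothetical sketch: one spring-mass-damper element integrated with an
# explicit (semi-implicit Euler) scheme, the basic step a mesh deformation
# model of this kind repeats per mass point. Parameters are illustrative.

def step(x, v, k, c, m, rest, dt):
    """One integration step for displacement x and velocity v."""
    force = -k * (x - rest) - c * v    # spring restoring force + damping
    v = v + (force / m) * dt
    x = x + v * dt                     # update position with the new velocity
    return x, v

x, v = 1.0, 0.0                        # start stretched and at rest
for _ in range(1000):                  # 10 s at dt = 0.01
    x, v = step(x, v, k=50.0, c=2.0, m=1.0, rest=0.0, dt=0.01)
# The damped element settles back towards its rest length.
```

In a full mesh each vertex accumulates forces from all incident springs before the same update; the paper's heuristics then address the stability and speed limits of this explicit scheme.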

In polymer extrusion, delivery of a melt which is homogeneous in composition and temperature is important for good product quality. However, the process is inherently prone to temperature fluctuations which are difficult to monitor and control via conventional single-point thermocouples. In this work, the die melt temperature profile was monitored by a thermocouple mesh, and the data obtained were used to generate a model to predict the die melt temperature profile. A novel nonlinear model was then proposed which was demonstrated to be in good agreement with both training and unseen data. Furthermore, the proposed model was used to select optimum process settings to achieve the desired average melt temperature across the die while improving temperature homogeneity. The simulation results indicate a reduction in melt temperature variations of up to 60%.

We propose an exchange rate model that is a hybrid of the conventional specification with monetary fundamentals and the Evans–Lyons microstructure approach. We estimate a model augmented with order flow variables, using a unique data set: almost 100 monthly observations on interdealer order flow on dollar/euro and dollar/yen. The augmented macroeconomic, or "hybrid," model exhibits greater in-sample stability and out-of-sample forecasting improvement vis-à-vis the basic macroeconomic and random walk specifications.
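The augmentation step can be illustrated with a simple least-squares regression on synthetic data; the variables, coefficients and seed below are made-up assumptions, not the paper's data set:

```python
import numpy as np

# Hypothetical sketch: regress exchange-rate changes on a fundamentals term
# plus an order-flow term, as in a hybrid specification. Data are synthetic.
rng = np.random.default_rng(0)
n = 100                                    # ~100 monthly observations
fundamentals = rng.normal(size=n)          # e.g. relative money/output terms
order_flow = rng.normal(size=n)            # interdealer order flow
d_log_rate = 0.3 * fundamentals + 0.5 * order_flow + 0.1 * rng.normal(size=n)

X = np.column_stack([np.ones(n), fundamentals, order_flow])
beta, *_ = np.linalg.lstsq(X, d_log_rate, rcond=None)
# beta[1], beta[2] estimate the fundamentals and order-flow coefficients
```

On this synthetic sample the estimated coefficients recover the values used to generate the data, which is the in-sample half of the comparison; out-of-sample forecasting would refit on a rolling window.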

In studies of radiation-induced DNA fragmentation and repair, analytical models may provide rapid and easy-to-use methods to test simple hypotheses regarding the breakage and rejoining mechanisms involved. The random breakage model, according to which lesions are distributed uniformly and independently of each other along the DNA, has been the model most used to describe spatial distribution of radiation-induced DNA damage. Recently several mechanistic approaches have been proposed that model clustered damage to DNA. In general, such approaches focus on the study of initial radiation-induced DNA damage and repair, without considering the effects of additional (unwanted and unavoidable) fragmentation that may take place during the experimental procedures. While most approaches, including measurement of total DNA mass below a specified value, allow for the occurrence of background experimental damage by means of simple subtractive procedures, a more detailed analysis of DNA fragmentation necessitates a more accurate treatment. We have developed a new, relatively simple model of DNA breakage and the resulting rejoining kinetics of broken fragments. Initial radiation-induced DNA damage is simulated using a clustered breakage approach, with three free parameters: the number of independently located clusters, each containing several DNA double-strand breaks (DSBs), the average number of DSBs within a cluster (multiplicity of the cluster), and the maximum allowed radius within which DSBs belonging to the same cluster are distributed. Random breakage is simulated as a special case of the DSB clustering procedure. When the model is applied to the analysis of DNA fragmentation as measured with pulsed-field gel electrophoresis (PFGE), the hypothesis that DSBs in proximity rejoin at a different rate from that of sparse isolated breaks can be tested, since the kinetics of rejoining of fragments of varying size may be followed by means of computer simulations. 
The problem of how to account for background damage from experimental handling is also carefully considered. We have shown that the conventional procedure of subtracting the background damage from the experimental data may lead to erroneous conclusions during the analysis of both initial fragmentation and DSB rejoining. Despite its relative simplicity, the method presented allows both the quantitative and qualitative description of radiation-induced DNA fragmentation and subsequent rejoining of double-stranded DNA fragments. (C) 2004 by Radiation Research Society.
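The clustered-breakage simulation described above can be sketched in a few lines; the genome length, cluster count, multiplicity and radius below are illustrative assumptions, not the paper's fitted parameters:

```python
import numpy as np

# Hypothetical sketch of the three-parameter clustered breakage idea:
# cluster centres placed uniformly along the DNA, each holding several DSBs
# within a maximum radius. Random breakage is the multiplicity-1 special case.
rng = np.random.default_rng(1)

def clustered_dsbs(genome_length, n_clusters, multiplicity, radius):
    centres = rng.uniform(0, genome_length, n_clusters)
    offsets = rng.uniform(-radius, radius, (n_clusters, multiplicity))
    breaks = np.clip(centres[:, None] + offsets, 0, genome_length)
    return np.sort(breaks.ravel())

random_breaks = clustered_dsbs(1e6, 30, 1, 0.0)      # random breakage
clustered_breaks = clustered_dsbs(1e6, 10, 3, 5e3)   # clustered breakage
fragments = np.diff(clustered_breaks)                # fragment size spectrum
```

Differencing the sorted break positions yields the fragment-size spectrum that would be compared against PFGE measurements; rejoining kinetics would then remove break pairs over time.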

1. A more general contingency model of optimal diet choice is developed, allowing for simultaneous searching and handling, which extends the theory to include grazing and browsing by large herbivores.

2. Foraging resolves into three modes: purely encounter-limited, purely handling-limited and mixed-process, in which either a handling-limited prey type is added to an encounter-limited diet, or the diet becomes handling-limited as it expands.

3. The purely encounter-limited diet is, in general, broader than that predicted by the conventional contingency model.

4. As the degree of simultaneity of searching and handling increases, the optimal diet expands to the point where it is handling-limited, at which point all inferior prey types are rejected.

5. Inclusion of a less profitable prey species is not necessarily independent of its encounter rate, and the zero-one rule does not necessarily hold: some of the less profitable prey may be included in the optimal diet. This gives an optimal foraging explanation for herbivores' mixed diets.

6. Rules are given for calculating the boundary between encounter-limited and handling-limited diets and for predicting the proportion of inferior prey to be included in a two-species diet.

7. The digestive rate model is modified to include simultaneous searching and handling, showing that the more they overlap, the more the predicted diet breadth is likely to be reduced.
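For contrast, the conventional contingency model that this abstract generalises can be sketched as follows; the encounter rates, energy values and handling times are illustrative assumptions:

```python
# Hypothetical sketch of the conventional (sequential search-and-handle)
# contingency model: a prey type is added to the diet only if its
# profitability e/h exceeds the intake rate achievable without it
# (the zero-one rule that the generalised model relaxes). Values illustrative.

def intake_rate(diet):
    """Long-term intake rate: sum(l*e) / (1 + sum(l*h))."""
    gain = sum(l * e for l, e, h in diet)
    time = 1.0 + sum(l * h for l, e, h in diet)
    return gain / time

# (encounter rate l, energy gain e, handling time h), ranked by e/h
prey_types = [(0.2, 10.0, 1.0), (0.5, 3.0, 2.0)]

diet = []
for l, e, h in prey_types:
    if e / h > intake_rate(diet):      # zero-one inclusion rule
        diet.append((l, e, h))
# Here the second type (e/h = 1.5) is rejected: the diet of the first type
# alone already yields 2/1.2, about 1.67.
```

Under the conventional rule, inclusion of the inferior type never depends on its own encounter rate; points 5 and 6 above describe how simultaneous searching and handling breaks exactly this property.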

Polypropylene sheets have been stretched at 160 °C to a state of large biaxial strain of extension ratio 3, and the stresses then allowed to relax at constant strain. The state of strain is reached via a path consisting of two sequential planar extensions, the second perpendicular to the first, under plane stress conditions with zero stress acting normal to the sheet. This strain path is highly relevant to solid phase deformation processes such as stretch blow moulding and thermoforming, and also reveals fundamental aspects of the flow rule required in the constitutive behaviour of the material. The rate of decay of stress is rapid, and such as to be highly significant in the modelling of processes that include stages of constant strain. A constitutive equation is developed that includes Eyring processes to model both the stress relaxation and strain rate dependence of the stress. The axial and transverse stresses observed during loading show that the use of a conventional Levy-Mises flow rule is ineffective, and instead a flow rule is used that takes account of the anisotropic state of the material via a power law function of the principal extension ratios. Finally the constitutive model is demonstrated to give quantitatively useful representation of the stresses both in loading and in stress relaxation.
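The role of an Eyring process in stress relaxation can be sketched with a single-element model; the modulus, activation stress and rate constant are illustrative assumptions, not fitted values for polypropylene:

```python
import math

# Hypothetical sketch: one Eyring process relaxing at constant total strain.
# The elastic stress drives a plastic strain rate ~ sinh(stress / sigma0),
# so the stress decays roughly linearly in log(time). Values illustrative.
E = 1000.0        # elastic modulus, MPa
sigma0 = 5.0      # Eyring activation stress, MPa
rate0 = 1e-6      # pre-exponential strain rate, 1/s
sigma = 40.0      # stress at the end of loading, MPa
dt = 0.01

for _ in range(100_000):               # 1000 s of relaxation
    plastic_rate = rate0 * math.sinh(sigma / sigma0)
    sigma -= E * plastic_rate * dt     # relaxation at fixed total strain
# sigma has decayed substantially from its initial 40 MPa
```

The strong (sinh) stress dependence gives the rapid initial decay noted in the abstract; the paper's full constitutive model couples such processes with an anisotropic flow rule, which this single-element sketch does not attempt.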

Purpose: To determine differences in overall tumor responses measured by volumetric assessment and bioluminescence imaging (BLI) following exposure to uniform and non-uniform radiation fields in an ectopic prostate tumor model.

Materials and methods: Bioluminescent human prostate tumor xenografts were established by subcutaneous implantation into male mice. Tumors were irradiated with uniform or non-uniform field configurations using conventional in vivo irradiation procedures performed with a 225 kVp generator and custom lead shielding. Tumor responses were measured using Vernier calipers and by BLI using an in vivo imaging system. Survival was defined as the time to quadrupling of the pre-treatment tumor volume. 

Results: The correlation between BLI and tumor volume measurements was found to be different for un-irradiated (R = 0.61), uniformly irradiated (R = 0.34) and partially irradiated (R = 0.30) tumors. Uniformly irradiated tumors resulted in an average tumor growth delay of 60 days with median survival of 75 days, compared to partially irradiated tumors which showed an average growth delay of 24 days and median survival of 38 days. 

Conclusions: The correlation between BLI and tumor volume measurements is lower for partially irradiated tumors than for those exposed to uniform dose distributions. The response of partially irradiated tumors suggests non-uniformity in response beyond the physical dose distribution within the target volume. The dosimetric uncertainty associated with conventional in vivo irradiation procedures limits their ability to accurately determine tumor response to non-uniform radiation fields and underlines the need for image-guided small-animal radiation research platforms.

As an alternative to externally bonded FRP reinforcement, near-surface mounted (NSM) FRP reinforcement can be used to effectively improve the flexural performance of RC beams. In such FRP-strengthened RC beams, end cover separation is one of the common failure modes. This failure mode involves the detachment of the NSM FRP reinforcement together with the concrete cover along the level of the tension steel reinforcement. This paper presents a new strength model for end cover separation failure in RC beams strengthened in flexure with NSM FRP strips (i.e. rectangular FRP bars with a sectional height-to-thickness ratio not less than 5), which was formulated on the basis of extensive numerical results from a parametric study undertaken using an efficient finite element approach. The proposed strength model consists of an approximate equation for the debonding strain of the FRP reinforcement at the critical cracked section and a conventional section analysis to relate this debonding strain to the moment acting on the same section (i.e. the debonding moment). Once the debonding strain is known, the load level at end cover separation of an FRP-strengthened RC beam can be easily determined for a given load distribution. Predictions from the proposed strength model are compared with those of two existing strength models of the same type and with available test results; the comparison shows that the proposed model is in close agreement with the test results and is far more accurate than the existing strength models.

Transport accounts for 22% of greenhouse gas emissions in the United Kingdom, and car numbers are expected to more than double by 2050. Car manufacturers are continually aiming for a substantially reduced carbon footprint through improved fuel efficiency and better powertrain performance, driven by the strict European Union emissions standards. However, road tax, not just fuel efficiency, is a key consideration of consumers when purchasing a car. While measures have been taken to reduce emissions through stricter standards, in future, alternative technologies will be used. Electric vehicles, hybrid vehicles and range-extended electric vehicles have been identified as some of these future technologies. In this research, a virtual test bed of a conventional internal combustion engine car and a range-extended electric vehicle family saloon car were built in AVL's vehicle and powertrain system level simulation tool, CRUISE, to simulate the New European Drive Cycle, and the results were then soft-linked to a techno-economic model to compare the effectiveness of current support mechanisms over the full life cycle of both cars. The key finding indicates that although carbon emissions are substantially reduced, switching is still not financially the best option for either the consumer or the government in the long run.

Mycotoxins and heavy metals are ubiquitous in the environment and contaminate many foods. The widespread use of pesticides in crop production to control disease contributes further to the chemical contamination of foods. Multiple chemical contaminants thus threaten the safety of many food commodities; hence the present study used maize as a model crop to identify the severity, in terms of human exposure, when multiple contaminants are present. High Content Analysis (HCA) measuring multiple endpoints was used to determine the cytotoxicity of complex mixtures of mycotoxins, heavy metals and pesticides. Endpoints included nuclear intensity (NI), nuclear area (NA), plasma membrane permeability (PMP), mitochondrial membrane potential (MMP) and mitochondrial mass (MM). At concentrations representing the legal limits of each individual contaminant in maize (3 ng/ml ochratoxin A (OTA), 1 μg/ml fumonisin B1 (FB1), 2 ng/ml aflatoxin B1 (AFB1), 100 ng/ml cadmium (Cd), 150 ng/ml arsenic (As), 50 ng/ml chlorpyrifos (CP) and 5 μg/ml pirimiphos methyl (PM)), the mixtures (tertiary mycotoxins plus Cd/As) and (tertiary mycotoxins plus Cd/As/CP/PM) were cytotoxic for the NA and MM endpoints, with differences of up to 13.6% (p ≤ 0.0001) and 12% (p ≤ 0.0001) respectively from control values. The most cytotoxic mixture was (tertiary mycotoxins plus Cd/As/CP/PM) across all four endpoints (NA, NI, MM and MMP), with increases of up to 61.3%, 23.0%, 61.4% and 36.3% (p ≤ 0.0001) respectively. Synergy was evident for two endpoints (NI and MM) at concentrations contaminating maize above legal limits, with differences between expected and measured values of 6.2-12.4% (p ≤ 0.05 to p ≤ 0.001) and 4.5-12.3% (p ≤ 0.05 to p ≤ 0.001) for NI and MM, respectively. The study introduces, for the first time, a holistic approach to identifying the impact in terms of toxicity to humans when multiple chemical contaminants are present in foodstuffs. Governmental regulatory bodies must begin to contemplate how to safeguard the population when such mixtures of contaminants are found in foods, and this study starts to address this critical issue.

Conventional understandings of what the Westminster model implies anticipate reliance on a top-down, hierarchical approach to budgetary accountability, reinforced by a post–New Public Management emphasis on recentralizing administrative capacity. This article, based on a comparative analysis of the experiences of Britain and Ireland, argues that the Westminster model of bureaucratic control and oversight itself has been evolving, hastened in large part due to the global financial crisis. Governments have gained stronger controls over the structures and practices of agencies, but agencies are also key players in securing better governance outcomes. The implication is that the crisis has not seen a return to the archetypal command-and-control model, nor a wholly new implementation of negotiated European-type practices, but rather a new accountability balance between elements of the Westminster system itself that have not previously been well understood.

Statistical distributions have been extensively used in modeling fading effects in conventional and modern wireless communications. In the present work, we propose a novel κ–µ composite shadowed fading model, based on the valid assumption that the mean signal power follows the inverse gamma distribution instead of the lognormal or the commonly used gamma distribution. This distribution has a simple relationship with the gamma distribution but, most importantly, its semi-heavy-tailed characteristics make it suitable for applications relating to the modeling of shadowed fading. Furthermore, the derived probability density function of the κ–µ / inverse gamma composite distribution admits a rather simple algebraic representation that renders it convenient to handle both analytically and numerically. The validity and utility of this fading model are demonstrated by modeling the fading effects encountered in body-centric communications channels, which are known to be susceptible to shadowing. To this end, extensive comparisons are provided between theoretical results and real-time measurements. These comparisons exhibit accurate fitting of the new model for various measurement setups corresponding to realistic communication scenarios.
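The stated gamma/inverse-gamma relationship makes the composite model easy to sample by Monte Carlo; the shape, scale and κ, µ values below are illustrative assumptions, not the paper's measured fits:

```python
import numpy as np

# Hypothetical sketch: sample the composite model by drawing the mean signal
# power from an inverse gamma distribution (as the reciprocal of a gamma
# variate) and the kappa-mu power as a scaled noncentral chi-square.
rng = np.random.default_rng(2)
n, kappa, mu = 200_000, 2.0, 1.5
alpha, beta = 3.0, 2.0                        # inverse gamma shape and scale

mean_power = beta / rng.gamma(alpha, 1.0, n)  # InvGamma(alpha, beta) samples
chi = rng.noncentral_chisquare(2 * mu, 2 * mu * kappa, n)
power = mean_power * chi / (2 * mu * (1 + kappa))   # shadowed kappa-mu power

# E[power] should approach E[InvGamma] = beta / (alpha - 1) = 1 here
est = power.mean()
```

An empirical histogram of such samples is what would be fitted against the closed-form composite density when validating the model against channel measurements.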