970 results for "optimal solution"


Relevance: 30.00%

Abstract:

The relation between intercepted light and orchard productivity has long been considered linear, although this dependence appears to be governed more by the planting system than by light intensity. At the whole-plant level, an increase in irradiance does not always translate into higher productivity. One reason is the plant's intrinsic inefficiency in using energy: in full light, generally only 5-10% of the total incoming energy is allocated to net photosynthesis. Preserving or improving this efficiency is therefore pivotal for scientists and fruit growers. Even though a considerable amount of energy is reflected or transmitted, plants cannot avoid absorbing photons in excess. Over-excitation of chlorophyll promotes the production of reactive species, increasing the risk of photoinhibition. The harmful consequences of photoinhibition have driven plants to evolve a complex, multilevel machinery able to dissipate excess energy by quenching it as heat (non-photochemical quenching, NPQ), by moving electrons (the water-water cycle, cyclic transport around PSI, the glutathione-ascorbate cycle and photorespiration) and by scavenging the reactive species generated. The price plants pay for this equipment is the consumption of CO2 and reducing power, with a consequent decrease in photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light up to the saturation point; additional PPFD does not improve carboxylation but increases the contribution of the alternative energy-dissipation pathways, as well as ROS production and the risk of photoinhibition. Even this extensive photoprotective apparatus, however, is not always able to cope with the excess incoming energy, and photodamage therefore occurs. Any event that increases photon pressure and/or decreases the efficiency of these photoprotective mechanisms (e.g. thermal stress, water or nutritional deficiency) can exacerbate photoinhibition. In nature, only a small fraction of photosystems is typically found damaged, thanks to an effective, efficient but energy-consuming recovery system. Since damaged PSII is quickly repaired at an energetic cost, it would be interesting to investigate how much PSII recovery costs in terms of plant productivity. This PhD dissertation aims to improve knowledge of the several strategies plants adopt to manage incoming energy, and of the implications of excess light for photodamage in peach. The thesis is organized into three scientific units. In the first section, a new rapid, non-intrusive, whole-tissue and universal technique for the determination of functional PSII was implemented and validated on different kinds of plants: C3 and C4 species, woody and herbaceous plants, wild type and chlorophyll b-less mutants, and monocots and dicots. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between light environment and photosynthetic performance, water use and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of variations in photon pressure on energy management was considered at the single-leaf level. In the third section, the quenching analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach and then applied in the field, where the influence of moderate light and water reduction on peach photosynthetic performance, water requirements, energy management and photoinhibition was studied.
Using solar energy as the fuel for life is intrinsically risky for plants because of the constant, high risk of photodamage. This dissertation seeks to highlight the complex relation between plants, in particular peach, and light by analysing the principal strategies plants have developed to manage incoming light so as to derive the maximum possible benefit while minimizing the risks. First, the new method proposed for functional PSII determination, based on P700 redox kinetics, appears to be a valid, non-intrusive, universal and field-applicable technique, not least because it probes the whole depth of the leaf tissue rather than only the uppermost leaf layers, as fluorescence does. The fluorescence parameter Fv/Fm gives a good estimate of functional PSII, but only when data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition to this method, the energy quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model of von Caemmerer (2000), is a powerful tool for analysing, even in the field, the relation between the plant and environmental factors such as water, temperature and, above all, light. The "Asymmetric" training system is a good way to study the relations among light energy, photosynthetic performance and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Excess light, rather than improving photosynthesis, may exacerbate water and thermal stress, leading to stomatal limitation. Furthermore, too much light does not improve net carboxylation but promotes PSII damage: in the most light-exposed plants about 50-60% of the total PSII is inactivated. At the single-leaf level, net carboxylation increases up to the saturation point (1000-1200 μmol m-2 s-1), and excess light is dissipated by non-photochemical quenching and by non-net-carboxylative transports. The latter follow a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At medium-low irradiance, NPQ appears to be limited by lumen pH because the incoming photon pressure is not sufficient to generate the optimal lumen pH for full activation of violaxanthin de-epoxidase (VDE). Peach leaves cope with excess light by increasing the non-net-carboxylative transports. As PPFD rises, the xanthophyll cycle becomes progressively more activated and the rate of non-net-carboxylative transport is reduced. Some of these alternative transports, such as the water-water cycle, cyclic transport around PSI and the glutathione-ascorbate cycle, are able to generate additional H+ in the lumen to support VDE activation when light would otherwise be limiting. Moreover, the alternative transports appear to act as an important dissipative pathway when high temperature and sub-optimal conductance increase the risk of photoinhibition. In peach, a moderate reduction in water and light does not decrease net carboxylation; rather, by diminishing the incoming light and the evapotranspirative demand of the environment, stomatal conductance decreases and water use efficiency improves. Therefore, by lowering light intensity to levels that are still non-limiting, water can be saved without compromising net photosynthesis. The quenching analysis is able to partition the absorbed energy among the several utilization, photoprotection and photo-oxidation pathways. When recovery is permitted, only a few PSII remain unrepaired, although more net PSII damage is recorded in plants kept in full light.
In this experiment as well, under over-saturating light the main dissipation pathway is non-photochemical quenching; at medium-low irradiance it appears to be pH-limited, and other routes, such as photorespiration and the alternative transports, are used to support photoprotection and to contribute to creating the optimal trans-thylakoid ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low-light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The observation that in nature only a small amount of damaged PSII is seen indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day. At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Therefore, light in excess of the photosynthetic optimum does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired and, consequently, the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate what photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be explored further is the combined influence of light and other environmental parameters, such as water status, temperature and nutrition, on the management of light, water and photosynthate in peach.
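For readers unfamiliar with the quenching terminology used in this abstract, the sketch below computes the standard chlorophyll-fluorescence indices it refers to (Fv/Fm, the effective PSII yield and NPQ) from a hypothetical set of fluorescence readings; it only illustrates the textbook definitions and is not the P700-based method or the Kornyeyev and Hendrickson (2007) partitioning developed in the thesis.

```python
# Illustrative only: standard chlorophyll-fluorescence indices mentioned in the
# abstract (Fv/Fm, PhiPSII, NPQ), not the P700-based method developed in the thesis.
def fluorescence_indices(f0, fm, fs, fm_prime):
    """Compute basic PSII quenching parameters from fluorescence levels.

    f0, fm       : minimal and maximal fluorescence of a dark-adapted leaf
    fs, fm_prime : steady-state and maximal fluorescence in the light
    """
    fv_fm = (fm - f0) / fm                 # maximum quantum yield of PSII
    phi_psii = (fm_prime - fs) / fm_prime  # effective PSII quantum yield
    npq = fm / fm_prime - 1.0              # non-photochemical quenching
    return fv_fm, phi_psii, npq

# Example with plausible (hypothetical) values:
print(fluorescence_indices(f0=0.20, fm=1.00, fs=0.35, fm_prime=0.55))
```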

Relevance: 30.00%

Abstract:

This thesis deals with the study of optimal control problems for the incompressible magnetohydrodynamics (MHD) equations. Particular attention to these problems arises from several applications in science and engineering, such as nuclear fission reactors with liquid-metal coolant and aluminum casting in metallurgy. In such applications it is of great interest to achieve control of the fluid state variables through the action of the magnetic Lorentz force. In this thesis we investigate a class of boundary optimal control problems, in which the flow is controlled through the boundary conditions of the magnetic field. Due to their complexity, these problems present various challenges in the definition of an adequate solution approach, both from a theoretical and from a computational point of view. In this thesis we propose a new boundary control approach, based on lifting functions of the boundary conditions, which yields both theoretical and numerical advantages. With the introduction of lifting functions, boundary control problems can be formulated as extended distributed problems. We consider a systematic mathematical formulation of these problems in terms of the minimization of a cost functional constrained by the MHD equations. The existence of a solution to the flow equations and to the optimal control problem is shown. The Lagrange multiplier technique is used to derive an optimality system from which candidate solutions for the control problem can be obtained. In order to compute the numerical solution of this system, a finite element approximation is used for the discretization, together with an appropriate gradient-type algorithm. An object-oriented finite element library has been developed to obtain a parallel, multigrid computational implementation of the optimality system based on a multiphysics approach. Numerical results of two- and three-dimensional computations show that a possible minimum for the control problem can be computed in a robust and accurate manner.
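As a rough orientation for the kind of problem described, the following schematic shows a generic tracking-type cost functional constrained by the steady incompressible MHD equations, with a control g acting through the magnetic boundary condition; the functional, the coefficients and the control boundary Γ_c are illustrative placeholders, not the thesis's exact formulation.

```latex
% Schematic boundary-control problem for steady incompressible MHD
% (generic coefficients; not the thesis's exact functional):
\begin{aligned}
&\min_{g}\; J(\mathbf{u},\mathbf{B},g)=
   \tfrac{1}{2}\int_{\Omega}|\mathbf{u}-\mathbf{u}_d|^{2}\,dx
  +\tfrac{\alpha}{2}\int_{\Gamma_c}|g|^{2}\,ds
  \quad\text{subject to}\\
&(\mathbf{u}\cdot\nabla)\mathbf{u}-\nu\Delta\mathbf{u}+\nabla p
  -S\,(\nabla\times\mathbf{B})\times\mathbf{B}=\mathbf{f},
  \qquad \nabla\cdot\mathbf{u}=0,\\
&\nabla\times(\eta\,\nabla\times\mathbf{B})
  -\nabla\times(\mathbf{u}\times\mathbf{B})=\mathbf{0},
  \qquad \nabla\cdot\mathbf{B}=0,
\end{aligned}
```

where the control g enters through the boundary condition on B over Γ_c; with the lifting-function approach described above, g is extended into Ω so that the boundary control problem can be treated as an extended distributed one.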

Relevance: 30.00%

Abstract:

Hybrid vehicles (HVs), which combine a conventional ICE-based powertrain with a secondary energy source that can also be converted into mechanical power, represent a well-established way to substantially reduce both the fuel consumption and the tailpipe emissions of passenger cars. Several HV architectures are either being studied or already available on the market, e.g. mechanical, electric, hydraulic and pneumatic hybrid vehicles. Among these, the electric (HEV) and mechanical (HSF-HV) parallel hybrid configurations are examined throughout this Thesis. To fully exploit an HV's potential, the hybrid components to be installed must be chosen and sized properly, and an effective supervisory control must be adopted to coordinate how the different power sources are managed and how they interact. Real-time controllers can then be derived from the optimal benchmark results. However, the application of these powerful instruments requires a simplified yet reliable and accurate model of the hybrid vehicle system. This can be a complex task, especially as the complexity of the system grows, as is the case for the HSF-HV system assessed in this Thesis. The first task of this dissertation is to establish the optimal modeling approach for an innovative and promising mechanical hybrid vehicle architecture. It is shown how the chosen modeling paradigm affects both the quality of the solution and the computational effort required, using an optimization technique based on Dynamic Programming. The second goal concerns the control of pollutant emissions in a parallel Diesel HEV. The emission levels obtained under real-world driving conditions are substantially higher than the usual results obtained in a homologation cycle. For this reason, the target of the corresponding section of the Thesis is an on-line control strategy capable of guaranteeing that the desired emission levels are respected while minimizing fuel consumption and avoiding excessive battery depletion.
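As an illustration of the Dynamic Programming benchmarking mentioned above, the toy sketch below computes an optimal battery power split over a short, known power-demand profile by backward recursion over a state-of-charge grid; the fuel model, grids and numbers are hypothetical placeholders and not the thesis's HSF-HV or HEV models.

```python
# Toy Dynamic Programming benchmark: choose the battery power split at each step
# to minimise fuel over a known drive cycle. All models and numbers are hypothetical.
import numpy as np

P_dem = np.array([8.0, 15.0, 20.0, 12.0, 5.0])   # kW demanded at the wheels per step
dt = 1.0                                          # step length [s] (hypothetical)
soc_grid = np.linspace(0.4, 0.8, 41)              # admissible battery SOC window
u_grid = np.linspace(-10.0, 10.0, 21)             # battery power [kW], + = discharge
E_batt = 1.0 * 3600.0                             # usable battery energy [kJ]

def fuel_rate(p_engine_kw):
    """Very rough affine fuel model [g/s]; placeholder only."""
    return 0.0 if p_engine_kw <= 0 else 0.2 + 0.08 * p_engine_kw

# Terminal cost: penalise deviation from the initial SOC (charge sustaining).
J = 1e4 * (soc_grid - 0.6) ** 2
policy = []

for k in reversed(range(len(P_dem))):
    J_new = np.full_like(J, np.inf)
    u_best = np.zeros_like(J)
    for i, soc in enumerate(soc_grid):
        for u in u_grid:
            p_eng = P_dem[k] - u                      # engine covers the rest
            if p_eng < 0:                             # no regen modelling here
                continue
            soc_next = soc - u * dt / E_batt
            if not (soc_grid[0] <= soc_next <= soc_grid[-1]):
                continue
            # stage cost plus interpolated cost-to-go at the successor state
            cost = fuel_rate(p_eng) * dt + np.interp(soc_next, soc_grid, J)
            if cost < J_new[i]:
                J_new[i], u_best[i] = cost, u
    J, policy = J_new, [u_best] + policy

print("optimal cost-to-go from SOC = 0.6:", np.interp(0.6, soc_grid, J))
```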

Relevance: 30.00%

Abstract:

Liquids and gases form a vital part of nature. Many of these are complex fluids with non-Newtonian behaviour. We introduce a mathematical model describing the unsteady motion of an incompressible polymeric fluid. Each polymer molecule is treated as two beads connected by a spring. For the nonlinear spring force it is not possible to obtain a closed system of equations unless we approximate the force law. The Peterlin approximation replaces the length of the spring by the length of the average spring. Consequently, the macroscopic dumbbell-based model for dilute polymer solutions is obtained. The model consists of the conservation of mass and momentum and the time evolution of the symmetric positive definite conformation tensor, in which diffusive effects are taken into account. In two space dimensions we prove global-in-time existence of weak solutions. Assuming more regular data, we show higher regularity and consequently uniqueness of the weak solution. For the Oseen-type Peterlin model we propose a linear pressure-stabilized characteristics finite element scheme. We derive the corresponding error estimates and prove, for linear finite elements, optimal first-order accuracy. The theoretical error estimates for the pressure-stabilized characteristics finite element scheme are confirmed by a series of numerical experiments.
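For orientation, a diffusive dumbbell model of the type described above can be written schematically as follows; the polymeric stress τ(C), the relaxation term R(C) and the coefficients are left generic, so this is a sketch of the model class rather than the thesis's exact Peterlin closure.

```latex
% Schematic diffusive dumbbell model of the type described above
% (generic stress tau(C) and relaxation term R(C); not the thesis's exact closure):
\begin{aligned}
\partial_t\mathbf{u}+(\mathbf{u}\cdot\nabla)\mathbf{u}
  -\nu\Delta\mathbf{u}+\nabla p &= \operatorname{div}\boldsymbol{\tau}(\mathbf{C}),
  \qquad \operatorname{div}\mathbf{u}=0,\\
\partial_t\mathbf{C}+(\mathbf{u}\cdot\nabla)\mathbf{C}
  -(\nabla\mathbf{u})\mathbf{C}-\mathbf{C}(\nabla\mathbf{u})^{\top}
  &= R(\mathbf{C})+\varepsilon\Delta\mathbf{C},
\end{aligned}
```

where C is the symmetric positive definite conformation tensor, τ(C) the polymeric extra stress, R(C) the Peterlin-type relaxation term and εΔC the diffusive contribution mentioned in the abstract.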

Relevance: 30.00%

Abstract:

Switching-mode power supplies (SMPS) are subject to low power factor and high harmonic distortion. Active power-factor correction (APFC) is a technique to improve the power factor and reduce the harmonic distortion of SMPSs. However, this technique produces a double-line-frequency variation of the output voltage, which can be reduced by using a large output capacitance. Using large capacitors increases the cost and size of the converter. Furthermore, the capacitors are subject to frequent failures, mainly caused by evaporation of the electrolytic solution, which reduces converter reliability. This thesis presents an optimal control method for the input current of a boost converter that reduces the required size of the output capacitor. The optimum current waveform, as a function of a weighting factor, is found by using the Euler-Lagrange equation. A set of simulations is performed to determine the ideal weighting factor that gives the lowest possible output voltage variation while the converter still meets the IEC 61000-3-2 Class A harmonic requirements with a power factor of 0.8 or higher. The proposed method is verified experimentally. A boost converter is designed and run at different power levels: 100 W, 200 W and 400 W. The desired output voltage ripple is 10 V peak-to-peak for an output voltage of 200 Vdc, which corresponds to a ±2.5% output voltage ripple. The experimental and simulation results are in close agreement. A significant reduction in capacitor size, as high as 50%, is achieved with the proposed method.
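To give a feel for the capacitor-sizing trade-off described above, the back-of-the-envelope sketch below uses the standard double-line-frequency ripple relation for an ideal PFC stage with a purely sinusoidal input current, i.e. the conventional baseline that the proposed optimal current shaping improves upon; the 50 Hz line frequency is an assumption not stated in the abstract.

```python
# Baseline output-capacitor sizing for an ideal PFC stage (sinusoidal input current).
# Line frequency of 50 Hz is an assumption; it is not stated in the abstract.
import math

P_out = 400.0      # W   (highest power level tested)
V_out = 200.0      # Vdc
dV_pp = 10.0       # V   peak-to-peak ripple target (+/- 2.5 %)
f_line = 50.0      # Hz  (assumed)

# Double-line-frequency ripple of an ideal PFC stage:
#   dV_pp ~= P_out / (2 * pi * f_line * C * V_out)
C_required = P_out / (2 * math.pi * f_line * V_out * dV_pp)
print(f"capacitance for {dV_pp:.0f} Vpp ripple: {C_required * 1e6:.0f} uF")

# A ~50 % smaller capacitor, as reported for the proposed method, would roughly
# double the ripple under this conventional sinusoidal-current assumption:
print(f"ripple with C/2: {P_out / (2 * math.pi * f_line * V_out * (C_required / 2)):.1f} Vpp")
```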

Relevance: 30.00%

Abstract:

Water resource depletion and sanitation are growing problems around the world. A solution to both of these problems is the composting latrine: it requires no water and has been recommended by the World Health Organization as an improved sanitation technology. However, little analysis has been done of the decomposition process occurring inside the latrine, including what temperatures are reached and which variables most affect the composting process. Better knowledge of how outside variables affect composting latrines can help development workers decide whether to implement this technology and better educate users on appropriate maintenance. This report presents a full, detailed construction manual and a temperature data analysis for a double vault composting latrine. During the author's two-year Peace Corps service in rural Paraguay he was involved in building twenty-one composting latrines, and he took detailed temperature readings and visual observations of his personal latrine for ten months. The author also took limited temperature readings of fourteen community members' latrines over a three-month period. These data were analyzed to find correlations between compost temperature and several variables. The two main variables found to affect compost temperature were the seasonal trend of the outside temperature, and the mixing of the compost together with the addition of moisture. Outside seasonal temperature changes were compared to those of the compost, and a linear regression gave an R2 value of 0.89. Mixing the compost and adding water, or a water/urine mixture, resulted in a temperature increase of the compost 100% of the time, with seasonal temperatures determining the rate and duration of the increase. The temperature readings were also used to identify events in which certain temperatures were held long enough to achieve total pathogen destruction in the compost. Four events were recorded in which a temperature of 122°F (50°C) was held for at least 24 hours, ensuring total pathogen destruction in that area of the compost. One event in which 114.8°F (46°C) was held for one week was also recorded, again ensuring total pathogen destruction. The analysis of the temperature data showed, however, that the compost reached total pathogen destruction levels in only ten percent of the data points. Because of this, the storage time recommendations outlined by the World Health Organization should be followed. The WHO recommends storing compost for 1.5-2 years in climates with ambient temperatures of 2-20°C (35-68°F), and for at least 1 year with ambient temperatures of 20-35°C (68-95°F). If these storage durations can be met, the double vault composting latrine is an economical and achievable sanitation solution that also conserves water resources.
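The two analyses described above can be illustrated with a short sketch on synthetic data: a linear regression of compost temperature against ambient temperature, and a scan for periods holding at least 50°C for 24 hours; the numbers below are invented for illustration and are not the author's Paraguay measurements.

```python
# Illustrative sketch of the two analyses described above, on synthetic hourly data
# (the author's actual measurements are not reproduced here).
import numpy as np

rng = np.random.default_rng(0)
ambient = 20 + 8 * np.sin(np.linspace(0, 4 * np.pi, 300)) + rng.normal(0, 1, 300)
compost = 15 + 1.4 * ambient + rng.normal(0, 1, 300)      # hypothetical relation

# 1) Linear regression of compost temperature on ambient temperature.
slope, intercept = np.polyfit(ambient, compost, 1)
r2 = np.corrcoef(ambient, compost)[0, 1] ** 2
print(f"compost ~ {slope:.2f} * ambient + {intercept:.1f}  (R^2 = {r2:.2f})")

# 2) Count runs of at least 24 consecutive hourly readings at or above 50 C.
hot = compost >= 50
run, events = 0, 0
for h in hot:
    run = run + 1 if h else 0
    if run == 24:          # a full day at or above 50 C
        events += 1
print("24 h periods at >= 50 C:", events)
```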

Relevance: 30.00%

Abstract:

Component commonality - the use of the same version of a component across multiple products - is increasingly being considered a promising way to offer high external variety while retaining low internal variety in operations. However, increasing commonality has both positive and negative cost effects, so that optimization approaches are required to identify the optimal commonality level. As components influence, to a greater or lesser extent, nearly every process step along the supply chain, it is not surprising that a multitude of diverging commonality problems has been investigated in the literature, each with a specific algorithm designed for the particular problem considered. The paper at hand aims at a general framework that is flexible and efficient enough to be applied to a wide range of commonality problems. Such a procedure, based on a two-stage graph approach, is presented and tested. Finally, the flexibility of the procedure is shown by customizing the framework to account for different types of commonality problems.

Relevance: 30.00%

Abstract:

The execution of a project requires resources that are generally scarce. Classical approaches to resource allocation assume that the usage of these resources by an individual project activity is constant during the execution of that activity; in practice, however, the project manager may vary resource usage over time within prescribed bounds. This variation gives rise to the project scheduling problem, which consists in allocating the scarce resources to the project activities over time such that the project duration is minimized, the total number of resource units allocated to each activity equals its prescribed work content, and various work-content-related constraints are met. We formulate this problem for the first time as a mixed-integer linear program. Our computational results for a standard test set from the literature indicate that this model outperforms the state-of-the-art solution methods for this problem.
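A schematic way to write the problem just described, with x_{jt} denoting the resource units assigned to activity j in period t, is shown below; this is an illustrative sketch only, and the paper's actual mixed-integer linear program additionally encodes the activity start and end periods S_j, E_j with binary variables.

```latex
% Illustrative sketch of the flexible work-content scheduling problem
% (not the paper's exact MILP; S_j and E_j are decision-dependent there):
\begin{aligned}
\min\;& C_{\max}\\
\text{s.t. } & \textstyle\sum_{t=S_j}^{E_j} x_{jt}=W_j
   && \text{(work content of activity } j)\\
& \underline{q}_j \le x_{jt} \le \overline{q}_j
   && \text{(usage bounds while } j \text{ is in progress)}\\
& \textstyle\sum_{j:\,S_j\le t\le E_j} x_{jt}\le R
   && \text{(resource capacity in every period } t)\\
& E_i \le S_j \ \text{for precedences } (i,j),
   \qquad E_j \le C_{\max}.
\end{aligned}
```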

Relevance: 30.00%

Abstract:

BACKGROUND The optimal crystalloid solution to use perioperatively in patients undergoing open radical cystectomy remains unclear. Many of the fluids used for intravenous hydration contain supraphysiologic concentrations of chloride, which can induce hyperchloremia and metabolic acidosis, resulting in renal vasoconstriction and decreased renal function. In addition, patients receiving less fluid and less sodium show faster recovery of gastrointestinal (GI) function after colonic surgery. METHODS AND DESIGN This is an investigator-initiated, single-center, randomized, controlled, parallel-group trial with assessor-blinded outcome assessment, conducted in the Department of Urology, University Hospital Bern, Switzerland. The study will involve 44 patients with bladder cancer scheduled for radical cystectomy and urinary diversion. The primary outcome is the time between the end of surgery and the return of GI function (first defecation). Secondary outcomes are fluid balance (postoperative versus preoperative body weight difference) and the incidence of kidney function disorders according to the Risk, Injury, Failure, Loss, End-Stage Renal Disease (RIFLE) classification. An equal number of patients is allocated to receive Ringerfundin® solution or a glucose/potassium-based balanced crystalloid solution as baseline infusion for as long as intravenous fluid administration is necessary during the perioperative period. The randomized crystalloid solution is infused at a rate of 1 ml/kg/h until the bladder has been removed, followed by 3 ml/kg/h until the end of surgery. Postoperative hydration is identical in both groups and consists of 1,500 ml of the randomized crystalloid solution per 24 hours. Postoperative patient care is identical in both groups: patients are allowed to drink clear fluids immediately after surgery, and a liquid diet, active mobilization and the use of chewing gum are started on postoperative day 1. Body weight is measured daily in the morning. The times of first flatus and first defecation are recorded. DISCUSSION This trial assesses the benefits and harms of two different balanced crystalloid solutions for perioperative fluid management in patients undergoing open radical cystectomy with urinary diversion, with regard to return of GI function and effects on postoperative renal function. TRIAL REGISTRATION Current Controlled Trials ISRCTN32976792 (registered on 21 November 2013).

Relevance: 30.00%

Abstract:

The Center for Orbit Determination in Europe (CODE) has been contributing as a global analysis center to the International GNSS Service (IGS) for many years. The processing of GPS and GLONASS data is well established in CODE's ultra-rapid, rapid and final product lines. With the introduction of new signals for the established and new GNSS, new challenges and opportunities arise for GNSS data management and processing. The IGS started the Multi-GNSS Experiment (MGEX) in 2012 in order to gain first experience with the new data formats and to develop new strategies for making optimal use of these additional measurements. CODE has started to contribute to IGS MGEX with a consistent, rigorously combined triple-system orbit solution (GPS, GLONASS and Galileo). SLR residuals for the computed Galileo satellite orbits are of the order of 10 cm. Furthermore, CODE has established a GPS and Galileo clock solution. A quality assessment shows that these experimental orbit and clock products even allow Galileo-only precise point positioning (PPP), with accuracies at the decimeter level (static PPP) to meter level (kinematic PPP) for selected stations.
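For context, Galileo-only PPP as mentioned above relies on the standard code and carrier-phase observation equations, in which the precise satellite orbits (entering the geometric range ρ) and clocks dt^s are taken from products such as those described here; the notation below is the generic textbook form, not CODE's specific processing model.

```latex
% Textbook code (P) and carrier-phase (Phi) observation equations for PPP
% (generic notation; not CODE's specific processing model):
\begin{aligned}
P_r^{s}    &= \rho_r^{s} + c\,(dt_r - dt^{s}) + T_r^{s} + I_r^{s} + \varepsilon_P,\\
\Phi_r^{s} &= \rho_r^{s} + c\,(dt_r - dt^{s}) + T_r^{s} - I_r^{s} + \lambda N_r^{s} + \varepsilon_\Phi,
\end{aligned}
```

where ρ_r^s is the geometric range between receiver r and satellite s (requiring the precise orbit), dt^s the satellite clock correction, T and I the tropospheric and ionospheric delays, and N the carrier-phase ambiguity.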

Relevance: 30.00%

Abstract:

The capital structure and regulation of financial intermediaries are an important topic for practitioners, regulators and academic researchers. In general, theory predicts that firms choose their capital structures by balancing the benefits of debt (e.g., tax and agency benefits) against its costs (e.g., bankruptcy costs). However, when traditional corporate finance models have been applied to insured financial institutions, the results have generally predicted corner solutions (all equity or all debt) to the capital structure problem. This paper studies the impact and interaction of deposit insurance, capital requirements and tax benefits on a bank's choice of optimal capital structure. Using a contingent claims model to value the firm and its associated claims, we find that an interior optimal capital ratio exists in the presence of deposit insurance, taxes and a minimum fixed capital standard. Banks voluntarily choose to maintain capital in excess of the minimum required in order to balance the risks of insolvency (especially the loss of future tax benefits) against the benefits of additional debt. Because we derive a closed-form solution, our model provides useful insights into several current policy debates, including revisions to the regulatory framework for GSEs, tax policy in general and the tax exemption for credit unions.
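A stylised way to see why an interior optimum can arise, rather than the corner solutions of traditional models, is sketched below; this simple value decomposition is illustrative only and is not the paper's contingent-claims valuation.

```latex
% Stylised value decomposition behind an interior optimum
% (illustrative; the paper derives it from a contingent-claims model):
V(k) = V_U
  + \underbrace{TB(k)}_{\text{tax benefits, decreasing in } k}
  - \underbrace{EC(k)}_{\text{expected insolvency costs, decreasing in } k},
\qquad k \ge k_{\min},
```

where k is the capital (equity) ratio and k_min the regulatory minimum; an interior optimum k* > k_min satisfies TB'(k*) = EC'(k*), i.e. the marginal tax benefit given up by holding more capital just offsets the marginal reduction in expected insolvency costs (including the risk of losing future tax benefits).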

Relevance: 30.00%

Abstract:

This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the central bank should adopt a loss function that differs from the social loss function. Carefully designing the central bank's loss function with consistent targets can harmonize optimal and consistent policy. This desirable result emerges from two observations. First, the social loss function reflects a normative process that does not necessarily prove consistent with the structure of the microeconomy. Thus, the social loss function cannot serve as a direct loss function for the central bank. Second, an optimal loss function for the central bank must depend on the structure of that microeconomy. In addition, this paper shows that control theory provides a benchmark for institution design in a game-theoretic framework.
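A standard textbook example (not the paper's specific model) of how inconsistent targets in the social loss function create the problem discussed above is the quadratic loss with an over-ambitious output target:

```latex
% Textbook illustration (not the paper's model): quadratic social loss with an
% over-ambitious output target y^k > y^n and a surprise-supply relation.
L_s = (\pi-\pi^*)^2 + \lambda\,(y-y^k)^2,
\qquad y = y^n + b\,(\pi-\pi^e).
% Under discretion with rational expectations (\pi = \pi^e):
\pi = \pi^* + \lambda b\,(y^k - y^n), \qquad y = y^n,
```

that is, an inflation bias with no output gain. Assigning the central bank a loss function with the consistent target y^k = y^n removes the bias, which is the sense in which a carefully designed central bank loss function can harmonize optimal and consistent policy.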

Relevance: 30.00%

Abstract:

This paper shows that optimal policy and consistent policy outcomes require the use of control-theory and game-theory solution techniques. While optimal policy and consistent policy often produce different outcomes even in a one-period model, we analyze consistent policy and its outcome in a simple model, finding that the cause of the inconsistency with optimal policy traces to inconsistent targets in the social loss function. As a result, the social loss function cannot serve as a direct loss function for the central bank. Accordingly, we employ implementation theory to design a central bank loss function (mechanism design) with consistent targets, while the social loss function serves as a social welfare criterion. That is, with the correct mechanism design for the central bank loss function, optimal policy and consistent policy become identical. In other words, optimal policy proves implementable (consistent).

Relevance: 30.00%

Abstract:

The demand for video content has increased rapidly in recent years as a result of the wide deployment of IPTV and the variety of services offered by network operators. One service that has become especially attractive to customers is real-time video on demand (VoD), because it offers immediate streaming of a large variety of video content. The price operators have to pay for this convenience is increased network traffic: networks are becoming more congested due to the higher demand for VoD content and the increased quality of the videos. As a solution, in this paper we propose a hierarchical network system for VoD content delivery in managed networks, which implements a redistribution algorithm and a redirection strategy for optimal content distribution within the network core and optimal streaming to the clients. The system monitors the state of the network and the behavior of the users to estimate the demand for the content items and to decide on the appropriate number of replicas and their best positions in the network. The system's objective is to distribute replicas of the content items so that the most demanded content has replicas closer to the clients, optimizing network utilization and improving the users' experience. It also balances the load between the servers, concentrating the traffic at the edges of the network.
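The placement idea described above, i.e. giving the most demanded items replicas closest to the clients that request them, can be illustrated with the small greedy sketch below; the demand figures, topology and heuristic are hypothetical and do not reproduce the paper's redistribution algorithm or redirection strategy.

```python
# Minimal greedy sketch: place a fixed budget of replicas so that the most-demanded
# items end up closest to the client regions that request them (illustrative only).
demand = {"movieA": 900, "movieB": 400, "movieC": 120}   # estimated requests (hypothetical)
# hop distance from each candidate replica server to each client region (hypothetical)
dist = {
    "edge1": {"region1": 1, "region2": 3},
    "edge2": {"region1": 3, "region2": 1},
    "core":  {"region1": 2, "region2": 2},
}
region_share = {"movieA": {"region1": 0.7, "region2": 0.3},
                "movieB": {"region1": 0.2, "region2": 0.8},
                "movieC": {"region1": 0.5, "region2": 0.5}}
replica_budget = 4
placement = {}                                            # item -> list of servers

def delivery_cost(item, servers):
    """Expected hops per request if each region fetches from its nearest replica."""
    return sum(share * min(dist[s][region] for s in servers)
               for region, share in region_share[item].items())

# Greedily add the replica that yields the largest demand-weighted cost reduction.
for _ in range(replica_budget):
    best = None
    for item in demand:
        current = placement.get(item, ["core"])           # everything starts in the core
        for server in dist:
            if server in current:
                continue
            gain = demand[item] * (delivery_cost(item, current)
                                   - delivery_cost(item, current + [server]))
            if best is None or gain > best[0]:
                best = (gain, item, server)
    _, item, server = best
    placement[item] = placement.get(item, ["core"]) + [server]

print(placement)
```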

Relevance: 30.00%

Abstract:

In this work, a new two-dimensional optics design method is proposed that enables the coupling of three ray sets with two lens surfaces. The method is especially important for optical systems designed for wide field of view and with clearly separated optical surfaces. Fermat’s principle is used to deduce a set of functional differential equations fully describing the entire optical system. The presented general analytic solution makes it possible to calculate the lens profiles. Ray tracing results for calculated 15th order Taylor polynomials describing the lens profiles demonstrate excellent imaging performance and the versatility of this new analytic design method.
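For orientation, the constant-optical-path-length condition from which such functional differential equations are typically derived can be stated as follows; the notation is generic and does not reproduce the particular equations of this work.

```latex
% Generic constant-optical-path-length (Fermat) condition for sharp imaging of
% each coupled ray set k = 1, 2, 3 (illustrative; not this work's specific equations):
\sum_i n_i\,\ell_i(\text{ray}) = \mathrm{OPL}_k
\quad\text{for every ray of set } k,
```

where n_i and ℓ_i are the refractive index and geometric length of the i-th segment of the ray path through the two lens surfaces; imposing this condition simultaneously for the three ray sets yields the functional differential equations for the two unknown surface profiles.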