996 results for BALANCING PROBLEM


Relevance: 30.00%

Abstract:

Common-mode voltage (CMV) variations in PWM inverter-fed drives generate unwanted shaft and bearing currents, resulting in early motor failure. Multilevel inverters mitigate this problem to some extent as the number of levels increases, but the complexity of the power circuit grows with the number of inverter voltage levels. In this paper a five-level inverter structure is proposed for open-end winding induction motor (IM) drives, formed by cascading only two conventional inverters (one two-level and one three-level), with elimination of the common-mode voltage over the entire modulation range. The DC-link power supply requirement is also optimized by means of DC-link capacitor voltage balancing under PWM control, using only inverter switching-state redundancies. The proposed power circuit gives a simple power bus structure.

Relevance: 30.00%

Abstract:

A model comprising several servers, each equipped with its own queue and possibly a different service speed, is considered. Each server receives a dedicated arrival stream of jobs; there is also a stream of generic jobs that arrive at a job scheduler and can be individually allocated to any of the servers. It is shown that if the arrival streams are all Poisson and all jobs have the same exponentially distributed service requirements, the probabilistic splitting of the generic stream that minimizes the average job response time is such that it balances the server idle times in a weighted least-squares sense, where the weighting coefficients are related to the service speeds of the servers. The corresponding result holds for nonexponentially distributed service times if the service speeds are all equal. This result is used to develop adaptive quasi-static algorithms for allocating jobs in the generic arrival stream when the load parameters are unknown. The algorithms utilize server idle-time measurements which are sent periodically to the central job scheduler. A model is developed for these measurements, and the result above is used to cast the problem as one of finding a projection of the root of an affine function when only noisy values of the function can be observed.
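The optimal split described above can be illustrated numerically. The following is a minimal sketch, not the paper's algorithm: two M/M/1 servers with made-up rates, where (by Little's law) minimizing the total mean number of jobs in the system is equivalent to minimizing the mean job response time.

```python
# Hedged sketch: numerically find the probabilistic split (p, 1-p) of a
# generic Poisson stream across two M/M/1 servers that minimizes the total
# mean number of jobs in the system. All rates are illustrative assumptions.

def mean_jobs(p, lam, mu, Lam):
    """Total mean number in system for split (p, 1-p) of the generic stream."""
    total = 0.0
    for i, pi in enumerate((p, 1.0 - p)):
        a = lam[i] + pi * Lam          # total arrival rate at server i
        if a >= mu[i]:
            return float('inf')        # unstable allocation
        total += a / (mu[i] - a)       # M/M/1 mean number in system
    return total

lam = (0.5, 0.3)   # dedicated arrival rates (assumed)
mu  = (2.0, 1.0)   # service speeds (assumed)
Lam = 0.8          # rate of the generic, schedulable stream (assumed)

# simple grid search over the splitting probability
best_p = min((k / 1000 for k in range(1001)),
             key=lambda p: mean_jobs(p, lam, mu, Lam))
```

As expected, the optimum is interior and sends most of the generic stream to the faster server; at this optimum, the paper's result says the server idle times are balanced in a weighted least-squares sense.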


Relevance: 30.00%

Abstract:

In this paper, we investigate a numerical method for an inverse problem: recovering missing data on one part of the boundary of a domain from Cauchy data on the other part, for a variable-coefficient elliptic Cauchy problem. In the process, the Cauchy problem is transformed into the problem of solving a compact linear operator equation. As a remedy to the ill-posedness of the problem, we use a projection method which allows regularization solely by discretization; the discretization level then plays the role of the regularization parameter. The balancing principle is used to choose an appropriate discretization level. Several numerical examples show that the method produces a stable, accurate approximate solution.
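The role of the discretization level as a regularization parameter can be shown on a toy model (not the paper's elliptic Cauchy problem): the total error splits into an approximation term that decays with level n and a noise-propagation term that grows with n, and a balancing-style choice picks the level where the two are comparable. The decay rates and noise level below are assumed for illustration.

```python
# Toy balancing-principle illustration: choose the discretization level n
# at which the (assumed) approximation error and the (assumed) noise
# amplification balance; compare with the oracle minimizer of their sum.

delta = 1e-3                          # noise level (assumed)
approx = lambda n: n ** -2.0          # assumed approximation-error decay
noise  = lambda n: delta * n ** 1.5   # assumed noise amplification

levels = range(1, 200)
# balancing choice: first level where the noise term overtakes approximation
n_bal = next(n for n in levels if noise(n) >= approx(n))
# oracle choice: minimizer of the (here fully known) total error
n_opt = min(levels, key=lambda n: approx(n) + noise(n))
```

For these constants the balanced level coincides with the oracle optimum, which is the behaviour the balancing principle is designed to achieve without knowing the approximation error.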

Relevance: 30.00%

Abstract:

Multilevel algorithms are a successful class of optimization techniques that address the mesh partitioning problem. They usually combine a graph contraction algorithm with a local optimization method which refines the partition at each graph level. In this paper we present an enhancement of the technique which uses imbalance to achieve higher quality partitions. We also present a formulation of the Kernighan-Lin partition optimization algorithm which incorporates load-balancing. The resulting algorithm is tested against a different but related state-of-the-art partitioner and shown to provide improved results.
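To make the refinement step concrete, here is a minimal Kernighan-Lin-style sketch, not the paper's formulation: it repeatedly swaps the vertex pair (one from each part) with the best cut-size gain. Swapping equal-sized pairs preserves the load balance exactly; the paper's variant additionally trades a little imbalance for quality, which this toy does not attempt. The graph is a made-up example.

```python
# Greedy pairwise-swap refinement in the spirit of Kernighan-Lin.

def cut_size(adj, part):
    """Number of edges crossing the bipartition (each edge counted once)."""
    return sum(1 for u in adj for v in adj[u] if u < v and part[u] != part[v])

def kl_pass(adj, part):
    part = dict(part)
    improved = True
    while improved:
        improved = False
        base = cut_size(adj, part)
        best = None
        a_side = [v for v in part if part[v] == 0]
        b_side = [v for v in part if part[v] == 1]
        for u in a_side:                 # try every cross-partition swap
            for v in b_side:
                trial = dict(part)
                trial[u], trial[v] = 1, 0
                gain = base - cut_size(adj, trial)
                if gain > 0 and (best is None or gain > best[0]):
                    best = (gain, u, v)
        if best:                         # apply the best balance-preserving swap
            _, u, v = best
            part[u], part[v] = 1, 0
            improved = True
    return part

# two triangles joined by one edge; start from a deliberately bad split
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
part = {0: 0, 1: 1, 2: 0, 3: 1, 4: 0, 5: 1}
refined = kl_pass(adj, part)
```

On this example the pass recovers the natural triangle-vs-triangle split with a cut of one edge while keeping both parts the same size.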

Relevance: 30.00%

Abstract:

As medical technology has advanced, so too have our attitudes towards the level of control we can or should expect to have over our procreative capacities. This creates a multidimensional problem for the law and family planning services in terms of access to services – whether to avoid conception or terminate a pregnancy – and the negligent provision of these services. These developments go to the heart of our perception of autonomy. Unsurprisingly, these matters also raise a moral dilemma for the law. Distinctively, discourse in this area is dominated by assertions of subjective moral value; in relation to life, to personal choice and to notions of the archetypal family. Against this, I stress that a model of objective morality can answer these challenging questions and resolve the inherent problems of legal regulation. Therefore, I argue that notions of autonomy must be based on a rational, action-based understanding of what it means to be a ‘moral agent’. I claim that from this we might support a legal standard, based on objective rational morality, which can frame our constitutional norms and our conception of justice in these contentious areas. This paper claims that the current regulation of abortion is outdated and requires radical reform. It proposes a scheme that would shift the choice towards the mother (and the father), remove the unnecessarily broad disability ground and involve doctors having a role of counsel (rather than gatekeeper).

Relevance: 30.00%

Abstract:

In molecular biology, it is often desirable to find common properties in large numbers of drug candidates. One family of methods stems from the data mining community, where algorithms to find frequent graphs have received increasing attention over the past years. However, the computational complexity of the underlying problem and the large amount of data to be explored essentially render sequential algorithms useless. In this paper, we present a distributed approach to the frequent subgraph mining problem to discover interesting patterns in molecular compounds. This problem is characterized by a highly irregular search tree, whereby no reliable workload prediction is available. We describe the three main aspects of the proposed distributed algorithm, namely, a dynamic partitioning of the search space, a distribution process based on a peer-to-peer communication framework, and a novel receiver-initiated load balancing algorithm. The effectiveness of the distributed method has been evaluated on the well-known National Cancer Institute’s HIV-screening data set, where we were able to show close-to-linear speedup in a network of workstations. The proposed approach also allows for dynamic resource aggregation in a non-dedicated computational environment. These features make it suitable for large-scale, multi-domain, heterogeneous environments, such as computational grids.
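The receiver-initiated idea can be sketched with a small simulation; this is a hedged toy that mirrors the general mechanism (an idle worker requests work and a busy peer donates half of its queue), not the paper's actual protocol or its search-tree partitioning. Worker counts and task counts are made up.

```python
# Toy receiver-initiated load balancing: idle workers pull work from peers.
import random

def run(tasks_per_worker, steps):
    random.seed(0)                        # deterministic toy run
    queues = [list(range(n)) for n in tasks_per_worker]
    done = 0
    for _ in range(steps):
        for i, q in enumerate(queues):
            if q:                         # busy: process one task
                q.pop()
                done += 1
            else:                         # idle: request work from a random peer
                j = random.randrange(len(queues))
                donated = len(queues[j]) // 2
                if donated:               # peer donates half of its queue
                    queues[i], queues[j] = queues[j][:donated], queues[j][donated:]
    return done, [len(q) for q in queues]

# all 40 tasks start on worker 0; three workers begin idle
done, remaining = run([40, 0, 0, 0], steps=12)
```

Without stealing, only the first worker would make progress (12 tasks in 12 steps); with receiver-initiated transfers the idle workers acquire work and throughput rises, while no tasks are lost or duplicated.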

Relevance: 30.00%

Abstract:

Acrylamide, a chemical that is probably carcinogenic in humans and has neurological and reproductive effects, forms from free asparagine and reducing sugars during high-temperature cooking and processing of common foods. Potato and cereal products are major contributors to dietary exposure to acrylamide and while the food industry reacted rapidly to the discovery of acrylamide in some of the most popular foods, the issue remains a difficult one for many sectors. Efforts to reduce acrylamide formation would be greatly facilitated by the development of crop varieties with lower concentrations of free asparagine and/or reducing sugars, and of best agronomic practice to ensure that concentrations are kept as low as possible. This review describes how acrylamide is formed, the factors affecting free asparagine and sugar concentrations in crop plants, and the sometimes complex relationship between precursor concentration and acrylamide-forming potential. It covers some of the strategies being used to reduce free asparagine and sugar concentrations through genetic modification and other genetic techniques, such as the identification of quantitative trait loci. The link between acrylamide formation, flavour, and colour is discussed, as well as the difficulty of balancing the unknown risk of exposure to acrylamide in the levels that are present in foods with the well-established health benefits of some of the foods concerned. Key words: Amino acids, asparagine, cereals, crop quality, food safety, Maillard reaction, potato, rye, sugars, wheat.

Relevance: 30.00%

Abstract:

In this paper we look at the persistent debate over notions of access and excellence and intrinsic and instrumentalist rationales for arts practice within cultural policy discussion. Recent research into the Indigenous performing arts in Australia underlines the particular difficulties faced by the sector in balancing the demands of community participation, social inclusion and high-quality aesthetic outcomes. The balancing act has proven unsustainable for some Indigenous performing arts companies and their viability is now in doubt. This suggests that a re-consideration of the question of the purpose and value of the Indigenous performing arts is timely.

Relevance: 30.00%

Abstract:

The Building Code of Australia seeks to establish “nationally consistent, minimum necessary standards of relevant health, safety (including structural safety and safety from fire), amenity and sustainability objectives efficiently”. These goals are laudable – but where are the goals of quality and maintenance, which are also an essential part of achieving adequate and continuing health and safety for the built environment?

Defects such as dampness, settlement and cracking, staining, wood rot, termite damage, rusting, and roof leakage are common enough to suggest that there are still issues with building quality in housing. They are caused by a combination of initial poor workmanship and poor quality materials and latterly by poorly executed or inadequate maintenance.

Local architecture, developed over many years of trial and error, produces buildings linked to their climate and local materials (think of the typical “Queenslander” house). Today’s architecture imports technologies and materials from many different countries and climates – materials that are not necessarily suitable for the location, and that may not be subject to the same quality control over production. Inappropriate use and inadequate understanding of new materials and techniques can lead to further defects.

Whilst the building code contains provisions for initial-build material quality and workmanship, there is no continuing control over a house throughout its life span. Reliance is placed on advertising the need, for example, to employ qualified tradespeople, to replace batteries in smoke detectors, and to follow other good advice that helps maintain housing to a minimum standard. Is this sufficient?

Mechanisms for transferring knowledge to those who need to use it – be they the workforce or the homeowner – need to be improved. Should the building code be more visual and accessible in its content? Should the building code include provisions for maintenance? Should the building code require every house to have a “user’s manual” – much like a car? An extensive review of the literature identifies the scale of the problem of poor quality housing and highlights some suggested causes – inadequate knowledge of the BCA among general housebuilders being one. However, little work has been done to investigate what could improve the situation. This work suggests that improvements to knowledge transfer would improve the quality of housing, and a model of the knowledge transfer process is proposed, identifying the areas where knowledge flows need to occur in order to benefit both the builders and users of housing.

Relevance: 30.00%

Abstract:

The next-generation SONET metro network is evolving into a service-rich infrastructure. At the edge of such a network, multi-service provisioning platforms (MSPPs) provide efficient data mapping enabled by Generic Framing Procedure (GFP) and Virtual Concatenation (VC). The core of the network tends to be a meshed architecture equipped with Multi-Service Switches (MSSs). In the context of these emerging technologies, we propose a load-balancing spare capacity reallocation approach to improve network utilization in next-generation SONET metro networks. Using our approach, carriers can postpone network upgrades, resulting in increased revenue with reduced capital expenditures (CAPEX). For the first time, we consider the spare capacity reallocation problem from a capacity upgrade and network planning perspective. Our approach can operate in the context of shared-path protection (with backup multiplexing) because it reallocates spare capacity without disrupting working services. Unlike previous spare capacity reallocation approaches, which aim at minimizing total spare capacity, our load-balancing approach minimizes the network load vector (NLV), a novel metric that reflects the network load distribution. Because NLV takes into consideration both uniform and non-uniform link capacity distributions, our approach can benefit both uniform and non-uniform networks. We develop a greedy load-balancing spare capacity reallocation (GLB-SCR) heuristic algorithm to implement this approach. Our experimental results show that GLB-SCR outperforms a previously proposed algorithm (SSR) in terms of established connection capacity and total network capacity in both uniform and non-uniform networks.
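The load-balancing objective can be illustrated with a hedged toy in the spirit of a greedy reallocation: repeatedly move one unit of reroutable spare capacity from the link with the highest load ratio to the link with the lowest, until no move lowers the peak ratio. The link capacities and loads are made-up numbers; the real GLB-SCR algorithm reroutes whole backup paths subject to shared-path protection constraints.

```python
# Greedy unit-by-unit load balancing across links with heterogeneous capacity.

def balance(load, cap, max_moves=100):
    load = list(load)                      # work on a copy
    for _ in range(max_moves):
        ratios = [l / c for l, c in zip(load, cap)]
        hi = max(range(len(load)), key=lambda i: ratios[i])
        lo = min(range(len(load)), key=lambda i: ratios[i])
        # stop when moving a unit would no longer reduce the peak ratio
        if (load[lo] + 1) / cap[lo] >= ratios[hi]:
            break
        load[hi] -= 1
        load[lo] += 1
    return load

cap  = [10, 10, 20]    # link capacities (assumed)
load = [9, 2, 4]       # spare-capacity load per link (assumed)
out = balance(load, cap)
```

The total load is conserved (nothing is dropped), while the maximum load ratio falls sharply, which is the flavour of improvement the NLV metric is designed to capture on non-uniform capacities.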

Relevance: 30.00%

Abstract:

A fundamental gap in the current understanding of collapsed structures in the universe concerns the thermodynamical evolution of the ordinary, baryonic component. Unopposed radiative cooling of plasma would lead to the cooling catastrophe, a massive inflow of condensing gas toward the centre of galaxies, groups and clusters. The last generation of multiwavelength observations has radically changed our view on baryons, suggesting that the heating linked to the active galactic nucleus (AGN) may be the balancing counterpart of cooling. In this Thesis, I investigate the engine of the heating regulated by the central black hole. I argue that mechanical feedback, based on massive subrelativistic outflows, is the key to solving the cooling flow problem, i.e. dramatically quenching the cooling rates for several billion years without destroying the cool-core structure. Using an upgraded version of the parallel 3D hydrodynamic code FLASH, I show that anisotropic AGN outflows can further reproduce fundamental observed features, such as buoyant bubbles, cocoon shocks, sonic ripples, metals dredge-up, and subsonic turbulence. The latter is an essential ingredient to drive nonlinear thermal instabilities, which cause cold gas condensation, a residual of the quenched cooling flow and, later, fuel for the AGN feedback engine. The self-regulated outflows are systematically tested on the scales of massive clusters, groups and isolated elliptical galaxies: in lighter, less bound objects the feedback needs to be gentler and less efficient, in order to avoid drastic overheating. In this Thesis, I describe in depth the complex hydrodynamics, involving the coupling of the feedback energy to that of the surrounding hot medium. Finally, I present the merits and flaws of all the proposed models, with a critical eye toward observational concordance.

Relevance: 30.00%

Abstract:

Genetic adaptation to different environmental conditions is expected to lead to large differences between populations at selected loci, thus providing a signature of positive selection. Balancing selection, by contrast, can maintain polymorphisms over long evolutionary periods and even across geographic scales, and thus leads to low levels of divergence between populations at selected loci. However, little is known about the relative importance of these two selective forces in shaping genomic diversity, partly due to difficulties in recognizing balancing selection in species showing low levels of differentiation. Here we address this problem by studying genomic diversity in the European common vole (Microtus arvalis), which presents high levels of differentiation between populations (average FST = 0.31). We studied 3,839 Amplified Fragment Length Polymorphism (AFLP) markers genotyped in 444 individuals from 21 populations distributed across the European continent and hence over different environmental conditions. Our statistical approach to detecting markers under selection is based on a Bayesian method specifically developed for AFLP markers, which treats AFLPs as a nearly codominant marker system and therefore has increased power to detect selection. The high number of screened populations allowed us to detect the signature of balancing selection across a large geographic area. We detected 33 markers potentially under balancing selection, providing evidence of this force acting in the 21 populations across Europe. However, our analyses identified four times as many markers (138) under positive selection, and geographical patterns suggest that some of these markers are probably associated with alpine regions, which seem to have environmental conditions that favour adaptation.
We conclude that, despite conditions in this study favourable to the detection of balancing selection, this evolutionary force seems to play a relatively minor role in shaping the genomic diversity of the common vole, which is more influenced by positive selection and neutral processes such as drift and demographic history.

Relevance: 30.00%

Abstract:

The capital structure and regulation of financial intermediaries is an important topic for practitioners, regulators and academic researchers. In general, theory predicts that firms choose their capital structures by balancing the benefits of debt (e.g., tax and agency benefits) against its costs (e.g., bankruptcy costs). However, when traditional corporate finance models have been applied to insured financial institutions, the results have generally predicted corner solutions (all equity or all debt) to the capital structure problem. This paper studies the impact and interaction of deposit insurance, capital requirements and tax benefits on a bank's choice of optimal capital structure. Using a contingent claims model to value the firm and its associated claims, we find that there exists an interior optimal capital ratio in the presence of deposit insurance, taxes and a minimum fixed capital standard. Banks voluntarily choose to maintain capital in excess of the minimum required in order to balance the risks of insolvency (especially the loss of future tax benefits) against the benefits of additional debt. Because we derive a closed-form solution, our model provides useful insights on several current policy debates, including revisions to the regulatory framework for GSEs, tax policy in general, and the tax exemption for credit unions.
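The existence of an interior optimum can be illustrated with a stylized numeric sketch; this is not the paper's contingent-claims model, and every functional form and constant below is assumed for illustration. Firm value as a function of the capital ratio k trades off the tax shield on debt (decreasing in k) against the expected loss of future tax benefits in insolvency (whose probability decreases in k).

```python
# Stylized trade-off: tax shield on debt vs. expected loss of future tax
# benefits on insolvency. All parameters are illustrative assumptions.
import math

TAX_SHIELD = 0.30      # tax benefit per unit of debt (assumed)
FUTURE_BEN = 2.0       # value of future tax benefits lost on insolvency (assumed)
K_MIN = 0.04           # regulatory minimum capital ratio (assumed)

def firm_value(k):
    debt = 1.0 - k
    p_insolvency = math.exp(-12.0 * k)   # assumed: more capital, safer bank
    return TAX_SHIELD * debt - p_insolvency * FUTURE_BEN

# grid search over admissible capital ratios above the regulatory minimum
ks = [K_MIN + i * (1.0 - K_MIN) / 10000 for i in range(10001)]
k_star = max(ks, key=firm_value)
```

With these assumed parameters the optimum lies strictly between the regulatory minimum and all-equity, echoing the paper's finding that banks voluntarily hold capital above the required minimum.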
