889 results for minimum coverage requirement


Relevance:

20.00%

Abstract:

We present a new penalty-based genetic algorithm for the multi-source and multi-sink minimum vertex cut problem, and illustrate the algorithm’s usefulness with two real-world applications. It is proved in this paper that the genetic algorithm always produces a feasible solution by exploiting some domain-specific knowledge. The genetic algorithm has been implemented on the example applications and evaluated to show how well it scales as the problem size increases.
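
To make the penalty idea concrete, here is a minimal sketch of how a candidate cut might be scored in such a genetic algorithm: the candidate is a set of removed vertices, and the fitness (to be minimised) is the cut size plus a penalty for every sink still reachable from some source. The adjacency-list representation, the BFS connectivity check and the penalty weight are illustrative assumptions, not details taken from the paper.

```python
from collections import deque

def penalised_cut_fitness(adj, sources, sinks, removed, penalty=1000):
    """Score a candidate multi-source, multi-sink vertex cut (lower is better).

    adj     : dict mapping each vertex to an iterable of its neighbours
    removed : set of vertices proposed for removal (the candidate cut)
    penalty : weight applied to each sink still reachable from some source
    """
    blocked = set(removed)
    start = [s for s in sources if s not in blocked]
    reachable = set(start)
    queue = deque(start)
    while queue:                      # BFS over the graph with cut vertices deleted
        v = queue.popleft()
        for w in adj[v]:
            if w not in blocked and w not in reachable:
                reachable.add(w)
                queue.append(w)
    violations = sum(1 for t in sinks if t in reachable)
    return len(removed) + penalty * violations
```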

Relevance:

20.00%

Abstract:

Here we search for evidence of the existence of a sub-chondritic 142Nd/144Nd reservoir that balances the Nd isotope chemistry of the Earth relative to chondrites. If present, it may reside in the source region of deeply sourced mantle plume material. We suggest that lavas from Hawai’i with coupled elevations in 186Os/188Os and 187Os/188Os, from Iceland that represent mixing of upper mantle and lower mantle components, and from Gough with sub-chondritic 143Nd/144Nd and high 207Pb/206Pb, are favorable samples that could reflect mantle sources that have interacted with an Early-Enriched Reservoir (EER) with sub-chondritic 142Nd/144Nd. High-precision Nd isotope analyses of basalts from Hawai’i, Iceland and Gough demonstrate no discernible 142Nd/144Nd deviation from terrestrial standards. These data are consistent with previous high-precision Nd isotope analysis of recent mantle-derived samples and demonstrate that no mantle-derived material to date provides evidence for the existence of an EER in the mantle. We then evaluate mass balance in the Earth with respect to both 142Nd/144Nd and 143Nd/144Nd. The Nd isotope systematics of EERs are modeled for different sizes and timing of formation relative to ε143Nd estimates of the reservoirs in the μ142Nd = 0 Earth, where μ142Nd is ((measured 142Nd/144Nd / terrestrial standard 142Nd/144Nd) − 1) × 10^6, i.e. the parts-per-million deviation from the standard, and the μ142Nd = 0 Earth is the proportion of the silicate Earth with 142Nd/144Nd indistinguishable from the terrestrial standard. The models indicate that it is not possible to balance the Earth with respect to both 142Nd/144Nd and 143Nd/144Nd unless the μ142Nd = 0 Earth has an ε143Nd within error of the present-day Depleted Mid-ocean ridge basalt Mantle source (DMM). The 4567 Myr age 142Nd–143Nd isochron for the Earth intersects μ142Nd = 0 at ε143Nd of +8 ± 2, providing a minimum ε143Nd for the μ142Nd = 0 Earth. The high ε143Nd of the μ142Nd = 0 Earth is confirmed by the Nd isotope systematics of Archean mantle-derived rocks that consistently have positive ε143Nd. If the EER formed early after solar system formation (0–70 Ma), continental crust and DMM can be complementary reservoirs with respect to Nd isotopes, with no requirement for significant additional reservoirs. If the EER formed after 70 Ma, then the μ142Nd = 0 Earth must have a bulk ε143Nd more radiogenic than DMM, and additional high ε143Nd material is required to balance the Nd isotope systematics of the Earth.
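
As a worked illustration of the μ142Nd notation defined above, the snippet below computes the parts-per-million deviation of a measured 142Nd/144Nd ratio from the terrestrial standard; the numerical values in the example are invented for illustration only.

```python
def mu_142nd(measured_ratio, standard_ratio):
    """Parts-per-million deviation of a measured 142Nd/144Nd ratio
    from the terrestrial standard (the mu-142Nd notation)."""
    return (measured_ratio / standard_ratio - 1.0) * 1e6

# Illustrative values only: a sample indistinguishable from the standard.
print(mu_142nd(1.1418379, 1.1418378))   # ~0.09 ppm, i.e. mu-142Nd ~ 0
```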

Relevance:

20.00%

Abstract:

This paper describes the protection and control of a microgrid with converter-interfaced micro sources. The proposed protection and control scheme considers both grid-connected and autonomous operation of the microgrid. A protection scheme capable of detecting faults effectively in both grid-connected and islanded operation is proposed. The main challenge for protection, the current-limiting behaviour of the converters, is overcome by using admittance relays. The relays operate according to an inverse-time characteristic based on the measured admittance of the line. The proposed scheme isolates the fault from both sides, while the downstream side of the microgrid operates in islanded condition. Moreover, faults can also be detected in autonomous operation. In grid-connected mode, distributed generators (DGs) supply their rated power, while in the absence of the grid, DGs share the entire power requirement in proportion to their ratings, based on output voltage angle droop control. The protection scheme ensures minimum load shedding while isolating the faulted network, and the DG control provides smooth islanding and resynchronization operation. The efficacy of the coordinated control and protection scheme has been validated through simulation for various operating conditions.
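
As a rough illustration of an inverse-time characteristic driven by measured admittance, the sketch below mirrors the shape of an IEC inverse-time overcurrent curve with admittance in place of current; the pickup value, time multiplier and curve constants are illustrative assumptions, not the relay settings used in the paper.

```python
def admittance_relay_trip_time(y_measured, y_pickup, tms=0.1, k=0.14, alpha=0.02):
    """Operating time of a hypothetical inverse-time admittance relay.

    y_measured : admittance measured on the protected line (siemens)
    y_pickup   : pickup admittance above which the relay starts to operate
    tms        : time multiplier setting
    k, alpha   : curve constants (borrowed from the IEC standard-inverse
                 overcurrent curve purely for illustration)
    Returns the trip time in seconds, or None when the relay restrains.
    """
    ratio = y_measured / y_pickup
    if ratio <= 1.0:
        return None                   # healthy condition: no trip
    return tms * k / (ratio ** alpha - 1.0)

# A close-in fault raises the measured admittance, so the relay trips faster.
print(admittance_relay_trip_time(5.0, 1.0))    # shorter operating time
print(admittance_relay_trip_time(1.5, 1.0))    # longer operating time
```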

Relevance:

20.00%

Abstract:

In this paper, we present a finite sample analysis of the sample minimum-variance frontier under the assumption that the returns are independent and multivariate normally distributed. We show that the sample minimum-variance frontier is a highly biased estimator of the population frontier, and we propose an improved estimator of the population frontier. In addition, we provide the exact distribution of the out-of-sample mean and variance of sample minimum-variance portfolios. This allows us to understand the impact of estimation error on the performance of in-sample optimal portfolios. Key Words: minimum-variance frontier; efficiency set constants; finite sample distribution
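
For readers who want to reproduce the object being studied, the sketch below computes the plug-in sample minimum-variance frontier from a T x N matrix of returns via the standard efficiency-set constants; this is the ordinary (biased) estimator discussed in the abstract, not the paper's improved estimator, and all names are illustrative.

```python
import numpy as np

def sample_minimum_variance_frontier(returns, target_means):
    """Plug-in sample frontier: variance of the minimum-variance portfolio
    achieving each target mean, from the efficiency-set constants A, B, C, D.

    returns      : T x N array of asset returns
    target_means : array of target portfolio means
    """
    mu = returns.mean(axis=0)                        # sample mean vector
    sigma_inv = np.linalg.inv(np.cov(returns, rowvar=False))
    ones = np.ones(len(mu))
    a = ones @ sigma_inv @ mu                        # efficiency set constants
    b = mu @ sigma_inv @ mu
    c = ones @ sigma_inv @ ones
    d = b * c - a ** 2
    m = np.asarray(target_means)
    return (c * m ** 2 - 2.0 * a * m + b) / d        # frontier variance at each m

# Example with simulated normal returns (illustrative only).
rng = np.random.default_rng(0)
r = rng.normal(0.01, 0.05, size=(120, 5))
print(sample_minimum_variance_frontier(r, [0.005, 0.010, 0.015]))
```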

Relevance:

20.00%

Abstract:

This paper examines the patterns of television news coverage of the political parties, their leaders and the issues they raised during the 2001 Australian federal election campaign. Television has long been argued to constrain voters' evaluations by focusing on some issues, parties and leaders at the expense of others. We find that television news coverage in the 2001 Australian election campaign focused primarily on international issues, especially terrorism and asylum seekers, and on the two major parties - virtually to the exclusion of coverage of the minor parties and their leaders. Within the major party 'two-horse race', television gave substantially more coverage to the leaders than to the parties themselves, thereby sustaining what some have called a 'presidential'-style political contest. John Howard emerged as the winner in the leaders' stakes, garnering more coverage than Labor's Kim Beazley.

Relevance:

20.00%

Abstract:

Background: The preservation of meniscal tissue is important to protect joint surfaces. Purpose: We take an aggressive approach to meniscal repair, including repairing tears other than those classically suited to repair. Here we present the medium- to long-term outcome of meniscal repair (inside-out) in elite athletes. Study Design: Case series; Level of evidence, 4. Methods: Forty-two elite athletes underwent 45 meniscal repairs. All repairs were performed using an arthroscopically assisted inside-out technique. Eighty-three percent of these athletes had ACL reconstruction at the same time. Patients returned a completed questionnaire (including Lysholm and International Knee Documentation Committee [IKDC] scores). Mean follow-up was 8.5 years. Failure was defined as the development of joint line pain and/or locking or swelling requiring repeat arthroscopy and partial meniscectomy. Results: The average Lysholm and subjective IKDC scores were 89.6 and 85.4, respectively. Eighty-one percent of patients returned to their main sport, most to a similar level, at a mean of 10.4 months after repair, reflecting the high rate of ACL reconstruction in this group. We identified 11 definite failures (10 medial and 1 lateral meniscus) that required excision, representing a 24% failure rate. We identified 1 further patient with a possible failed repair, giving a worst-case failure rate of 26.7% at a mean of 42 months after surgery. However, 7 of these failures were associated with a further injury; the atraumatic failure rate was therefore 11%. Age, tear size and tear location were not associated with a higher failure rate. Medial meniscal repairs were significantly more likely to fail than lateral meniscal repairs, with failure rates of 36.4% and 5.6%, respectively (P < .05). Conclusion: Meniscal repair and healing are possible, and most elite athletes can return to their preinjury level of activity.

Relevance:

20.00%

Abstract:

This paper investigates a wireless sensor network deployment - monitoring water quality, e.g. salinity and the level of the underground water table - in a remote tropical area of northern Australia. Our goal is to collect real-time water quality measurements, together with the amount of water being pumped out in the area, and to investigate the impacts of current irrigation practice on the environment, in particular underground water salination. This is a challenging task featuring wide geographic area coverage (the mean transmission range between nodes is more than 800 meters), highly variable radio propagation, high end-to-end packet delivery rate requirements, and hostile deployment environments. We have designed, implemented and deployed a sensor network system, which has been collecting water quality and flow measurements, e.g., water flow rate and water flow ticks, for over one month. The preliminary results show that sensor networks are a promising solution for deploying a sustainable irrigation system, e.g., maximizing the amount of water pumped out from an area with minimum impact on water quality.

Relevance:

20.00%

Abstract:

Physical infrastructure assets are important components of our society and our economy. They are usually designed to last for many years, are expected to be heavily used during their lifetime, carry considerable load, and are exposed to the natural environment. They are also normally major structures, and therefore represent a heavy investment, requiring constant management over their life cycle to ensure that they perform as required by their owners and users. Given a complex and varied infrastructure life cycle, constraints on available resources, and continuing requirements for effectiveness and efficiency, good management of infrastructure is important. While there is often no one best management approach, the choice of options is improved by better identification and analysis of the issues, by the ability to prioritise objectives, and by a scientific approach to the analysis process. The abilities to better understand the effect of inputs in the infrastructure life cycle on results, to minimise uncertainty, and to better evaluate the effect of decisions in a complex environment, are important in allocating scarce resources and making sound decisions.

Through the development of an infrastructure management modelling and analysis methodology, this thesis provides a process that assists the infrastructure manager in the analysis, prioritisation and decision making process. This is achieved through the use of practical, relatively simple tools, integrated in a modular flexible framework that aims to provide an understanding of the interactions and issues in the infrastructure management process. The methodology uses a combination of flowcharting and analysis techniques. It first charts the infrastructure management process and its underlying infrastructure life cycle through the time interaction diagram, a graphical flowcharting methodology that is an extension of methodologies for modelling data flows in information systems. This process divides the infrastructure management process over time into self-contained modules that are based on a particular set of activities, the information flows between which are defined by the interfaces and relationships between them. The modular approach also permits more detailed analysis, or aggregation, as the case may be. It also forms the basis of extending the infrastructure modelling and analysis process to infrastructure networks, through using individual infrastructure assets and their related projects as the basis of the network analysis process.

It is recognised that the infrastructure manager is required to meet, and balance, a number of different objectives, and therefore a number of high-level outcome goals for the infrastructure management process have been developed, based on common purpose or measurement scales. These goals form the basis of classifying the larger set of multiple objectives for analysis purposes. A two-stage approach that rationalises then weights objectives, using a paired comparison process, ensures that the objectives required to be met are both kept to the minimum number required and are fairly weighted (an illustrative sketch of such a paired-comparison weighting step is shown below). Qualitative variables are incorporated into the weighting and scoring process, utility functions being proposed where there is risk, or a trade-off situation applies. Variability is considered important in the infrastructure life cycle, the approach used being based on analytical principles but incorporating randomness in variables where required.
The modular design of the process permits alternative processes to be used within particular modules, if this is considered a more appropriate way of analysis, provided boundary conditions and requirements for linkages to other modules are met. Development and use of the methodology has highlighted a number of infrastructure life cycle issues, including data and information aspects, and consequences of change over the life cycle, as well as variability and the other matters discussed above. It has also highlighted the requirement to use judgment where required, and for organisations that own and manage infrastructure to retain intellectual knowledge regarding that infrastructure. It is considered that the methodology discussed in this thesis, which to the author's knowledge has not been developed elsewhere, may be used for the analysis of alternatives, planning, prioritisation of a number of projects, and identification of the principal issues in the infrastructure life cycle.
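
The two-stage weighting of objectives mentioned in the abstract relies on paired comparisons; below is a brief illustrative sketch of one common way to turn a paired-comparison matrix into normalised weights (the principal-eigenvector method used in AHP-style procedures). The thesis's exact procedure may differ, and the example judgements are invented.

```python
import numpy as np

def paired_comparison_weights(pairwise):
    """Derive normalised weights from a reciprocal paired-comparison matrix.

    pairwise[i][j] records how much more important objective i is judged
    to be than objective j, with pairwise[j][i] = 1 / pairwise[i][j].
    Returns weights that sum to 1 (principal eigenvector, normalised).
    """
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    principal = np.abs(np.real(eigvecs[:, np.argmax(eigvals.real)]))
    return principal / principal.sum()

# Invented example: three objectives (cost, safety, service level).
print(paired_comparison_weights([[1, 3, 5],
                                 [1/3, 1, 2],
                                 [1/5, 1/2, 1]]))
```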

Relevance:

20.00%

Abstract:

In the rate-based flow control for ATM Available Bit Rate service, fairness is an important requirement, i.e. each flow should be allocated a fair share of the available bandwidth in the network. Max–min fairness, which is widely adopted in ATM, is appropriate only when the minimum cell rates (MCRs) of the flows are zero or neglected. Generalised max–min (GMM) fairness extends the principle of the max–min fairness to accommodate MCR. In this paper, we will discuss the formulation of the GMM fair rate allocation, propose a centralised algorithm, analyse its bottleneck structure and develop an efficient distributed explicit rate allocation algorithm to achieve the GMM fairness in an ATM network. The study in this paper addresses certain theoretical and practical issues of the GMM fair rate allocation.
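
To make the GMM idea concrete, here is a small centralised sketch under the usual progressive-filling interpretation: every flow starts at its MCR, all still-unbottlenecked flows then grow at the same pace, and a flow freezes when one of its links saturates. The data structures and the assumption that the MCRs are jointly feasible are illustrative; the paper's own centralised and distributed algorithms are more involved.

```python
from collections import defaultdict

def gmm_allocate(routes, capacity, mcr):
    """Centralised generalised max-min (GMM) allocation sketch.

    routes[f]   : iterable of link ids traversed by flow f
    capacity[l] : capacity of link l
    mcr[f]      : minimum cell rate of flow f (MCRs assumed jointly feasible)
    Returns a dict mapping each flow to its allocated rate.
    """
    rate = {f: mcr[f] for f in routes}          # every flow starts at its MCR
    active = set(routes)                        # flows whose rate may still grow
    flows_on = defaultdict(set)
    for f, links in routes.items():
        for l in links:
            flows_on[l].add(f)

    while active:
        # Equal increment that would saturate each link carrying active flows.
        inc_per_link = {}
        for l, fs in flows_on.items():
            growing = fs & active
            if growing:
                spare = capacity[l] - sum(rate[f] for f in fs)
                inc_per_link[l] = spare / len(growing)
        inc = max(min(inc_per_link.values()), 0.0)
        for f in active:
            rate[f] += inc
        # Freeze every flow crossing a link that has just become a bottleneck.
        saturated = {l for l, v in inc_per_link.items() if v <= inc + 1e-12}
        active -= {f for l in saturated for f in flows_on[l]}
    return rate

# Invented example: two links shared by three flows with different MCRs.
routes = {"f1": ["L1"], "f2": ["L1", "L2"], "f3": ["L2"]}
print(gmm_allocate(routes, capacity={"L1": 10.0, "L2": 6.0},
                   mcr={"f1": 1.0, "f2": 2.0, "f3": 0.0}))
```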

Relevance:

20.00%

Abstract:

The focus of this paper is preparing research for dissemination by mainstream print, broadcast, and online media. While the rise of the blogosphere and social media is proving an effective way of reaching niche audiences, my own research reached such an audience through traditional media. The first major study of Australian horror cinema, my PhD thesis A Dark New World: Anatomy of Australian Horror Films, generated strong interest from horror movie fans, film scholars, and filmmakers. I worked closely with the Queensland University of Technology’s (QUT) public relations unit to write two separate media releases, circulated on October 13, 2008 and October 14, 2009. This chapter reflects upon the process of working with the media and provides tips for reaching audiences, particularly in terms of strategically planning outcomes. It delves into the background of my study which would later influence my approach to the media, the process of drafting media releases, and key outcomes and benefits from popularising research. A key lesson from this experience is that redeveloping research for the media requires a sharp writing style, catchy quotes, a willingness to let go of academic justification, and an ability to distil complex details into easy-to-understand concepts. Although my study received strong media coverage, and I have since become a media commentator, my experiences also revealed a number of pitfalls that are likely to arise for other researchers keen on targeting media coverage.

Relevance:

20.00%

Abstract:

Among the many factors that influence enforcement agencies, this article examines the role of the institutional location (and independence) of agencies, and an incumbent government's ideology. It is argued that institutional location affects the level of political influence on the agency's operations, while government ideology affects its willingness to resource enforcement agencies and approve regulatory activities. Evidence from the agency regulating minimum labour standards in the Australian federal industrial relations jurisdiction (currently the Fair Work Ombudsman) highlights two divergences from the regulatory enforcement literature generally. First, notions of independence from political interference offered by institutional location are more illusory than real and, second, political need motivates political action to a greater extent than political ideology.

Relevance:

20.00%

Abstract:

Section 126 of the Land Title Act 1994 (Qld) regulates whether, and if so, when a caveat will lapse. While certain caveats will not lapse due to the operation of s 126(1), if a caveator does not wish a caveat to which the section applies to lapse, the caveator must start a proceeding in a court of competent jurisdiction to establish the interest claimed under the caveat within the time limits specified in, and otherwise comply with the obligations imposed by, s 126(4). The requirement, in s 126(4), to “start a proceeding” was the subject of judicial examination by the Court of Appeal (McMurdo P, Holmes JA and MacKenzie J) in Cousins Securities Pty Ltd v CEC Group Ltd [2007] QCA 192.