785 results for Asymptotically Good Tower


Relevance:

20.00%

Publisher:

Abstract:

Murphy, L., Lewandowski, G., McCauley, R., Simon, B., Thomas, L., and Zander, C. 2008. Debugging: the good, the bad, and the quirky -- a qualitative analysis of novices' strategies. SIGCSE Bull. 40, 1 (Feb. 2008), 163-167.

Relevance:

20.00%

Publisher:

Abstract:

Breen Smyth, M. and Morrisey, M., Northern Ireland After the Good Friday Agreement: Victims, Grievance and Blame (Pluto Press, 2002), pp. xiii+247. RAE2008

Relevance:

20.00%

Publisher:

Abstract:

Mavron, Vassili; McDonough, T.P.; Shrikhande, M.S. (2003) 'Quasi-symmetric designs with good blocks and intersection number one', Designs, Codes and Cryptography 28(2), pp. 147-162. RAE2008

Relevance:

20.00%

Publisher:

Abstract:

A Research Report from the "Organizing Religious Work Project," Hartford Institute for Religion Research Hartford Seminary

Relevance:

20.00%

Publisher:

Abstract:

http://www.archive.org/details/adayofgoodtiding00keenuoft

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the power of genetic algorithms at solving the MAX-CLIQUE problem. We measure the performance of a standard genetic algorithm on an elementary set of problem instances consisting of embedded cliques in random graphs. We indicate the need for improvement, and introduce a new genetic algorithm, the multi-phase annealed GA, which exhibits superior performance on the same problem set. As we scale up the problem size and test on "hard" benchmark instances, we notice a degraded performance in the algorithm caused by premature convergence to local minima. To alleviate this problem, a sequence of modifications is implemented, ranging from changes in input representation to systematic local search. The most recent version, called the union GA, incorporates the features of union cross-over, greedy replacement, and diversity enhancement. It shows a marked speed-up in the number of iterations required to find a given solution, as well as some improvement in the clique size found. We discuss issues related to the SIMD implementation of the genetic algorithms on a Thinking Machines CM-5, which was necessitated by the intrinsically high time complexity (O(n^3)) of the serial algorithm for computing one iteration. Our preliminary conclusions are: (1) a genetic algorithm needs to be heavily customized to work "well" for the clique problem; (2) a GA is computationally very expensive, and its use is only recommended if it is known to find larger cliques than other algorithms; (3) although our customization effort is bringing forth continued improvements, there is no clear evidence, at this time, that a GA will have better success in circumventing local minima.
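The multi-phase annealed GA and union GA described above are specific to the paper and not reproduced here. As a minimal sketch of the baseline idea only, the following shows a standard GA for MAX-CLIQUE with a greedy repair step; the repair heuristic, operators, and all parameter values are illustrative assumptions, not the paper's method.

```python
import random

def greedy_repair(graph, candidate):
    """Drop vertices until the candidate set is a clique.
    graph: dict mapping each vertex to its set of neighbours."""
    clique = set(candidate)
    while True:
        bad = [v for v in clique
               if any(u != v and u not in graph[v] for u in clique)]
        if not bad:
            return clique
        # drop the offender with the fewest neighbours inside the set
        clique.remove(min(bad, key=lambda v: len(clique & graph[v])))

def ga_max_clique(graph, pop_size=30, generations=50, mutation=0.05):
    """Toy GA: fitness is the size of the greedily repaired clique."""
    verts = list(graph)
    pop = [set(random.sample(verts, random.randint(1, len(verts))))
           for _ in range(pop_size)]
    best = set()
    for _ in range(generations):
        repaired = sorted((greedy_repair(graph, ind) for ind in pop),
                          key=len, reverse=True)
        best = max(best, repaired[0], key=len)
        parents = repaired[:max(2, pop_size // 2)]
        pop = []
        while len(pop) < pop_size:
            a, b = random.sample(parents, 2)
            # union-style crossover: keep the intersection, coin-flip the rest
            child = (a & b) | {v for v in a ^ b if random.random() < 0.5}
            # mutation: flip membership of a few random vertices
            child ^= {v for v in verts if random.random() < mutation}
            pop.append(child)
    return best
```

The repair step doubles as the fitness function, so every individual evaluated is a genuine clique; this is one common way to handle the constraint, not necessarily the paper's.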

Relevance:

20.00%

Publisher:

Abstract:

Fast forward error correction codes are becoming an important component in bulk content delivery. They fit in naturally with multicast scenarios as a way to deal with losses and are now seeing use in peer-to-peer networks as a basis for distributing load. In particular, new irregular sparse parity check codes have been developed with provable average linear time performance, a significant improvement over previous codes. In this paper, we present a new heuristic for generating codes with similar performance based on observing a server with an oracle for client state. This heuristic is easy to implement and provides further intuition into the need for an irregular heavy tailed distribution.
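The oracle-based heuristic itself is not reproduced in this listing. As a concrete reference point for what an "irregular heavy tailed distribution" of symbol degrees looks like, the sketch below samples from the ideal soliton distribution used by LT fountain codes; the choice of this particular distribution is an illustrative assumption, not the paper's construction.

```python
import random

def ideal_soliton(k):
    """Ideal soliton distribution over degrees 1..k: heavy-tailed, with
    P(d) = 1/(d(d-1)) for d >= 2 and P(1) = 1/k. Sums to 1 by telescoping."""
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def sample_degree(probs):
    """Inverse-CDF sampling of one encoding-symbol degree."""
    r, acc = random.random(), 0.0
    for d, p in enumerate(probs, start=1):
        acc += p
        if r <= acc:
            return d
    return len(probs)  # guard against floating-point round-off
```

Note the irregularity: degree 2 alone carries half the probability mass, while the 1/(d(d-1)) tail still produces occasional high-degree symbols that keep the decoding process alive.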

Relevance:

20.00%

Publisher:

Abstract:

A common design of an object recognition system has two steps, a detection step followed by a foreground within-class classification step. For example, consider face detection by a boosted cascade of detectors followed by face ID recognition via one-vs-all (OVA) classifiers. Another example is human detection followed by pose recognition. Although the detection step can be quite fast, the foreground within-class classification process can be slow and becomes a bottleneck. In this work, we formulate a filter-and-refine scheme, where the binary outputs of the weak classifiers in a boosted detector are used to identify a small number of candidate foreground state hypotheses quickly via Hamming distance or weighted Hamming distance. The approach is evaluated in three applications: face recognition on the FRGC V2 data set, hand shape detection and parameter estimation on a hand data set, and vehicle detection and view angle estimation on a multi-view vehicle data set. On all data sets, our approach has comparable accuracy and is at least five times faster than the brute force approach.
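A minimal sketch of the filter step described above, with invented data: the binary outputs of the weak classifiers are packed into integers, and candidates are ranked by (unweighted) Hamming distance before any expensive refinement runs.

```python
def hamming(a, b):
    """Hamming distance between two bit-packed binary codes."""
    return bin(a ^ b).count("1")

def filter_candidates(query_code, db_codes, k=5):
    """Filter step: keep the k database entries whose weak-classifier
    bit patterns are nearest to the query's in Hamming distance.
    A slower, more accurate classifier would then refine only these."""
    ranked = sorted(range(len(db_codes)),
                    key=lambda i: hamming(query_code, db_codes[i]))
    return ranked[:k]
```

The speed-up comes from XOR-and-popcount being far cheaper than evaluating a bank of one-vs-all classifiers; the weighted variant mentioned in the abstract would multiply each differing bit by a learned weight instead of counting them equally.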

Relevance:

20.00%

Publisher:

Abstract:

The present work is a study of the Middle English prose text known as The Wise Book of Philosophy and Astronomy, a consideration of its transmission and reception history, and a survey of its manuscript witnesses; it also incorporates an edition of the text from two of its manuscripts. The text is a cosmological treatise of approximately five thousand words, written for the most part in English, with astronomical and astrological terms in Latin, though the English translation is frequently given. It is written anonymously, and survives in thirty-three manuscripts.

Relevance:

20.00%

Publisher:

Abstract:

The central research question that this thesis addresses is whether there is a significant gap between fishery stakeholder values and the principles and policy goals implicit in an Ecosystem Approach to Fisheries Management (EAFM). The implications of such a gap for fisheries governance are explored. Furthermore, an assessment is made of what may be practically achievable in the implementation of an EAFM in fisheries in general and in a case study fishery in particular. The research was mainly focused on a particular case study, the Celtic Sea Herring fishery and its management committee, the Celtic Sea Herring Management Advisory Committee (CSHMAC). The Celtic Sea Herring fishery exhibits many aspects of an EAFM and the fish stock has successfully recovered to healthy levels in the past 5 years. However, there are increasing levels of governance-related conflict within the fishery which threaten the future sustainability of the stock. Previous research on EAFM governance has tended to focus either on higher levels of EAFM governance or on individual behaviour, but very little research has attempted to link the two spheres or explore the relationship between them. Two main themes within this study aimed to address this gap. The first was what role governance could play in facilitating EAFM implementation. The second theme concerned the degree of convergence between high-level EAFM goals and stakeholder values. The first method applied was governance benchmarking to analyse systemic risks to EAFM implementation. This found that there are no real EU or national level policies which provide stakeholders or managers with clear targets for EAFM implementation. The second method applied was the use of cognitive mapping to explore stakeholders' understandings of the main ecological, economic and institutional driving forces in the Celtic Sea Herring fishery.
The main finding from this was that a long-term outlook can be, and has been, incentivised through a combination of policy drivers and participatory management. However, the fundamental principle of an EAFM, accounting for ecosystem linkages rather than target stocks alone, was not reflected in stakeholders' cognitive maps. This was confirmed in a prioritisation of stakeholders' management priorities using the Analytic Hierarchy Process, which found that the overriding concern is for protection of target stock status but that wider ecosystem health was not a priority for most management participants. The conclusion reached is that moving to sustainable fisheries may be a more complex process than envisioned in much of the literature and may consist of two phases. The first phase is a transition to a long-term but still target-stock-focused approach. This achievable transition is mainly a strategic change, which can be incentivised by policies and supported by stakeholders. In the Celtic Sea Herring fishery, and an increasing number of global and European fisheries, such transitions have contributed to successful stock recoveries. The second phase, however, implementation of an ecosystem approach, may present a greater challenge in terms of governability, as this research highlights some fundamental conflicts between stakeholder perceptions and values and those inherent in an EAFM. This phase may involve the setting aside of fish for non-valued ecosystem elements and will require either a pronounced mind-set and value change or some strong top-down policy incentives in order to succeed. Fisheries governance frameworks will need to carefully explore the most effective balance between such endogenous and exogenous solutions. This finding of low prioritisation of wider ecosystem elements has implications for rights-based management within an ecosystem approach, regardless of whether those rights are individual or collective.
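The thesis's actual survey data is not shown here. As a generic sketch of how the Analytic Hierarchy Process turns pairwise comparisons into priority weights, the following uses the standard geometric-mean approximation to AHP's principal-eigenvector method; the two criteria and the judgment value 3 are hypothetical.

```python
import math

def ahp_weights(matrix):
    """Priority weights from a pairwise-comparison matrix using the
    geometric-mean (log least squares) approximation to AHP."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgment: a stakeholder rates target-stock status
# three times as important as wider ecosystem health.
pairwise = [[1.0, 3.0],
            [1.0 / 3.0, 1.0]]
weights = ahp_weights(pairwise)  # target stock vs. ecosystem health
```

With this judgment the target stock receives weight 0.75 against 0.25 for ecosystem health, the kind of skew the prioritisation exercise above reports.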

Relevance:

20.00%

Publisher:

Abstract:

The music of women composers often comprises only a small percentage of flutists' repertoire, yet there are actually many active women composers, many of whom have written for the flute. The aim of this dissertation is to chronicle a selection of works by several American women composers that have contributed to accessible flute repertoire. For the purpose of this dissertation, accessibility is described by the following parameters: works that limit the use of extended techniques, works that are suitable for performers from high school through a reasonably advanced level, works that are likely to elicit emotionally musical communication from the performer to the listener, and works that are reasonably available through music stores or outlets on the Internet that have a fairly comprehensive reach to the general public. My subjective judgment also played a role in the final selection of the 25 works included as part of this dissertation and performed on three musically well-balanced recitals. A variety of resources were consulted for the repertoire, including Boenke's Flute Music by Women Composers: An Annotated Catalog, and the catalogs of publishers such as Arsis Press and Hildegard Publishing, both of which specialize in the music of women composers. The works performed and discussed are the following: Adrienne Albert – Sunswept; Marion Bauer – Prelude and Fugue, Op. 43; Marilyn Bliss – Lament; Ann Callaway – Updraft; Ruth Crawford – Diaphonic Suite; Emma Lou Diemer – Sonata; Vivian Fine – Emily's Images; Cynthia Folio – Arca Sacra; Nancy Galbraith – Atacama; Lita Grier – Sonata; Jennifer Higdon – The Jeffrey Mode; Edie Hill – This Floating World; Katherine Hoover – Masks; Mary Howe – Interlude between Two Pieces; Laura Kaminsky – Duo; Libby Larsen – Aubade; Alex Shapiro – Shiny Kiss; Judith Shatin – Coursing Through the Still Green; Faye-Ellen Silverman – Taming the Furies; Augusta Read Thomas – Euterpe's Caprice; Joan Tower – Valentine Trills; Ludmila Ulehla – Capriccio; Elizabeth Vercoe – Kleemation; Gwyneth Walker – Sonata; and Judith Lang Zaimont – 'Bubble-Up' Rag. All of these works are worthy alternatives to the more frequently played flute repertoire, and they serve as a good starting point for anyone interested in exploring the works of women composers.

Relevance:

20.00%

Publisher:

Abstract:

Our media is saturated with claims of "facts" made from data. Database research has in the past focused on how to answer queries, but has not devoted much attention to discerning more subtle qualities of the resulting claims, e.g., is a claim "cherry-picking"? This paper proposes a Query Response Surface (QRS) based framework that models claims based on structured data as parameterized queries. A key insight is that we can learn a lot about a claim by perturbing its parameters and seeing how its conclusion changes. This framework lets us formulate and tackle practical fact-checking tasks --- reverse-engineering vague claims, and countering questionable claims --- as computational problems. Within the QRS based framework, we take one step further and propose a problem, along with efficient algorithms, for finding high-quality claims of a given form from data, i.e. raising good questions, in the first place. This is achieved by using a limited number of high-valued claims to represent high-valued regions of the QRS. Besides the general-purpose high-quality claim finding problem, lead-finding can be tailored towards specific claim quality measures, also defined within the QRS framework. An example of uniqueness-based lead-finding is presented for "one-of-the-few" claims, yielding interpretable high-quality claims and an adjustable mechanism for ranking objects, e.g. NBA players, based on what claims can be made for them. Finally, we study the use of visualization as a powerful way of conveying the results of a large number of claims. An efficient two-stage sampling algorithm is proposed for generating the input of a 2D scatter plot with heatmap, evaluating a limited amount of data while preserving the two essential visual features, namely outliers and clusters. For all the problems, we present real-world examples and experiments that demonstrate the power of our model, the efficiency of our algorithms, and the usefulness of their results.
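As a toy illustration of the key insight above (perturb a claim's parameters and watch its conclusion move), the sketch below parameterizes a "windowed average" claim and evaluates its neighbourhood on the response surface. The data and claim form are invented for illustration, not taken from the paper.

```python
def claim(data, start, end):
    """Parameterized claim: the average of data over the window [start, end)."""
    window = data[start:end]
    return sum(window) / len(window)

def perturb(data, start, end, radius=1):
    """Evaluate the claim at all parameter settings within `radius`
    of (start, end), mapping out the local query response surface."""
    surface = {}
    for ds in range(-radius, radius + 1):
        for de in range(-radius, radius + 1):
            s, e = start + ds, end + de
            if 0 <= s < e <= len(data):
                surface[(s, e)] = claim(data, s, e)
    return surface
```

A cherry-picked claim shows up immediately: if the value at the claimed window sits far above every perturbed neighbour on the surface, the claim is fragile.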

Relevance:

20.00%

Publisher:

Abstract:

Office-based percutaneous revision of a testicular prosthesis has never been reported. A patient received a testicular prosthesis but was dissatisfied with the firmness of the implant. In an office setting, the prosthesis was inflated with additional fluid via a percutaneous approach. Evaluated outcomes included patient satisfaction, prosthesis size, recovery time, and cost savings. The patient was satisfied, with no infection, leak, or complication after more than 1 year of follow-up, at significantly less cost than revision surgery. Percutaneous adjustment of testicular prosthesis fill-volume can be safe, inexpensive, and result in good patient satisfaction.

Relevance:

20.00%

Publisher:

Abstract:

This paper concerns a preliminary numerical simulation study of the evacuation of the World Trade Centre North Tower on 11 September 2001 using the buildingEXODUS evacuation simulation software. The analysis makes use of response time data derived from a study of survivor accounts appearing in the public domain. While exact geometric details of the building were not available for this study, the building geometry was approximated from descriptions available in the public domain. The study attempts to reproduce the events of 11 September 2001 and pursue several 'what if' questions concerning the evacuation. In particular, the study explores the likely outcome had a single staircase survived intact from top to bottom.
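The agent-based buildingEXODUS model is far richer than any hand calculation. As a back-of-envelope sketch of why staircase count dominates such "what if" questions, a simple hydraulic estimate follows; every number in it is an illustrative assumption, not a figure from the study.

```python
def evacuation_estimate_minutes(occupants, staircases, flow_per_stair_per_min,
                                mean_response_min=2.0):
    """Coarse hydraulic estimate: mean pre-movement (response) time plus
    the queueing time for all occupants to pass through the staircases.
    Ignores congestion dynamics, fatigue and route choice, which
    agent-based tools such as buildingEXODUS do model."""
    return mean_response_min + occupants / (staircases * flow_per_stair_per_min)

# Illustrative comparison: three staircases versus a single surviving one,
# with assumed occupancy and per-stair flow values.
three_stairs = evacuation_estimate_minutes(5000, 3, 40)
one_stair = evacuation_estimate_minutes(5000, 1, 40)
```

Even this crude model makes the point: total egress time scales roughly inversely with the number of usable staircases, which is why the single-surviving-staircase scenario matters so much.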