985 results for Generating summaries


Relevance: 20.00%

Abstract:

The research reported here concerns the principles used to automatically generate three-dimensional representations from line drawings of scenes. The computer programs involved look at scenes which consist of polyhedra and which may contain shadows and various kinds of coincidentally aligned scene features. Each generated description includes information about edge shape (convex, concave, occluding, shadow, etc.), about the type of illumination for each region (illuminated, projected shadow, or oriented away from the light source), and about the spatial orientation of regions. The methods used are based on the labeling schemes of Huffman and Clowes; this research provides a considerable extension to their work and also gives theoretical explanations for the heuristic scene analysis work of Guzman, Winston, and others.
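
At its core the labeling method is constraint propagation: every edge starts with all candidate labels, and each junction's catalog of physically realizable labelings prunes candidates until a fixed point is reached. Below is a minimal Python sketch of this Waltz-style filtering; the two-edge "junction" rules and the toy drawing are hypothetical stand-ins, not the actual Huffman-Clowes catalog.

    def waltz_filter(edges, junctions):
        """Prune candidate edge labels until every junction is locally
        consistent. An emptied label set means no legal interpretation."""
        changed = True
        while changed:
            changed = False
            for edge_names, allowed in junctions:
                # Keep only labels supported by some allowed labeling whose
                # other positions are still candidates for their edges.
                supported = [set() for _ in edge_names]
                for combo in allowed:
                    if all(lab in edges[e] for e, lab in zip(edge_names, combo)):
                        for slot, lab in zip(supported, combo):
                            slot.add(lab)
                for e, sup in zip(edge_names, supported):
                    if not edges[e] <= sup:
                        edges[e] &= sup
                        changed = True
        return edges

    # Toy drawing: three edges meeting pairwise, with a hypothetical
    # mini-catalog ('+' convex, '-' concave, '>' occluding).
    edges = {e: {'+', '-', '>'} for e in 'abc'}
    junctions = [
        (('a', 'b'), {('+', '+'), ('>', '-')}),
        (('b', 'c'), {('+', '+'), ('-', '>')}),
        (('c', 'a'), {('+', '+')}),
    ]
    print(waltz_filter(edges, junctions))   # all three edges forced to '+'

On a real drawing the same loop runs over the full junction catalog (extended, in this research, with shadow and alignment labelings), and an edge whose label set empties signals an impossible drawing.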

Relevance: 20.00%

Abstract:

Martin Huelse: Generating complex connectivity structures for large-scale neural models. In: V. Kurkova, R. Neruda, and J. Koutnik (Eds.): ICANN 2008, Part II, LNCS 5164, pp. 849–858, 2008. Sponsorship: EPSRC

Relevance: 20.00%

Abstract:

Unpublished paper, written in 1996.

Relevance: 20.00%

Abstract:

One role for workload generation is as a means for understanding how servers and networks respond to variation in load. This enables management and capacity planning based on current and projected usage. This paper applies a number of observations of Web server usage to create a realistic Web workload generation tool which mimics a set of real users accessing a server. The tool, called Surge (Scalable URL Reference Generator) generates references matching empirical measurements of 1) server file size distribution; 2) request size distribution; 3) relative file popularity; 4) embedded file references; 5) temporal locality of reference; and 6) idle periods of individual users. This paper reviews the essential elements required in the generation of a representative Web workload. It also addresses the technical challenges to satisfying this large set of simultaneous constraints on the properties of the reference stream, the solutions we adopted, and their associated accuracy. Finally, we present evidence that Surge exercises servers in a manner significantly different from other Web server benchmarks.
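
As a rough illustration, the sketch below samples a synthetic reference stream from heavy-tailed distributions of the kind Surge matches. All parameter values are illustrative placeholders rather than the fitted values reported in the paper, and embedded references (4) and temporal locality (5) are omitted for brevity.

    import numpy as np

    rng = np.random.default_rng(0)
    N_FILES, N_REQUESTS = 1000, 10000

    # (1) File sizes: lognormal body with a heavy Pareto tail. The split
    # point and parameters here are placeholders, not fitted values.
    body = rng.lognormal(mean=9.3, sigma=1.3, size=N_FILES)
    tail = (rng.pareto(a=1.1, size=N_FILES) + 1) * 133_000
    sizes = np.where(rng.random(N_FILES) < 0.93, body, tail)

    # (3) Relative file popularity: Zipf's law over the file population.
    ranks = np.arange(1, N_FILES + 1)
    popularity = 1.0 / ranks
    popularity /= popularity.sum()

    # (6) A request stream with Pareto-distributed user idle (OFF) periods.
    requests = rng.choice(N_FILES, size=N_REQUESTS, p=popularity)
    idle_s = (rng.pareto(a=1.5, size=N_REQUESTS) + 1) * 0.5

    for f, gap in zip(requests[:5], idle_s[:5]):
        print(f"GET /file{f} ({sizes[f]:.0f} bytes) after {gap:.2f}s idle")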

Relevance: 20.00%

Abstract:

Fast forward error correction codes are becoming an important component in bulk content delivery. They fit in naturally with multicast scenarios as a way to deal with losses and are now seeing use in peer-to-peer networks as a basis for distributing load. In particular, new irregular sparse parity check codes have been developed with provable average linear time performance, a significant improvement over previous codes. In this paper, we present a new heuristic for generating codes with similar performance based on observing a server with an oracle for client state. This heuristic is easy to implement and provides further intuition into the need for an irregular heavy-tailed distribution.
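
For context, codes of this family produce each output symbol as the XOR of a randomly chosen set of source blocks, with the set size drawn from an irregular, heavy-tailed degree distribution. The sketch below uses the well-known ideal soliton distribution to stand in for such a distribution; the paper's oracle-based heuristic for deriving better distributions is not reproduced here.

    import random

    random.seed(1)
    K = 16                                               # number of source blocks
    source = [random.randrange(256) for _ in range(K)]   # one byte per block

    # Ideal soliton distribution: rho(1) = 1/K, rho(d) = 1/(d*(d-1)) for d >= 2.
    degrees = list(range(1, K + 1))
    weights = [1.0 / K] + [1.0 / (d * (d - 1)) for d in range(2, K + 1)]

    def encode_symbol():
        d = random.choices(degrees, weights)[0]          # irregular degree
        neighbours = random.sample(range(K), d)          # d distinct source blocks
        value = 0
        for i in neighbours:
            value ^= source[i]                           # XOR them together
        return neighbours, value

    for _ in range(3):
        print(encode_symbol())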

Relevance: 20.00%

Abstract:

Understanding and modeling the factors that underlie the growth and evolution of network topologies are basic questions that impact capacity planning, forecasting, and protocol research. Early topology generation work focused on generating network-wide connectivity maps, either at the AS-level or the router-level, typically with an eye towards reproducing abstract properties of observed topologies. But recently, advocates of an alternative "first-principles" approach question the feasibility of realizing representative topologies with simple generative models that do not explicitly incorporate real-world constraints, such as the relative costs of router configurations, into the model. Our work synthesizes these two lines of work by designing a topology generation mechanism that incorporates first-principles constraints. Our goal is more modest than that of constructing an Internet-wide topology: we aim to generate representative topologies for single ISPs. However, our methods also go well beyond previous work, as we annotate these topologies with representative capacity and latency information. Taking only demand for network services over a given region as input, we propose a natural cost model for building and interconnecting PoPs and formulate the resulting optimization problem faced by an ISP. We devise hill-climbing heuristics for this problem and demonstrate that the solutions we obtain are quantitatively similar to those in measured router-level ISP topologies, with respect to both topological properties and fault tolerance.
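
As a loose illustration of the hill-climbing formulation (with a hypothetical cost model, far simpler than the paper's), the sketch below toggles candidate links between PoPs to minimize link build cost plus demand-weighted shortest-path latency.

    import itertools, math, random

    random.seed(0)
    N = 6                                     # number of PoPs
    pops = [(random.uniform(0, 1000), random.uniform(0, 1000)) for _ in range(N)]
    pairs = list(itertools.combinations(range(N), 2))
    demand = {p: random.uniform(1, 10) for p in pairs}            # traffic matrix
    km = {(i, j): math.dist(pops[i], pops[j]) for i, j in pairs}

    LINK_FIXED, LINK_PER_KM, LATENCY_WEIGHT = 500.0, 1.0, 0.05    # made-up constants

    def cost(links):
        # All-pairs shortest paths over the candidate topology (Floyd-Warshall).
        d = [[0.0 if i == j else math.inf for j in range(N)] for i in range(N)]
        for i, j in links:
            d[i][j] = d[j][i] = km[(i, j)]
        for k in range(N):
            for i in range(N):
                for j in range(N):
                    d[i][j] = min(d[i][j], d[i][k] + d[k][j])
        build = sum(LINK_FIXED + LINK_PER_KM * km[e] for e in links)
        latency = sum(demand[(i, j)] * d[i][j] for i, j in pairs)
        return build + LATENCY_WEIGHT * latency   # inf if disconnected

    links = set(pairs)                        # start from a full mesh
    improved = True
    while improved:
        improved = False
        for e in pairs:
            trial = links ^ {e}               # toggle: add or remove link e
            if cost(trial) < cost(links):
                links, improved = trial, True
    print(len(links), "links, cost", round(cost(links)))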

Relevance: 20.00%

Abstract:

INTRODUCTION: Accessing new knowledge as the evidence base for hospice and palliative care grows has specific challenges for the discipline. This study aimed to describe conversion rates of palliative and hospice care conference abstracts to journal articles and to highlight that some palliative care literature may not be retrievable because it is not indexed on bibliographic databases. METHODS: Substudy A tracked the journal publication of conference abstracts selected for inclusion in a gray literature database on www.caresearch.com.au. Abstracts were included in the gray literature database following handsearching of the proceedings of over 100 Australian conferences, held between 1980 and 1999, that were likely to have some hospice or palliative care content. Substudy B looked at the indexing of three international hospice and palliative care journals in four widely available bibliographic databases, from first publication until 2001, through systematic tracing of all original papers in the journals. RESULTS: Substudy A showed that, of the 1338 abstracts identified, only 15.9% were published (compared to an average of 45% in health). Published abstracts were found in 78 different journals. Multiauthor abstracts and oral presentations had higher rates of conversion. Substudy B demonstrated lag time between first publication and bibliographic indexing. Even after listing, idiosyncratic noninclusions were identified. DISCUSSION: There are limitations to retrieval of all possible literature through electronic searching of bibliographic databases. Encouraging publication in indexed journals of studies presented at conferences, promoting selection of palliative care journals for database indexing, and searching more than one bibliographic database will improve the accessibility of existing and new knowledge in hospice and palliative care.

Relevance: 20.00%

Abstract:

Many code generation tools exist to aid developers in carrying out common mappings, such as from Object to XML or from Object to relational database. Such generated code tends to possess a high degree of binding between the Object code and the target mapping, making integration into a broader application tedious or even impossible. In this paper we suggest that XML technologies and the multiple inheritance capabilities of interface-based languages such as Java offer a means to unify such executable specifications, thus building complete, consistent and useful object models declaratively, without sacrificing component flexibility.
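
The paper targets Java's interface-based multiple inheritance; as a rough Python stand-in for the idea, the sketch below declares each mapping as an independent mixin and combines them on a single domain class, so neither mapping is hard-bound to the other. All class and method names are hypothetical.

    class XmlMapping:
        # Object-to-XML mapping derived from the instance's fields.
        def to_xml(self):
            tag = type(self).__name__.lower()
            fields = "".join(f"<{k}>{v}</{k}>" for k, v in vars(self).items())
            return f"<{tag}>{fields}</{tag}>"

    class SqlMapping:
        # Object-to-relational mapping derived from the same fields.
        def to_insert(self, table):
            cols = ", ".join(vars(self))
            vals = ", ".join(repr(v) for v in vars(self).values())
            return f"INSERT INTO {table} ({cols}) VALUES ({vals});"

    class Customer(XmlMapping, SqlMapping):   # both mappings, one object model
        def __init__(self, name, email):
            self.name, self.email = name, email

    c = Customer("Ada", "ada@example.org")
    print(c.to_xml())
    print(c.to_insert("customers"))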

Relevance: 20.00%

Abstract:

Code parallelization using OpenMP for shared memory systems is easier than using message passing for distributed memory systems. Despite this, it is still a challenge to use OpenMP to parallelize application codes in a way that yields effective scalable performance when executed on a shared memory parallel system. We describe an environment that assists the programmer in the various tasks of code parallelization, greatly reducing both the time and the level of skill required. The parallelization environment includes a number of tools that address the main tasks of parallelism detection, OpenMP source code generation, debugging and optimization. These tools include a high quality, fully interprocedural dependence analysis with user interaction capabilities to facilitate the generation of efficient parallel code, an automatic relative debugging tool to identify erroneous user decisions in that interaction, and performance profiling to identify bottlenecks. Finally, experiences of parallelizing some NASA application codes are presented to illustrate some of the benefits of using the evolving environment.
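
As a flavour of what parallelism detection involves, the sketch below implements the classic GCD dependence test for a pair of affine array accesses; the environment's interprocedural dependence analysis is, of course, far more sophisticated than this single conservative test.

    from math import gcd

    def may_depend(a, b, c, d):
        """Conservative GCD test for accesses A[a*i + b] and A[c*i + d]:
        returns False only when no two iterations can touch the same element."""
        g = gcd(a, c)
        return (d - b) % g == 0 if g else b == d

    # for i in range(n): A[2*i] = ... A[2*i + 1] ...
    print(may_depend(2, 0, 2, 1))   # False: loop is safe to run in parallel
    # for i in range(n): A[i] = ... A[i + 4] ...
    print(may_depend(1, 0, 1, 4))   # True: a cross-iteration dependence may exist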

Relevance: 20.00%

Abstract:

At the start of the industrial revolution (circa 1750) the atmospheric concentration of carbon dioxide (CO2) was around 280 ppm. Since that time the burning of fossil fuel, together with other industrial processes such as cement manufacture and changing land use, has increased this value to 400 ppm, for the first time in over 3 million years. Because CO2 is a potent greenhouse gas, the consequences of this rise for global temperatures have been dramatic, and not only for air temperatures. Global Sea Surface Temperature (SST) has warmed by 0.4–0.8 °C during the last century, although regional differences are evident (IPCC, 2007). This rise in atmospheric CO2 levels and the resulting global warming has to some extent been ameliorated by the oceanic uptake of around one quarter of the anthropogenic CO2 emissions (Sabine et al., 2004). Initially this was thought to have little or no impact on ocean chemistry, owing to the capacity of the ocean’s carbonate buffering system to neutralise the acidity caused when CO2 dissolves in seawater. However, this assumption was challenged by Caldeira and Wickett (2005), who used model predictions to show that carbonate buffering acts far too slowly to moderate significant changes to oceanic chemistry over the next few centuries. Their model predicted that since pre-industrial times ocean surface water pH had fallen by 0.1 pH unit, indicating a 30% increase in the concentration of H+ ions. Their model also showed that the pH of surface waters could fall by up to 0.4 units before 2100, driven by continued and unabated utilisation of fossil fuels. Alongside increasing levels of dissolved CO2 and H+ (reduced pH), bicarbonate ion concentrations increase while carbonate ion concentrations decrease. These chemical changes are now collectively recognised as “ocean acidification”. Concern now stems from the knowledge that concentrations of H+, CO2, bicarbonate and carbonate ions impact upon many important physiological processes vital to maintaining health and function in marine organisms. Additionally, species have evolved under conditions in which the carbonate system has remained relatively stable for millions of years, leaving them with potentially reduced capacity to adapt to this rapid change. Evidence suggests that, whilst the impact of ocean acidification is complex, when considered alongside ocean warming the net effect on the health and productivity of the oceans will be detrimental.
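
The quoted percentages follow from the logarithmic definition of pH, [H+] = 10^(-pH); a quick check of the pH changes cited above:

    # pH is -log10 of the hydrogen ion concentration, so a pH drop of x
    # multiplies [H+] by 10**x. (10**0.1 is about 1.26, i.e. a 26% rise,
    # which the literature commonly rounds to ~30%.)
    for drop in (0.1, 0.4):
        rise = 10 ** drop - 1
        print(f"pH drop of {drop}: [H+] rises by {rise:.0%}")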