869 results for World Heritage
Abstract:
Recently the notion of self-similarity has been shown to apply to wide-area and local-area network traffic. In this paper we examine the mechanisms that give rise to self-similar network traffic. We present an explanation for traffic self-similarity by using a particular subset of wide area traffic: traffic due to the World Wide Web (WWW). Using an extensive set of traces of actual user executions of NCSA Mosaic, reflecting over half a million requests for WWW documents, we show evidence that WWW traffic is self-similar. Then we show that the self-similarity in such traffic can be explained based on the underlying distributions of WWW document sizes, the effects of caching and user preference in file transfer, the effect of user "think time", and the superimposition of many such transfers in a local area network. To do this we rely on empirically measured distributions both from our traces and from data independently collected at over thirty WWW sites.
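The mechanism described here is easy to reproduce in simulation. The sketch below is not the authors' code and all parameter values are illustrative assumptions: it superimposes many ON/OFF sources whose burst lengths are heavy-tailed (Pareto), then estimates the Hurst parameter of the aggregate with the aggregated-variance method; values of H well above 0.5 suggest self-similarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def on_off_source(n_slots, alpha=1.4):
    """One source alternating heavy-tailed (Pareto) ON bursts with idle gaps."""
    traffic = np.zeros(n_slots)
    t = 0
    while t < n_slots:
        on = int(rng.pareto(alpha)) + 1        # heavy-tailed burst length
        off = int(rng.exponential(10)) + 1     # light-tailed idle period
        traffic[t:t + on] = 1                  # transmit during the burst
        t += on + off
    return traffic

# Superimpose many sources, as with many concurrent WWW transfers on a LAN.
aggregate = sum(on_off_source(2**14) for _ in range(50))

def variance_time_slope(x, scales=(1, 2, 4, 8, 16, 32, 64)):
    """Aggregated-variance method: the log-log slope is about 2H - 2."""
    vs = [np.var(x[: len(x) // m * m].reshape(-1, m).mean(axis=1)) for m in scales]
    return np.polyfit(np.log(scales), np.log(vs), 1)[0]

H = 1 + variance_time_slope(aggregate) / 2
print(f"estimated Hurst parameter H ~ {H:.2f}")   # H > 0.5 suggests self-similarity
```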
Abstract:
We propose the development of a World Wide Web image search engine that crawls the web collecting information about the images it finds, computes the appropriate image decompositions and indices, and stores this extracted information for searches based on image content. Indexing and searching images need not require solving the image understanding problem. Instead, the general approach should be to provide an arsenal of image decompositions and discriminants that can be precomputed for images. At search time, users can select a weighted subset of these decompositions to be used for computing image similarity measurements. While this approach avoids the search-time-dependent problem of labeling what is important in images, it still presents several important problems that require further research in the area of query by image content. We briefly explore some of these problems as they pertain to shape.
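As a hedged illustration of the search-time idea, the sketch below ranks a toy image database by a user-weighted combination of precomputed feature decompositions. The feature names, vectors, and weights are hypothetical stand-ins, not the paper's actual decompositions.

```python
import numpy as np

# Hypothetical precomputed decompositions for a tiny image database; the
# feature names and values below are illustrative stand-ins.
database = {
    "img_a": {"color": np.array([0.8, 0.1, 0.1]), "texture": np.array([0.2, 0.9])},
    "img_b": {"color": np.array([0.1, 0.8, 0.1]), "texture": np.array([0.3, 0.8])},
}

def distance(query, entry, weights):
    """Weighted sum of per-decomposition distances; lower means more similar."""
    return sum(w * np.linalg.norm(query[f] - entry[f]) for f, w in weights.items())

query = {"color": np.array([0.7, 0.2, 0.1]), "texture": np.array([0.25, 0.85])}
weights = {"color": 0.7, "texture": 0.3}   # chosen by the user at search time

ranked = sorted(database, key=lambda name: distance(query, database[name], weights))
print(ranked)   # images ordered by weighted feature similarity
```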
Abstract:
Server performance has become a crucial issue for improving the overall performance of the World-Wide Web. This paper describes Webmonitor, a tool for evaluating and understanding server performance, and presents new results for a realistic workload. Webmonitor measures activity and resource consumption, both within the kernel and in HTTP processes running in user space. Webmonitor is implemented using an efficient combination of sampling and event-driven techniques that exhibit low overhead. Our initial implementation is for the Apache World-Wide Web server running on the Linux operating system. We demonstrate the utility of Webmonitor by measuring and understanding the performance of a Pentium-based PC acting as a dedicated WWW server. Our workload uses a file size distribution with a heavy tail. This captures the fact that Web servers must concurrently handle some requests for large audio and video files, and a large number of requests for small documents, containing text or images. Our results show that in a Web server saturated by client requests, over 90% of the time spent handling HTTP requests is spent in the kernel. Furthermore, keeping TCP connections open, as required by TCP, causes a factor of 2-9 increase in the elapsed time required to service an HTTP request. Data gathered from Webmonitor provide insight into the causes of this performance penalty. Specifically, we observe a significant increase in resource consumption along three dimensions: the number of HTTP processes running at the same time, CPU utilization, and memory utilization. These results emphasize the important role of operating system and network protocol implementation in determining Web server performance.
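A workload with a heavy-tailed file size distribution can be sketched as follows; the lognormal body plus Pareto tail and all parameter values are illustrative assumptions, not the distribution measured in the paper. The point of such a workload is visible in the output: most requests are small, while a tiny fraction of large files carries a large share of the bytes.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_file_sizes(n, tail_fraction=0.05, tail_alpha=1.1, tail_min=100_000):
    """Mostly small (lognormal) documents plus a Pareto tail of large files."""
    sizes = rng.lognormal(mean=8.0, sigma=1.0, size=n)   # small text/image files
    tail = rng.random(n) < tail_fraction                 # rare large audio/video
    sizes[tail] = tail_min * (1 + rng.pareto(tail_alpha, tail.sum()))
    return sizes.astype(int)

sizes = sample_file_sizes(100_000)
big = sizes > np.quantile(sizes, 0.99)
print(f"median size: {np.median(sizes):,.0f} bytes")
print(f"top 1% of files carry {sizes[big].sum() / sizes.sum():.0%} of all bytes")
```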
Abstract:
ImageRover is a search-by-image-content navigation tool for the World Wide Web. To gather images expediently, the image collection subsystem utilizes a distributed fleet of WWW robots running on different computers. The image robots gather information about the images they find, compute the appropriate image decompositions and indices, and store this extracted information in vector form for searches based on image content. At search time, users can iteratively guide the search through the selection of relevant examples. Search performance is made efficient through the use of an approximate, optimized k-d tree algorithm. The system employs a novel relevance feedback algorithm that selects the distance metrics appropriate for a particular query.
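A minimal sketch of the retrieval step, assuming scipy's k-d tree (with eps > 0 for approximate search) and a toy inverse-variance reweighting as a stand-in for the paper's relevance feedback algorithm; the feature vectors are random placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

features = rng.random((10_000, 16))   # stand-in image feature vectors
tree = cKDTree(features)

query = rng.random(16)
# eps > 0 allows approximate nearest-neighbour search, trading accuracy for speed.
_, idx = tree.query(query, k=10, eps=0.1)
print("initial results:", idx)

# Toy relevance feedback: weight each dimension by the inverse variance of the
# user-marked relevant examples, so dimensions on which those examples agree
# count more. A simple stand-in for the paper's distance metric selection.
relevant = features[idx[:3]]
w = 1.0 / (relevant.var(axis=0) + 1e-6)
w /= w.sum()

refined = cKDTree(features * np.sqrt(w))   # rescale space by the learned weights
_, idx = refined.query(query * np.sqrt(w), k=10, eps=0.1)
print("refined results:", idx)
```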
Abstract:
Recent work has shown the prevalence of small-world phenomena [28] in many networks. Small-world graphs exhibit a high degree of clustering, yet have typically short path lengths between arbitrary vertices. Internet AS-level graphs have been shown to exhibit small-world behaviors [9]. In this paper, we show that both Internet AS-level and router-level graphs exhibit small-world behavior. We attribute such behavior to two possible causes: the high variability of vertex degree distributions (which were found to follow approximately a power law [15]) and the preference of vertices for local connections. We show that both factors contribute, with different relative degrees, to the small-world behavior of AS-level and router-level topologies. Our findings underscore the inefficacy of the Barabasi-Albert model [6] in explaining the growth process of the Internet, and provide a basis for more promising approaches to the development of Internet topology generators. We present such a generator and show the resemblance of the synthetic graphs it generates to real Internet AS-level and router-level graphs. Using these graphs, we have examined how small-world behaviors affect the scalability of end-system multicast. Our findings indicate that lower variability of vertex degree and stronger preference for local connectivity in small-world graphs result in slower network neighborhood expansion and in longer average path length between two arbitrary vertices, which in turn results in better scaling of end-system multicast.
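The two defining small-world measurements are straightforward to compute. The sketch below (using networkx, with illustrative graph sizes) compares a Barabasi-Albert graph with a size-matched random graph; it also incidentally shows the low clustering of pure preferential attachment, one reason that model alone falls short as an Internet model.

```python
import networkx as nx

# Compare the two small-world measurements on a Barabasi-Albert graph and a
# random graph with the same number of nodes and edges; small-world graphs
# combine high clustering with short paths. Sizes here are illustrative.
n, m = 2_000, 3
ba = nx.barabasi_albert_graph(n, m, seed=0)
rand = nx.gnm_random_graph(n, ba.number_of_edges(), seed=0)

for name, g in [("Barabasi-Albert", ba), ("random, same size", rand)]:
    if not nx.is_connected(g):                 # path length needs one component
        g = g.subgraph(max(nx.connected_components(g), key=len))
    print(f"{name:18s} clustering={nx.average_clustering(g):.4f} "
          f"avg path length={nx.average_shortest_path_length(g):.2f}")
```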
Abstract:
Some WWW image engines allow the user to form a query in terms of text keywords. To build the image index, keywords are extracted heuristically from HTML documents containing each image, and/or from the image URL and file headers. Unfortunately, text-based image engines have merely retro-fitted standard SQL database query methods, and it is difficult to include image cues within such a framework. On the other hand, visual statistics (e.g., color histograms) are often insufficient for helping users find desired images in a vast WWW index. By truly unifying textual and visual statistics, one would expect to get better results than either used separately. In this paper, we propose an approach that allows the combination of visual statistics with textual statistics in the vector space representation commonly used in query by image content systems. Text statistics are captured in vector form using latent semantic indexing (LSI). The LSI index for an HTML document is then associated with each of the images contained therein. Visual statistics (e.g., color, orientedness) are also computed for each image. The LSI and visual statistic vectors are then combined into a single index vector that can be used for content-based search of the resulting image database. By using an integrated approach, we are able to take advantage of possible statistical couplings between the topic of the document (latent semantic content) and the contents of images (visual statistics). This allows improved performance in conducting content-based search. This approach has been implemented in a WWW image search engine prototype.
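A simplified sketch of the combined index, assuming sklearn for the LSI step and hypothetical visual statistics; the weighting scheme and all data are illustrative, not the paper's exact formulation.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Toy HTML document texts; each is assumed to contain exactly one image.
docs = [
    "red sunset over the ocean beach",
    "database index query optimization",
    "orange sunrise above the sea shore",
]
# LSI: truncated SVD of the tf-idf term-document matrix.
tfidf = TfidfVectorizer().fit_transform(docs)
lsi = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Hypothetical visual statistics per image (e.g., a coarse color histogram).
visual = np.array([
    [0.8, 0.1, 0.1],
    [0.2, 0.3, 0.5],
    [0.7, 0.2, 0.1],
])

def unit(x):
    return x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-12)

# One combined index vector per image: normalized blocks, weighted concatenation.
w_text, w_visual = 0.5, 0.5
index = np.hstack([w_text * unit(lsi), w_visual * unit(visual)])

scores = unit(index) @ unit(index)[0]   # query by example with the first image
print(scores)                           # the sunrise/sea image should score high
```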
Abstract:
In a constantly changing world, humans are adapted to alternate routinely between attending to familiar objects and testing hypotheses about novel ones. We can rapidly learn to recognize and name novel objects without unselectively disrupting our memories of familiar ones. We can notice fine details that differentiate nearly identical objects and generalize across broad classes of dissimilar objects. This chapter describes a class of self-organizing neural network architectures--called ARTMAP--that are capable of fast, yet stable, on-line recognition learning, hypothesis testing, and naming in response to an arbitrary stream of input patterns (Carpenter, Grossberg, Markuzon, Reynolds, and Rosen, 1992; Carpenter, Grossberg, and Reynolds, 1991). The intrinsic stability of ARTMAP allows the system to learn incrementally for an unlimited period of time. System stability properties can be traced to the structure of its learned memories, which encode clusters of attended features into its recognition categories, rather than slow averages of category inputs. The level of detail in the learned attentional focus is determined moment-by-moment, depending on predictive success: an error due to over-generalization automatically focuses attention on additional input details, enough of which are learned in a new recognition category so that the predictive error will not be repeated. An ARTMAP system creates an evolving map between a variable number of learned categories that compress one feature space (e.g., visual features) and learned categories of another feature space (e.g., auditory features). Input vectors can be either binary or analog. Computational properties of the networks enable them to perform significantly better in benchmark studies than alternative machine learning, genetic algorithm, or neural network models. Some of the critical problems that challenge and constrain any such autonomous learning system will next be illustrated. Design principles that work together to solve these problems are then outlined. These principles are realized in the ARTMAP architecture, which is specified as an algorithm. Finally, ARTMAP dynamics are illustrated by means of a series of benchmark simulations.
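For orientation, here is a minimal fuzzy-ART-style sketch of the category choice, vigilance test, and fast-learning steps that underlie ARTMAP. It is a simplified single-module illustration with assumed parameter values, not the full ARTMAP architecture from the cited papers.

```python
import numpy as np

def complement_code(x):
    """ART inputs are complement coded so that every input has the same norm."""
    return np.concatenate([x, 1.0 - x])

class FuzzyART:
    def __init__(self, rho=0.8, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta  # vigilance, choice, learning rate
        self.w = []                       # one weight vector per learned category

    def train(self, x):
        I = complement_code(np.asarray(x, dtype=float))
        # Choice function: how strongly each stored category matches the input.
        scores = [np.minimum(I, w).sum() / (self.alpha + w.sum()) for w in self.w]
        for j in sorted(range(len(scores)), key=scores.__getitem__, reverse=True):
            match = np.minimum(I, self.w[j]).sum() / I.sum()
            if match >= self.rho:         # vigilance test passed: refine category
                self.w[j] = (self.beta * np.minimum(I, self.w[j])
                             + (1 - self.beta) * self.w[j])
                return j
        self.w.append(I.copy())           # no category is close enough: learn a new one
        return len(self.w) - 1

net = FuzzyART(rho=0.8)
for p in [[0.1, 0.2], [0.12, 0.18], [0.9, 0.85]]:
    print("pattern", p, "-> category", net.train(p))
```

With the high vigilance assumed here, the two nearby patterns share a category while the distant one creates a new category; lowering rho would coarsen the categories, which is the attentional trade-off the abstract describes.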
Abstract:
When we look at a scene, how do we consciously see surfaces infused with lightness and color at the correct depths? Random Dot Stereograms (RDS) probe how binocular disparity between the two eyes can generate such conscious surface percepts. Dense RDS do so despite the fact that they include multiple false binocular matches. Sparse stereograms do so even across large contrast-free regions with no binocular matches. Stereograms that define occluding and occluded surfaces lead to surface percepts wherein partially occluded textured surfaces are completed behind occluding textured surfaces at a spatial scale much larger than that of the texture elements themselves. Earlier models suggest how the brain detects binocular disparity, but not how RDS generate conscious percepts of 3D surfaces. A neural model predicts how the layered circuits of visual cortex generate these 3D surface percepts using interactions between visual boundary and surface representations that obey complementary computational rules.
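The false-match problem the abstract mentions is easy to see in simulation. The sketch below builds a toy random-dot stereogram and recovers disparity by brute-force block matching, a deliberately simple stand-in for the neural mechanisms the model proposes; all image sizes and shifts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# A toy random-dot stereogram: a central square shifted between the two eyes.
h, w, shift = 64, 64, 4
left = (rng.random((h, w)) > 0.5).astype(float)
right = left.copy()
right[20:44, 20:44] = np.roll(left[20:44, 20:44], shift, axis=1)

def disparity_map(left, right, max_d=8, win=5):
    """Brute-force block matching: pick the shift minimizing local SAD."""
    pad = win // 2
    disp = np.zeros(left.shape)
    for y in range(pad, left.shape[0] - pad):
        for x in range(pad + max_d, left.shape[1] - pad):
            patch = right[y - pad:y + pad + 1, x - pad:x + pad + 1]
            costs = [np.abs(patch - left[y - pad:y + pad + 1,
                                         x - d - pad:x - d + pad + 1]).sum()
                     for d in range(max_d + 1)]
            disp[y, x] = np.argmin(costs)  # ties = false matches, common in dense RDS
    return disp

d = disparity_map(left, right)
print("median disparity inside the square:", np.median(d[26:38, 28:40]))  # ~4
```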
Abstract:
The impact of the Vietnam War conditioned the Carter administration’s response to the Nicaraguan revolution in ways that reduced US engagement with both sides of the conflict. It led the countries of Latin America to counter the US approach and find their own solution to the crisis, and it allowed Cuba to play a greater role in guiding the overthrow of Nicaraguan dictator Anastasio Somoza Debayle. This thesis re-evaluates Carter’s policy through the legacy of the Vietnam War, because US executive anxieties about military intervention, Congress’s increasing influence, and US public concerns about the nation’s global responsibilities shaped the Carter approach to Nicaragua. Following a background chapter, the Carter administration’s policy towards Nicaragua is evaluated, before and after the fall of Somoza in July 1979. The extent of the Vietnam influence on US-Nicaraguan relations is developed by researching government documents on the formation of US policy, including material from the Jimmy Carter Library, the Library of Congress, the National Security Archive, the National Archives and Records Administration, and other government and media sources from the United Nations Archives, New York University, the New York Public Library, the Hoover Institution Archives, Tulane University and the Organization of American States. The thesis establishes that the Vietnam legacy played a key role in the Carter administration’s approach to Nicaragua. Before the overthrow of Somoza, the Carter administration limited its influence in Nicaragua because it felt there was no immediate threat from communism. The US feared that an active role in Nicaragua, without an established threat from Cuba or the Soviet Union, could jeopardise congressional support for other foreign policy goals deemed more important. The Carter administration, as a result, pursued a policy of non-intervention towards the Central American country. After the fall of Somoza, and the establishment of a new government with a left-wing element represented by the Sandinistas, the Carter administration emphasised non-intervention in a military sense, but actively engaged with the new Nicaraguan leadership to contain the potential communist influence that could spread across Central America in the wake of the Nicaraguan revolution.
Abstract:
Since the age of colonisation, the territory of New Mexico has been exposed to a diversity of cultural influence. Throughout recorded history various forces have battled for control of this territory, resulting in a continuous redefinition of its political, geographic and economic boundaries. Early representations of the Southwest have been defined as “strategies of negotiation” between Anglo, Hispanic and Native populations, strategies that are particularly evident in the territory of New Mexico. The contemporary identity of regions like northern New Mexico has destabilised the notion of what constitutes racial purity in regions which are defined by diversity. This thesis aims to evaluate the literary history of northern New Mexico in order to determine how exposure to a diversity of cultural influence has affected the region’s identity. An analysis of Anglo and Native writers from northern New Mexico will illustrate that these racial groups were influenced by the same geographic landscape. As such, their writing displays many characteristics unique to the region. In providing a comparative analysis of Native and Anglo authors from northern New Mexico, this thesis seeks to demonstrate commonalities of theme, structure and content. In doing so, this research encourages a new perspective on New Mexico writing, one which effectively de-centres contemporary notions of what the American canon should be.
Abstract:
This study assesses regional health patterns in early medieval Ireland and Britain by analysing and interpreting palaeopathological indicators of stress. This was achieved by incorporating the results of demographic and palaeopathological study into their specific historical contexts. Although relatively small islands, both are home to unique and diverse cultural, physical, and political landscapes, which could potentially affect the general health of the population in different ways. To answer the research question accurately, a bioarchaeological survey of six regions within the two islands was carried out, specifically analysing and comparing the demographic profile and general health trends of each region with one another. Results from the analysis have demonstrated statistically significant differences within and between the islands, suggesting that even the more subtle differences observed within the cultural, physical, and political landscapes, as in the case of Ireland and Britain, can and do affect general health trends. The health of early medieval Ireland and Britain appears to be significantly affected by the physical landscape, specifically a north/south divide. The most northerly regions, Scotland South and Ireland North, manifested higher levels of stress indicators when compared to the more southerly positioned regions. Although it can only be hypothesised what factors within these regions are causing, enhancing or buffering stress, the study has established the potential and necessity for regional work to be continued when interpreting the historical past of these two islands.
Abstract:
Process guidance helps users increase their process model understanding, their process execution effectiveness and efficiency, and their process compliance performance. This paper presents research in progress encompassing our ongoing design science research (DSR) project on Process Guidance Systems and a field evaluation of the resulting artifact in cooperation with a company. Building on three theory-grounded design principles, a Process Guidance System artifact for the company’s IT service ticketing process is developed, deployed and used. Following a multi-method approach, we plan to evaluate the artifact in a longitudinal field study. Thereby, we will gather not only self-reported data but also real usage data. This article describes the development of the artifact and discusses an innovative evaluation approach.
Abstract:
This article will explore the contribution made to the construction of discourse around religion outside of mainstream Christianity, at the turn of the twentieth century in Britain, by a Celticist movement as represented by Wellesley Tudor Pole (d. 1968) and his connection to the Glastonbury phenomenon. I will detail the interconnectedness of individuals and movements occupying this discursive space and their interest in efforts to verify the authenticity of an artefact which Tudor Pole claimed was once in the possession of Jesus. Engagement with Tudor Pole’s quest to prove the provenance of the artefact, and his contention that a pre-Christian culture had existed in Ireland which had extended itself to Glastonbury and Iona, creating the foundation for an authentic Western mystical tradition, is presented as one facet of a broader, contemporary discourse on alternative ideas and philosophies. In conclusion, I will juxtapose Tudor Pole’s fascination with Celtic origins with the approach of leading figures in the ‘Celtic Revival’ in Ireland, suggesting intersections and alterity in the construction of their worldviews. The paper forms part of a chapter in a thesis under preparation which examines the construction of discourse on religion outside of mainstream Christianity at the turn of the twentieth century, and in particular the role played by visiting religious reformers from Asia. The aim is to recover the (mostly forgotten) history of these engagements.
Abstract:
The universality versus culture specificity of quantitative evaluations (negative-positive) of 40 events in world history was addressed using World History Survey data collected from 5,800 university students in 30 countries/societies. Multidimensional scaling using generalized procrustean analysis indicated poor fit of data from the 30 countries to an overall mean configuration, indicating lack of universal agreement as to the associational meaning of events in world history. Hierarchical cluster analysis identified one Western and two non-Western country clusters for which adequate multidimensional fit was obtained after item deletions. A two-dimensional solution for the three country clusters was identified, where the primary dimension was historical calamities versus progress and a weak second dimension was modernity versus resistance to modernity. Factor analysis further reduced the item inventory to identify a single concept with structural equivalence across cultures, Historical Calamities, which included man-made and natural, intentional and unintentional, predominantly violent but also nonviolent calamities. Less robust factors were tentatively named as Historical Progress and Historical Resistance to Oppression. Historical Calamities and Historical Progress were at the individual level both significant and independent predictors of willingness to fight for one’s country in a hierarchical linear model that also identified significant country-level variation in these relationships. Consensus around calamity but disagreement as to what constitutes historical progress is discussed in relation to the political culture of nations and lay perceptions of history as catastrophe.
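To make the method pipeline concrete, the sketch below runs multidimensional scaling to two dimensions on synthetic event-evaluation data and checks cross-configuration fit with pairwise procrustes alignment, a simplified stand-in for the generalized procrustean analysis used in the study; all data are synthetic placeholders, not the World History Survey.

```python
import numpy as np
from scipy.spatial import procrustes
from sklearn.manifold import MDS

rng = np.random.default_rng(4)

# Synthetic stand-in for per-country evaluations of 40 historical events:
# a shared latent structure plus increasing country-specific noise.
n_events, n_countries = 40, 5
base = rng.normal(size=(n_events, 3))
configs = []
for c in range(n_countries):
    ratings = base + rng.normal(scale=0.5 + 0.3 * c, size=base.shape)
    coords = MDS(n_components=2, random_state=0).fit_transform(ratings)
    configs.append(coords)

# Procrustes alignment: how well does each country's configuration fit the
# first after translation, rotation, and scaling? Higher disparity = worse fit.
for c in range(1, n_countries):
    _, _, disparity = procrustes(configs[0], configs[c])
    print(f"country {c}: procrustes disparity vs country 0 = {disparity:.3f}")
```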