159 results for Concept Map
Abstract:
National park models have evolved in tandem with the emergence of a multifunctional countryside. Sustainable development has been added to the traditional twin aims of conservation and recreation. This is typified by recent national park designations, such as the Cairngorms National Park in Scotland. A proposed Mournes national park in Northern Ireland has evolved a stage further, with a model of national park designed to deliver the national economic goals envisaged by government. This model seeks to commodify the natural landscape. This paper compares Cairngorm and Mourne stakeholders’ views on the principal features of both models: park aims, management structures and planning functions. While Cairngorm stakeholders were largely positive from the outset, the model of national park introduced there is not without criticism. Conversely, Mourne stakeholders have adopted an anti-national park stance. Nevertheless, the proposed model, with its strong economic imperative, its omission of the Sandford Principle as a means of managing likely conflicts, and its lack of any planning powers in its own right, may still be insufficient to bring about widespread support for a Mourne national park. Such a model is also likely to accelerate the degradation of the Mourne landscape. Competing national identities (British and Irish) add a further dimension to the national park debate in Northern Ireland. Deep ideological cleavages are capable of derailing the introduction of a national park irrespective of the model proposed. In Northern Ireland the national park debate is not only about reconciling environmental and economic interests but also about political and ethno-national differences.
Abstract:
The X-linked lymphoproliferative syndrome (XLP) is an inherited immunodeficiency to Epstein-Barr virus infection that has been mapped to chromosome Xq25. Molecular analysis of XLP patients from ten different families identified a small interstitial constitutional deletion in one patient (XLP-D). This deletion, initially defined by a single marker, DF83, known to map to the interval Xq24-q26.1, is nested within a previously reported and much larger deletion in another XLP patient (XLP-739). A cosmid minilibrary was constructed from a single mega-YAC and used to establish a contig encompassing the whole XLP-D deletion and a portion of the XLP-739 deletion. Based on this contig, the size of the XLP-D deletion is estimated at 130 kb. The identification of this minimal deletion, within which at least a portion of the XLP gene is likely to reside, should greatly facilitate efforts to isolate the gene.
Abstract:
Scholars and practitioners working in ‘transitional justice’ are concerned with remedies of accountability and redress in the aftermath of conflict and state repression. Transitional justice, it is argued, provides recognition of the rights of victims, promotes civic trust, and strengthens the democratic rule of law. As serious scholarship flourishes around this critical concept as never before, this new collection from Routledge meets the need for an authoritative reference work to map a vibrant site of research and reflection. In four volumes, Transitional Justice brings together foundational works and the best and most influential cutting-edge materials, including key works produced before the term ‘transitional justice’ gained wide currency but which anticipate approaches now included under that rubric.
The collection covers themes such as: truth and history; acknowledgement, reconciliation, and forgiveness; retribution, restorative justice, and reparations; and democracy, state-building, identity, and civil society.
Abstract:
This paper describes an investigation of various shroud bleed slot configurations of a centrifugal compressor using CFD with a manual multi-block structured grid generation method. The compressor under investigation is used in a turbocharger application for a heavy-duty diesel engine of approximately 400 hp. The baseline numerical model has been developed and validated against experimental performance measurements. The influence of the bleed slot flow field on a range of operating conditions between surge and choke has been analysed in detail. The impact of the returning bleed flow on the incidence at the impeller blade leading edge, due to its mixing with the main through-flow, has also been studied. From the baseline geometry, a number of modifications to the bleed slot width have been proposed, and a detailed comparison of the flow characteristics performed. The impact of slot variations on the inlet incidence angle has been investigated, highlighting the improvement in surge and choked flow capability. In addition, the role of the bleed slot in stabilizing the blade passage flow near surge, through suction of the tip and over-tip vortex flow into the slot, has been considered.
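For reference, the incidence discussed above is the standard velocity-triangle quantity. A sketch of the usual definition (this notation is not taken from the paper, and sign conventions vary):

$$ i = \beta_{\text{blade}} - \beta_{\text{flow}}, \qquad \beta_{\text{flow}} = \arctan\!\left(\frac{w_\theta}{w_m}\right) $$

Returning bleed flow alters the tangential and meridional components of the relative velocity, $w_\theta$ and $w_m$, at the impeller leading edge, and hence shifts the incidence the blade sees.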
Abstract:
As data analytics grows in importance, it is also quickly becoming one of the dominant application domains that require parallel processing. This paper investigates the applicability of OpenMP, the dominant shared-memory parallel programming model in high-performance computing, to the domain of data analytics. We contrast the performance and programmability of key data analytics benchmarks against Phoenix++, a state-of-the-art shared-memory map/reduce programming system. Our study shows that OpenMP outperforms the Phoenix++ system by a large margin for several benchmarks. In other cases, however, the programming model lacks support for this application domain.
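As context for the kind of kernel being compared, here is a minimal illustrative sketch (not code from the paper, and Phoenix++ itself is not shown): a histogram, a typical map/reduce-style analytics benchmark, written as an OpenMP parallel loop whose array reduction plays the role of the reduce phase.

```cpp
// Minimal sketch: histogram, a typical data-analytics benchmark kernel,
// written as an OpenMP parallel loop with an array reduction.
// Compile with: g++ -O2 -fopenmp histogram.cpp
#include <cstdio>
#include <vector>

int main() {
    const int N = 1 << 24, BINS = 256;
    std::vector<unsigned char> data(N);
    for (int i = 0; i < N; ++i)
        data[i] = static_cast<unsigned char>(i * 2654435761u >> 24);

    long hist[BINS] = {0};
    // OpenMP 4.5+ array reduction: each thread accumulates into a private
    // copy of hist (the "map" phase) and the copies are summed at the end
    // (the "reduce" phase), mirroring what a map/reduce runtime does.
    #pragma omp parallel for reduction(+ : hist[:BINS])
    for (int i = 0; i < N; ++i)
        hist[data[i]]++;

    printf("bin 0 count = %ld\n", hist[0]);
    return 0;
}
```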
Abstract:
Testing the hypothesis that the concept of translation is evaluative rather than merely descriptive, Blumczyński analyses its increasingly popular use in three areas: political discourse, life writing and biomedical publications. He argues that translation as an evaluative concept is concerned with profound rather than superficial issues: to translate something is to assert its significance and value. At the same time, translation brings real and authentic things to the surface, which produces its therapeutic value: it makes us more visible to ourselves, exposes pretences and thus brings relief. Finally, translation delivers on its own ethical imperative by breaking the spell of proverbial good intentions and bringing things to completion.
Abstract:
Single component geochemical maps are the most basic representation of spatial elemental distributions and are commonly used in environmental and exploration geochemistry. However, the compositional nature of geochemical data imposes several limitations on how the data should be presented. The problems relate to the constant sum problem (closure) and to the inherently multivariate, relative information conveyed by compositional data. A well-known example is the tendency of all heavy metals to show lower values in soils with significant contributions of diluting elements (e.g., the quartz dilution effect), or the contrary effect, apparent enrichment in many elements due to the removal of potassium during weathering. The validity of classical single component maps is thus investigated, and reasonable alternatives that honour the compositional character of geochemical concentrations are presented. The first recommended method relies on knowledge-driven log-ratios, chosen to highlight certain geochemical relations or to filter known artefacts (e.g. dilution with SiO2 or volatiles); this is similar to the classical approach of normalising to a single element. The second approach uses so-called log-contrasts, which employ suitable statistical methods (such as classification techniques, regression analysis, principal component analysis, clustering of variables, etc.) to extract potentially interesting geochemical summaries. The caution from this work is that if a compositional approach is not used, it becomes difficult to guarantee that any identified pattern, trend or anomaly is not an artefact of the constant sum constraint. In summary, the authors recommend a chain of enquiry that involves searching for the appropriate statistical method that can answer the required geological or geochemical question while maintaining the integrity of the compositional nature of the data. The required log-ratio transformations should be applied, followed by the chosen statistical method. Interpreting the results may require a closer working relationship between statisticians, data analysts and geochemists.
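As a concrete illustration of the log-ratio idea (a minimal sketch, not the authors' code), the centred log-ratio (clr) transform re-expresses each component relative to the geometric mean of the whole composition, removing the constant-sum constraint before mapping or statistics are applied.

```cpp
// Minimal sketch of the centred log-ratio (clr) transform for one sample:
// clr(x)_i = ln(x_i / g(x)), where g(x) is the geometric mean of the parts.
// Illustrative only; real geochemical workflows must also handle zeros and
// values below detection limits, which are ignored here.
#include <cmath>
#include <cstdio>
#include <vector>

std::vector<double> clr(const std::vector<double>& x) {
    double log_sum = 0.0;
    for (double xi : x) log_sum += std::log(xi);          // assumes xi > 0
    double log_gmean = log_sum / static_cast<double>(x.size());
    std::vector<double> out(x.size());
    for (size_t i = 0; i < x.size(); ++i)
        out[i] = std::log(x[i]) - log_gmean;              // ln(x_i / g(x))
    return out;
}

int main() {
    // A toy 4-part composition (e.g., weight fractions summing to 1).
    std::vector<double> sample = {0.60, 0.25, 0.10, 0.05};
    for (double v : clr(sample)) printf("%8.4f\n", v);
    return 0;
}
```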
Abstract:
In this study, we introduce an original distance definition for graphs, called the Markov-inverse-F measure (MiF). This measure enables the integration of classical graph theory indices with new knowledge pertaining to structural feature extraction from semantic networks. MiF improves on the conventional Jaccard and/or Simpson indices, and reconciles both geodesic information (random walk) and co-occurrence adjustment (degree balance and distribution). We measure the effectiveness of graph-based coefficients by applying linguistic graph information to neural activity recorded during conceptual processing in the human brain. Specifically, the MiF distance is computed between each of the nouns used in a previous neural experiment and each of the in-between words in a subgraph derived from the Edinburgh Word Association Thesaurus of English. From the MiF-based information matrix, a machine learning model can accurately obtain a scalar parameter that specifies the degree to which each voxel in (the MRI image of) the brain is activated by each word or each principal component of the intermediate semantic features. Furthermore, by correlating the voxel information with the MiF-based principal components, a new computational neurolinguistics model with a network connectivity paradigm is created. This allows two dimensions of context space to be incorporated with both semantic and neural distributional representations.
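MiF itself is defined in the paper and is not reproduced here. For orientation, here is a minimal sketch of the classical baseline it is said to improve on: the Jaccard overlap between the neighbour sets of two nodes in a word-association graph (the toy adjacency data below is hypothetical).

```cpp
// Minimal sketch of the classical Jaccard index between two graph nodes,
// the neighbour-overlap baseline that the MiF distance is said to improve
// on. The toy neighbour sets are hypothetical, not from the experiment.
#include <cstdio>
#include <set>
#include <string>

double jaccard(const std::set<std::string>& a, const std::set<std::string>& b) {
    size_t inter = 0;
    for (const auto& w : a) inter += b.count(w);          // |A intersect B|
    size_t uni = a.size() + b.size() - inter;             // |A union B|
    return uni == 0 ? 0.0 : static_cast<double>(inter) / uni;
}

int main() {
    // Hypothetical neighbour sets of two nouns in a word-association graph.
    std::set<std::string> cat = {"dog", "fur", "pet", "mouse"};
    std::set<std::string> dog = {"cat", "fur", "pet", "bone"};
    printf("Jaccard(cat, dog) = %.3f\n", jaccard(cat, dog));  // 2/6 = 0.333
    return 0;
}
```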
Abstract:
The environmental quality of land is often assessed by the calculation of threshold values which aim to differentiate between concentrations of elements based on whether the soils are in residential or industrial sites. In Europe, for example, soil guideline values exist for agricultural and grazing land. A threshold is often set to differentiate between concentrations of an element that occur naturally in the soil and concentrations that result from diffuse anthropogenic sources. Regional geochemistry and, in particular, single component geochemical maps are increasingly being used to determine these baseline environmental assessments. The key question raised in this paper is whether the geochemical map can provide an accurate interpretation on its own. Implicit is the assumption that single component geochemical maps represent absolute abundances. However, because of the compositional (closed) nature of the data, univariate geochemical maps cannot be compared directly with one another. As a result, any interpretation based on them is vulnerable to spurious correlation problems. What does this mean for soil geochemistry mapping, baseline quality documentation, soil resource assessment or risk evaluation? Despite the limitation of relative abundances, individual raw geochemical maps are deemed fundamental to several applications of geochemical maps, including environmental assessments. However, an element's toxicity is related to its bioavailable concentration, which is lowered if its source is mixed with another source. Elements also interact: under reducing conditions, for example, iron oxides lose their solid state and arsenic bound to them becomes soluble and mobile. Both of these matters may be more adequately dealt with if a single component map is not interpreted in isolation when determining baseline and threshold assessments. A range of alternative, compositionally compliant representations based on log-ratio and log-contrast approaches is explored to supplement the classical single component maps for environmental assessment. Case study examples are shown based on the Tellus soil geochemical dataset covering Northern Ireland, together with the results of in vitro oral bioaccessibility testing carried out on a subset of archived Tellus Survey shallow soils following the Unified BARGE Method (BARGE: Bioaccessibility Research Group of Europe).
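The spurious-correlation problem mentioned above is easy to demonstrate with simulated data (a minimal sketch, not the Tellus dataset): closing independent positive variables to a constant sum induces correlation between parts that were generated independently.

```cpp
// Minimal sketch of closure-induced spurious correlation: two variables
// generated independently become negatively correlated once each sample
// is normalised to a constant sum. Simulated data, not the Tellus survey.
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    size_t n = x.size();
    double mx = 0, my = 0;
    for (size_t i = 0; i < n; ++i) { mx += x[i]; my += y[i]; }
    mx /= n; my /= n;
    double sxy = 0, sxx = 0, syy = 0;
    for (size_t i = 0; i < n; ++i) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) * (x[i] - mx);
        syy += (y[i] - my) * (y[i] - my);
    }
    return sxy / std::sqrt(sxx * syy);
}

int main() {
    std::mt19937 rng(42);
    std::lognormal_distribution<double> d(0.0, 0.5);
    const int N = 10000;
    std::vector<double> a(N), b(N), ca(N), cb(N);
    for (int i = 0; i < N; ++i) {
        a[i] = d(rng); b[i] = d(rng);
        double c = d(rng);                           // a third independent part
        double total = a[i] + b[i] + c;
        ca[i] = a[i] / total;                        // closed to constant sum 1
        cb[i] = b[i] / total;
    }
    printf("raw    r = %+.3f\n", pearson(a, b));     // near zero: independent
    printf("closed r = %+.3f\n", pearson(ca, cb));   // clearly negative: closure
    return 0;
}
```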
Abstract:
We study the computational complexity of finding maximum a posteriori configurations in Bayesian networks whose probabilities are specified by logical formulas. This approach leads to a fine-grained study in which local information such as context-sensitive independence and determinism can be considered. It also allows us to characterize more precisely the jump from tractability to NP-hardness and beyond, and to consider the complexity introduced by evidence alone.
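For readers new to the problem, here is a minimal sketch of what a maximum a posteriori (MAP) configuration is, on a hypothetical two-variable network A -> B with binary variables and evidence B = 1. The paper's logical-formula encoding is not reproduced; real solvers exploit structure rather than enumerate.

```cpp
// Minimal sketch of MAP by brute force on a hypothetical two-node network
// A -> B with binary variables, given evidence B = 1. Enumeration is
// exponential in general; this is purely illustrative.
#include <cstdio>

int main() {
    const double pA1 = 0.3;                 // P(A=1)
    const double pB1_given[2] = {0.1, 0.8}; // P(B=1 | A=0), P(B=1 | A=1)

    int best_a = -1;
    double best_p = -1.0;
    for (int a = 0; a <= 1; ++a) {          // enumerate configurations of A
        double pa = a ? pA1 : 1.0 - pA1;
        double joint = pa * pB1_given[a];   // P(A=a, B=1)
        printf("P(A=%d, B=1) = %.3f\n", a, joint);
        if (joint > best_p) { best_p = joint; best_a = a; }
    }
    // Here the MAP configuration is A=1, since 0.3*0.8 > 0.7*0.1.
    printf("MAP: A=%d with joint probability %.3f\n", best_a, best_p);
    return 0;
}
```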