32 results for Abstraction

in CentAUR: Central Archive University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

Cationic swede and anionic turnip peroxidases were partially purified by ion-exchange and gel-filtration chromatography, respectively. Heat treatment of these enzymes and of a commercial high purity horseradish peroxidase (HRP) caused a loss of enzyme activity and a corresponding increase in linoleic acid hydroperoxide formation activity. The hydroperoxide levels in model systems increased only in the early stages of the oxidation reaction and then declined as degradation became more significant. The presence of a dialysed blend of cooked swede markedly lowered the hydroperoxide level formed. Analysis of volatile compounds formed showed that hexanal predominated in a buffer system and in a blend of cooked turnip. In dialysed blends of cooked swede, hexanol was the primary volatile compound generated. After inactivation under mild conditions in the presence of EDTA, the peroxidases showed hydroperoxide formation activity and patterns of volatile compounds from linoleic acid that were similar to those found on heat-inactivation. This suggested that calcium abstraction from the peroxidases was critical for the enhancement of lipid oxidation activity. (C) 2002 Elsevier Science Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Commissioned print. Artist of the Month Club: February 2010. January Curator: Mark Beasley. Invisible Exports Gallery, New York. Archival inkjet print on metallic silver polyester, 841 x 643 mm. Edition of 50 + 10 AP. Subsequently exhibited in 'A Unicorn Basking in the Light of Three Glowing Suns', The Devos Art Museum, School of Art & Design at Northern Michigan University, October 8 – November 14, 2010, curated by Anthony Elms and Philip von Zweck.

Relevance:

20.00%

Publisher:

Abstract:

An AHRC-funded project titled: Picturing ideas? Visualising and Synthesising Ideas as art (2009-10). Outputs include: 4 exhibitions; 4 publications; 3 papers; 2 large-scale backlit digital prints; 1 commissioned print. (See Additional Information) ----ABSTRACT: Utilising the virtuality of digital imagery, this practice-led project explored the possibility of cross-articulation between text and image and the bridging or synthesising potential of the visual affect of ideas. A series of digital images was produced 'picturing' or 'visualising' philosophical ideas derived from the writings of the philosopher Gilles Deleuze, as remodellings of pre-existing philosophical ideas, developed through dialogues and consultation with specialists in the fields from which the ideas were drawn (philosophy, psychology, film) as well as with artists and theorists concerned with ideas of 'mental imagery' and visualisation. Final images were produced as a synthesis (or combination) of these visualisations and presented in the format of large-scale, backlit digital prints at a series of prestigious international exhibitions (see details above). Evaluation took the form of a four-page illustrated text in Frieze magazine (August 2009) and three papers delivered at the University of Ulster, Goldsmiths College of Art and Loughborough University. The project also included the publication of a catalogue essay (EAST 09) and an illustrated poem (in the Dark Monarch publication). A print version of the image was commissioned by Invisible Exports Gallery, New York, and subsequently exhibited at The Devos Art Museum, School of Art & Design at Northern Michigan University, and in a publication edited by Cedar Lewisohn for Tate Publishing. The project was funded by an AHRC practice-led grant (17K) and an Arts Council of England award (1.5K). The outputs, including high-profile, publicly accessible exhibitions, prestigious publications and conference papers, ensured the dissemination of the research to a wide range of audiences, including scholars/researchers across the arts and humanities engaged in practice-based and interdisciplinary theoretical work (in particular in the fields of contemporary art and art theory and those working on the integration of art and theory/philosophy/psychology) but also the wider audience for contemporary art.

Relevance:

20.00%

Publisher:

Abstract:

The Environmental Data Abstraction Library provides a modular data management library for bringing new and diverse datatypes together for visualisation within numerous software packages, including the ncWMS viewing service, which already has very wide international uptake. The structure of EDAL is presented along with examples of its use to compare satellite, model and in situ data types within the same visualisation framework. We emphasize the value of this capability for cross calibration of datasets and evaluation of model products against observations, including preparation for data assimilation.
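The sketch below is not EDAL's actual API (the library itself underpins ncWMS); it is a minimal, hypothetical Python illustration of the design idea described above, with all class and function names invented for the example: every data source, whatever its native format, is exposed to the visualisation layer through one common interface, which is what makes side-by-side comparison of satellite, model and in situ data straightforward.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List

# Hypothetical illustration of the "common abstraction" idea: every data
# source, whatever its native format, is exposed to the plotting layer
# through the same minimal interface.

@dataclass
class GridPoint:
    lon: float
    lat: float
    value: float

class Dataset(ABC):
    """Uniform view of a geospatial variable, independent of its source."""

    @abstractmethod
    def read_points(self, variable: str) -> List[GridPoint]:
        ...

class ModelDataset(Dataset):
    def read_points(self, variable: str) -> List[GridPoint]:
        # In reality this would extract a horizontal slice from a model grid.
        return [GridPoint(lon=-10.0 + i, lat=50.0, value=float(i)) for i in range(5)]

class InSituDataset(Dataset):
    def read_points(self, variable: str) -> List[GridPoint]:
        # In reality this would read irregular buoy or ship observations.
        return [GridPoint(lon=-8.5, lat=50.1, value=2.7)]

def render(datasets: List[Dataset], variable: str) -> None:
    """The visualisation layer only ever sees the common interface."""
    for ds in datasets:
        for p in ds.read_points(variable):
            print(f"{type(ds).__name__}: ({p.lon:.1f}, {p.lat:.1f}) -> {p.value}")

if __name__ == "__main__":
    render([ModelDataset(), InSituDataset()], "sea_surface_temperature")
```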

Relevance:

10.00%

Publisher:

Abstract:

The Ramsar site of Lake Uluabat, western Turkey, suffers from eutrophication, urban and industrial pollution and water abstraction, and its water levels are managed artificially. Here we combine monitoring and palaeolimnological techniques to investigate spatial and temporal limnological variability and ecosystem impact, using an ostracod and mollusc survey to strengthen interpretation of the fossil record. A combination of low invertebrate Biological Monitoring Working Party scores (<10) and the dominance of eutrophic diatoms in the modern lake confirms its poor ecological status. Palaeolimnological analysis of recent (last >200 yr) changes in organic and carbonate content, diatoms, stable isotopes, ostracods and molluscs in a lake sediment core (UL20A) indicates a 20th century trend towards increased sediment accumulation rates and eutrophication, which was probably initiated by deforestation and agriculture. The most marked ecological shift occurs in the early 1960s, however. A subtle rise in diatom-inferred total phosphorus and an inferred reduction in submerged aquatic macrophyte cover accompanies a major increase in sediment accumulation rate. An associated marked shift in ostracod stable isotope data, indicative of reduced seasonality and a change in hydrological input, suggests major impact from artificial water management practices, all of which appears to have culminated in the sustained loss of submerged macrophytes since 2000. The study indicates that it is vital to take both land-use and water management practices into account in devising restoration strategies. In a wider context, the results have important implications for the conservation of shallow karstic lakes, the functioning of which is still poorly understood. (c) 2008 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

This paper reports three experiments that examine the role of similarity processing in McGeorge and Burton's (1990) incidental learning task. In the experiments subjects performed a distractor task involving four-digit number strings, all of which conformed to a simple hidden rule. They were then given a forced-choice memory test in which they were presented with pairs of strings and were led to believe that one string of each pair had appeared in the prior learning phase. Although this was not the case, one string of each pair did conform to the hidden rule. Experiment 1 showed that, as in the McGeorge and Burton study, subjects were significantly more likely to select test strings that conformed to the hidden rule. However, additional analyses suggested that rather than having implicitly abstracted the rule, subjects may have been selecting strings that were in some way similar to those seen during the learning phase. Experiments 2 and 3 were designed to try to separate out effects due to similarity from those due to implicit rule abstraction. It was found that the results were more consistent with a similarity-based model than implicit rule abstraction per se.
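For concreteness, the sketch below mimics the forced-choice construction described above: study strings all obey a hidden rule, and each test pair contains one rule-conforming and one non-conforming string. The rule and the surface-similarity measure here are hypothetical stand-ins (not those used by McGeorge and Burton or in these experiments); the point is only to show how a rule-based chooser and a similarity-based chooser can be compared on the same test pairs.

```python
import random

# Illustrative sketch of the forced-choice test construction described above.
# The hidden rule and the similarity measure are hypothetical stand-ins.

def follows_rule(s: str) -> bool:
    """Assumed hidden rule for illustration: digit sum is even."""
    return sum(int(d) for d in s) % 2 == 0

def random_string() -> str:
    return "".join(random.choice("0123456789") for _ in range(4))

def sample(conforming: bool) -> str:
    """Draw a random four-digit string that does / does not obey the rule."""
    while True:
        s = random_string()
        if follows_rule(s) == conforming:
            return s

def similarity(s: str, studied: list) -> int:
    """Toy surface-similarity score: best positional overlap with any studied string."""
    return max(sum(a == b for a, b in zip(s, t)) for t in studied)

if __name__ == "__main__":
    random.seed(1)
    studied = [sample(True) for _ in range(20)]               # learning phase: rule-conforming only
    conforming, nonconforming = sample(True), sample(False)   # one forced-choice test pair
    # Two candidate accounts of the subject's choice:
    rule_choice = conforming                                   # implicit rule abstraction
    sim_choice = max((conforming, nonconforming),
                     key=lambda s: similarity(s, studied))     # similarity to studied strings
    print("rule-based choice:      ", rule_choice)
    print("similarity-based choice:", sim_choice)
```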

Relevance:

10.00%

Publisher:

Abstract:

A full assessment of para-virtualization is important, because without knowledge of the various overheads, users cannot judge whether using virtualization is a good idea or not. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as on para-virtualization. The idea is to see what the overheads of para-virtualization are, as well as looking at the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). In order to assess these virtualization systems, we run the benchmarks on bare metal, then on the para-virtualization, and finally we turn on monitoring and logging. The latter is important because users are interested in the Service Level Agreements (SLAs) used by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different systems: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading.

A functional virtualization system is multi-layered and is driven by the privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each Virtual Machine (VM) to make the best use of the available resources; the guest operating system schedules each application accordingly. Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application; that is, the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines; these guest operating systems are aware that they are running on a virtual machine and provide near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is an OS-assisted virtualization, in which some modifications are made in the guest operating system to enable better performance. In this kind of virtualization the guest operating system is aware that it is running on virtualized hardware and not on the bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system and reduce the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed.

It has been shown [0] that para-virtualization does not impose significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The "apparent" improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation. In order to support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly it will be necessary to observe application performance, both within a virtual machine and when executing on bare hardware. A further potential complication is associated with the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
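A minimal sketch of the kind of timing harness such a comparison needs, assuming a benchmark executable is available on each system; the command name below is a placeholder rather than the paper's actual Netlib invocation. The same harness is run on bare metal and inside the para-virtualized guest, and the relative difference of the mean run times gives the overhead.

```python
import statistics
import subprocess
import time

# Placeholder benchmark command; in practice the same Netlib binary would be
# built and executed identically on bare metal and inside each guest.
BENCHMARK_CMD = ["./linpack_bench"]   # hypothetical executable name
REPEATS = 5

def time_runs(cmd, repeats=REPEATS):
    """Return wall-clock times (seconds) for repeated runs of a command."""
    times = []
    for _ in range(repeats):
        start = time.perf_counter()
        subprocess.run(cmd, check=True, capture_output=True)
        times.append(time.perf_counter() - start)
    return times

if __name__ == "__main__":
    runs = time_runs(BENCHMARK_CMD)
    mean = statistics.mean(runs)
    print(f"mean {mean:.3f}s  stdev {statistics.stdev(runs):.3f}s over {len(runs)} runs")
    # Running this on bare metal and then inside the guest gives two means;
    # overhead (%) = 100 * (mean_guest - mean_bare) / mean_bare.
```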

Relevance:

10.00%

Publisher:

Abstract:

Grid workflow authoring tools are typically specific to particular workflow engines built into Grid middleware, or are application specific and are designed to interact with specific software implementations. g-Eclipse is a middleware independent Grid workbench that aims to provide a unified abstraction of the Grid and includes a Grid workflow builder to allow users to author and deploy workflows to the Grid. This paper describes the g-Eclipse Workflow Builder and its implementations for two Grid middlewares, gLite and GRIA, and a case study utilizing the Workflow Builder in a Grid user's scientific workflow deployment.

Relevance:

10.00%

Publisher:

Abstract:

Time-resolved kinetic studies of the reaction of silylene, SiH2, generated by laser flash photolysis of both silacyclopent-3-ene and phenylsilane, have been carried out to obtain second-order rate constants for its reaction with CH3Cl. The reaction was studied in the gas phase at six temperatures in the range 294-606 K. The second-order rate constants gave a curved Arrhenius plot with a minimum value at T ≈ 370 K. The reaction showed no pressure dependence in the presence of up to 100 Torr SF6. The rate constants, however, showed a weak dependence on laser pulse energy. This suggests an interpretation requiring more than one contributing reaction pathway to SiH2 removal. Apart from a direct reaction of SiH2 with CH3Cl, reaction of SiH2 with CH3 (formed by photodissociation of CH3Cl) seems probable, with contributions of up to 30% to the rates. Ab initio calculations (G3 level) show that the initial step of reaction of SiH2 with CH3Cl is formation of a zwitterionic complex (ylid), but a high-energy barrier rules out the subsequent insertion step. On the other hand, the Cl-abstraction reaction leading to CH3 + ClSiH2 has a low barrier, and therefore, this seems the most likely candidate for the main reaction pathway of SiH2 with CH3Cl. RRKM calculations on the abstraction pathway show that this process alone cannot account for the observed temperature dependence of the rate constants. The data are discussed in light of studies of other silylene reactions with haloalkanes.
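As background, a single elementary reaction obeying the Arrhenius law gives a straight line in ln k against 1/T; a curved plot with a minimum, as reported here, is what one expects when the observed rate constant is the sum of channels with different temperature dependences. A minimal formulation, with the two channels named only as the abstract suggests:

```latex
% Single-channel Arrhenius form: linear in ln k versus 1/T.
k(T) = A \exp\!\left(-\frac{E_a}{RT}\right)

% If two pathways contribute (direct reaction with CH3Cl plus reaction with
% photolytically generated CH3), the observed rate constant is their sum,
% and the combined Arrhenius plot can curve and pass through a minimum:
k_{\mathrm{obs}}(T) \approx k_{\mathrm{SiH_2+CH_3Cl}}(T) + k_{\mathrm{SiH_2+CH_3}}(T)
```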

Relevance:

10.00%

Publisher:

Abstract:

Irradiation of argon matrices at 12 K containing hydrogen peroxide and tetrachloroethene using the output from a medium-pressure mercury lamp gives rise to the carbonyl compound trichloroacetyl chloride (CCl3CClO). Similarly, trichloroethene gives dichloroacetyl chloride (CCl2HCClO) - predominantly in the gauche form - under the same conditions. It appears that the reaction is initiated by homolysis of the O-O bond of H2O2 to give OH radicals, one of which adds to the double bond of an alkene molecule. The reaction then proceeds by abstraction of the H atom of the hydroxyl group and Cl-atom migration. This mechanism has been explored by the use of DFT calculations to back up the experimental findings. The mechanism is analogous to that shown by the simple hydrocarbon alkenes.
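A schematic summary of the pathway described above, written only from the steps named in the abstract (the tetrachloroethene case is shown; the intermediate is the hydroxyalkyl radical implied by OH addition to the double bond):

```latex
\begin{align*}
\mathrm{H_2O_2} &\xrightarrow{h\nu} 2\,\mathrm{OH^{\bullet}} \\
\mathrm{OH^{\bullet} + Cl_2C{=}CCl_2} &\rightarrow \mathrm{{}^{\bullet}CCl_2{-}CCl_2OH} \\
\mathrm{{}^{\bullet}CCl_2{-}CCl_2OH} &\xrightarrow[\text{Cl migration}]{-\,\mathrm{H}} \mathrm{CCl_3C(O)Cl}
\end{align*}
```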

Relevance:

10.00%

Publisher:

Abstract:

The realisation that much of conventional modern architecture is not sustainable over the long term is not new. Typical approaches aim at using energy and materials more efficiently. However, by clearly understanding natural processes and their interactions with human needs, designers can create buildings that are delightful, functional, productive and regenerative by design. The paper aims to review the biomimetics literature that is relevant to building materials and design. Biomimetics, the abstraction of good design from Nature, is an enabling interdisciplinary science, particularly interested in the emergent properties of materials and structures that result from their hierarchical organisation. Biomimetics provides ideas relevant to: graded functionality of materials (nano-scale); adaptive response (nano-, micro- and macro-scales); integrated intelligence (sensing and actuation at all scales); architecture; and additional functionality. There are many examples in biology where the emergent response of plants and animals to temperature, humidity and other changes in their physical environments is based on relatively simple physical principles. However, it is the implementation of design solutions which exploit these principles that should provide the inspiration for man-made structures. We analyse specific examples of sustainability from Nature and the benefits or value that these solutions have brought to different creatures. By doing this, we appreciate how the natural world fits into the world of sustainable buildings and how, as building engineers, we can value its true application in delivering sustainable buildings.

Relevance:

10.00%

Publisher:

Abstract:

Fingerprinting is a well-known approach for identifying multimedia data without having the original data present, working instead from what amounts to its essence or “DNA”. Current approaches show insufficient deployment of three types of knowledge that could be brought to bear in providing a fingerprinting framework that remains effective and efficient, and that can accommodate protection of both the whole and its elements at appropriate levels of abstraction to suit various Foci of Interest (FoI) in an image or cross-media artefact. Our proposed framework therefore aims to deliver selective composite fingerprinting that remains responsive to the requirements for protection of the whole or parts of an image which may be of particular interest and be especially vulnerable to attempts at rights violation. This is powerfully aided by leveraging both multi-modal information and a rich spectrum of collateral context knowledge, including image-level collaterals as well as the inevitably needed market-intelligence knowledge, such as profiling of customers' social-network interests, which we deploy as a crucial component of our Fingerprinting Collateral Knowledge. This is used in selecting the special FoIs within an image or other media content that have to be selectively and collaterally protected.
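As a concrete, if much simpler, point of reference, the sketch below computes an "average hash" fingerprint over a chosen region of an image (a single Focus of Interest) using Pillow. It is an illustrative stand-in, not the composite, collateral-knowledge-driven scheme proposed here; the file name and crop box are hypothetical.

```python
from PIL import Image

# Minimal sketch of one common fingerprinting building block: an "average hash"
# computed over a chosen region (a Focus of Interest). Illustrative only.

def average_hash(image: Image.Image, box=None, hash_size=8) -> int:
    """Return a hash_size*hash_size-bit perceptual fingerprint of a region."""
    region = image.crop(box) if box else image
    small = region.convert("L").resize((hash_size, hash_size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits; small distances suggest the same content."""
    return bin(a ^ b).count("1")

if __name__ == "__main__":
    img = Image.open("example.jpg")                      # hypothetical input image
    whole = average_hash(img)
    roi = average_hash(img, box=(100, 100, 300, 300))    # hypothetical Focus of Interest
    print(f"whole-image fingerprint: {whole:016x}")
    print(f"region fingerprint:      {roi:016x}")
```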

Relevance:

10.00%

Publisher:

Abstract:

Utilising the expressive power of S-Expressions in Learning Classifier Systems often prohibitively increases the search space due to the increased flexibility of the encoding. This work shows that selection of appropriate S-Expression functions through domain knowledge improves scaling in problems, as expected. It is also known that simple alphabets perform well on relatively small problems in a domain, e.g. the ternary alphabet in the 6-, 11- and 20-bit MUX domains. Once fit ternary rules had been formed, it was investigated whether higher-order learning was possible and whether this staged learning facilitated selection of appropriate functions in complex alphabets, e.g. selection of S-Expression functions. This novel methodology is shown to provide compact results (135-MUX) and exhibits potential for scaling well (1034-MUX), but is only a small step towards introducing abstraction to LCS.
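For readers unfamiliar with the benchmark, the sketch below shows the multiplexer (MUX) function and a ternary-alphabet rule match of the kind referred to above; it is illustrative only and does not reproduce the paper's S-Expression classifier system.

```python
# Minimal sketch of the multiplexer benchmark and ternary-rule matching that
# LCS work of this kind builds on; illustrative only.

def mux(bits: str, k: int = 2) -> int:
    """k-address-bit multiplexer: the first k bits select one of the
    remaining 2**k data bits (k=2 gives the 6-bit MUX)."""
    assert len(bits) == k + 2 ** k
    address = int(bits[:k], 2)
    return int(bits[k + address])

def matches(rule: str, bits: str) -> bool:
    """Ternary-alphabet condition: '0'/'1' must match, '#' is don't-care."""
    return all(r == "#" or r == b for r, b in zip(rule, bits))

if __name__ == "__main__":
    # Example: address '01' selects data bit 1 (the fourth character overall).
    print(mux("010100"))            # -> 1
    # A maximally general, accurate condition for that case in the ternary alphabet:
    rule = "01#1##"
    print(matches(rule, "010100"))  # -> True
```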