21 results for Data Storage

in CentAUR: Central Archive, University of Reading - UK


Relevance:

60.00%

Abstract:

The accurate prediction of storms is vital to the oil and gas sector for the management of its operations. An overview of research exploring the prediction of storms by ensemble prediction systems is presented and its application to the oil and gas sector is discussed. The analysis method used requires larger amounts of data storage and computer processing time than more conventional analysis methods. To overcome these difficulties, eScience techniques have been utilised. These techniques potentially have applications in the oil and gas sector, helping to incorporate environmental data into its information systems.

Relevance:

60.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

Relevance:

60.00%

Abstract:

The discourse surrounding the virtual has moved away from the utopian thinking that accompanied the rise of the Internet in the 1990s. The cyber-gurus of the last decades promised a technotopia removed from materiality and the confines of the flesh and the built environment, a liberation from old institutions and power structures. But since then, the virtual has grown into a distinct yet related sphere of cultural and political production that both parallels and occasionally flows over into the old world of material objects. The strict dichotomy of matter and digital purity has more recently been replaced with a more complex model in which the world of stuff and the world of knowledge support, resist and at the same time contain each other. Online social networks amplify and extend existing ones; other cultural interfaces like YouTube have not replaced the communal experience of watching moving images in a semi-public space (the cinema) or a semi-private one (the family living room). Rather, the experience of viewing is very much about sharing and communicating, offering interpretations and comments. Many of the web's strongest entities (Amazon, eBay, Gumtree etc.) sit exactly at this juncture, applying tools taken from the knowledge management industry to organise the chaos of the material world along (post-)Fordist rationality. Since the early 1990s there have been many artistic and curatorial attempts to use the Internet as a platform for producing and exhibiting art, but many of these were reluctant to let go of the fantasy of digital freedom.

Storage Room collapses the binary opposition of real and virtual space by using online data storage as a conduit for IRL art production. The artworks here will not be available for viewing online in a 'screen' environment but only as part of a downloadable package, with the intention that the exhibition could be displayed (in a physical space) by any interested party and realised as ambitiously or minimally as the downloader wishes, based on their means. The artists will therefore also supply a set of instructions for the physical installation of the work alongside the digital files. In response to this curatorial initiative, File Transfer Protocol invites seven UK-based artists to produce digital art for a physical environment, addressing the intersection between the virtual and the material. The files range from sound, video, digital prints and net art to blueprints for an action to take place, something to be made, a conceptual text piece, etc.

About the works and artists:

Polly Fibre is the pseudonym of London-based artist Christine Ellison. Ellison creates live music using domestic devices such as sewing machines, irons and slide projectors. Her costumes and stage sets propose a physical manifestation of the virtual space that is created inside software like Photoshop. For this exhibition, Polly Fibre invites the audience to create a musical composition using a pair of amplified scissors and a turntable. http://www.pollyfibre.com

John Russell, a founding member of 1990s art group Bank, is an artist, curator and writer whose work explores the contemporary political conditions of the work of art. In his digital print, Russell collages together visual representations of abstract philosophical ideas and transforms them into a post-apocalyptic landscape that is complex and banal at the same time. www.john-russell.org

The work of Bristol-based artist Jem Noble opens up a dialogue between the contemporary and the legacy of 20th-century conceptual art, around questions of collectivism and participation, authorship and individualism. His print SPACE concretizes the representation of the most common piece of Unicode: the vacant space between words. In this way, the gap itself turns from invisible cipher to sign. www.jemnoble.com

Annabel Frearson is rewriting Mary Shelley's Frankenstein using all and only the words from the original text. Frankenstein 2, or the Monster of Main Stream, is read in parts by different performers, embodying the psychotic character of the protagonist, a mongrel hybrid of used language. www.annabelfrearson.com

Darren Banks uses fragments of effect-laden Hollywood films to create an impossible space. The fictitious parts don't add up to a convincing material reality, leaving the viewer with a failed amalgamation of simulations of sophisticated technologies. www.darrenbanks.co.uk

FIELDCLUB is a collaboration between artist Paul Chaney and researcher Kenna Hernly. Together, Chaney and Hernly developed a project that critically examines various proposals for the management of sustainable ecological systems. Their FIELDMACHINE invites the public to design an ideal agricultural field. By playing with different types of crops found in the south west of England, the user can, for example, create a balanced but protein-poor diet, or simply decide to 'get rid' of half the population. The meeting point of the Platonic field and its physical consequences generates a geometric abstraction that investigates the relationship between modernist utopianism and contemporary actuality. www.fieldclub.co.uk

Pil and Galia Kollectiv, who have also curated the exhibition, are London-based artists who run the xero, kline & coma gallery. Here they present a dialogue between two computers. The conversation opens with a simple textbook problem in business studies, but gradually the language, mimicking the application of game theory in the business sector, becomes more abstract. The two interlocutors become adversaries trapped forever in a competition without winners. www.kollectiv.co.uk

Relevance:

40.00%

Abstract:

It is generally assumed that the variability of neuronal morphology has an important effect on both the connectivity and the activity of the nervous system, but this effect has not been thoroughly investigated. Neuroanatomical archives represent a crucial tool to explore structure-function relationships in the brain. We are developing computational tools to describe, generate, store and render large sets of three-dimensional neuronal structures in a format that is compact, quantitative, accurate and readily accessible to the neuroscientist. Single-cell neuroanatomy can be characterized quantitatively at several levels. In computer-aided neuronal tracing files, a dendritic tree is described as a series of cylinders, each represented by diameter, spatial coordinates and the connectivity to other cylinders in the tree. This 'Cartesian' description constitutes a completely accurate mapping of dendritic morphology but it bears little intuitive information for the neuroscientist. In contrast, a classical neuroanatomical analysis characterizes neuronal dendrites on the basis of the statistical distributions of morphological parameters, e.g. maximum branching order or bifurcation asymmetry. This description is intuitively more accessible, but it only yields information on the collective anatomy of a group of dendrites, i.e. it is not complete enough to provide a precise 'blueprint' of the original data. We are adopting a third, intermediate level of description, which consists of the algorithmic generation of neuronal structures within a certain morphological class based on a set of 'fundamental', measured parameters. This description is as intuitive as a classical neuroanatomical analysis (parameters have an intuitive interpretation), and as complete as a Cartesian file (the algorithms generate and display complete neurons). The advantages of the algorithmic description of neuronal structure are immense. If an algorithm can measure the values of a handful of parameters from an experimental database and generate virtual neurons whose anatomy is statistically indistinguishable from that of their real counterparts, a great deal of data compression and amplification can be achieved. Data compression results from the quantitative and complete description of thousands of neurons with a handful of statistical distributions of parameters. Data amplification is possible because, from a set of experimental neurons, many more virtual analogues can be generated. This approach could allow one, in principle, to create and store a neuroanatomical database containing data for an entire human brain in a personal computer. We are using two programs, L-NEURON and ARBORVITAE, to investigate systematically the potential of several different algorithms for the generation of virtual neurons. Using these programs, we have generated anatomically plausible virtual neurons for several morphological classes, including guinea pig cerebellar Purkinje cells and cat spinal cord motor neurons. These virtual neurons are stored in an online electronic archive of dendritic morphology. This process highlights the potential and the limitations of the 'computational neuroanatomy' strategy for neuroscience databases.
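The 'Cartesian' description above (cylinders carrying a diameter, spatial coordinates and connectivity) maps directly onto a small data structure. A minimal sketch in Python, assuming an SWC-style record layout (the field names are illustrative, not the archive's actual format), with one classical morphometric parameter computed from it:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Compartment:
    """One cylinder of a digitised dendritic tree (SWC-style record)."""
    ident: int     # unique id of this cylinder
    x: float       # spatial coordinates of the distal end (um)
    y: float
    z: float
    radius: float  # half the cylinder diameter (um)
    parent: int    # id of the parent cylinder (-1 marks the root/soma)

def branching_order(tree: List[Compartment], ident: int) -> int:
    """Number of bifurcations between a compartment and the root: one of the
    classical morphometric parameters mentioned in the abstract."""
    by_id: Dict[int, Compartment] = {c.ident: c for c in tree}
    children: Dict[int, List[int]] = {}
    for c in tree:
        children.setdefault(c.parent, []).append(c.ident)
    order, node = 0, by_id[ident]
    while node.parent != -1:
        if len(children[node.parent]) > 1:  # the parent is a bifurcation point
            order += 1
        node = by_id[node.parent]
    return order

soma = Compartment(1, 0.0, 0.0, 0.0, 10.0, -1)
d1 = Compartment(2, 5.0, 0.0, 0.0, 1.0, 1)
d2 = Compartment(3, 0.0, 5.0, 0.0, 1.0, 1)
print(branching_order([soma, d1, d2], 3))  # -> 1 (one bifurcation at the soma)
```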

Relevance:

40.00%

Abstract:

This paper reviews the literature concerning the practice of using Online Analytical Processing (OLAP) systems to recall information stored by Online Transactional Processing (OLTP) systems. The review provides a basis for discussing the need for information recalled through OLAP systems to maintain the contexts of the transactions captured by the respective OLTP system. The paper observes an industry trend in which OLTP systems process information into data that are then stored in databases without the business rules used to capture them. This necessitates a practice whereby sets of business rules are used to extract, cleanse, transform and load data from disparate OLTP systems into OLAP databases to support the requirements for complex reporting and analytics. These sets of business rules are usually not the same as the business rules used to capture data in particular OLTP systems. The paper argues that differences between the business rules used to interpret the same data sets risk semantic gaps between information captured by OLTP systems and information recalled through OLAP systems. Literature concerning the modelling of business transaction information as facts with context was reviewed to identify design trends that contribute to the design quality of OLTP and OLAP systems. The paper then argues that the quality of OLTP and OLAP systems design depends critically on the capture of facts with associated context; the encoding of facts with contexts into data with business rules; the storage and sourcing of data with business rules; the decoding of data with business rules back into facts with context; and the recall of facts with their associated contexts. The paper proposes UBIRQ, a design model to aid the co-design of data and business rules storage for OLTP and OLAP purposes. The proposed design model provides the opportunity to implement and use multi-purpose databases and business rules stores for OLTP and OLAP systems. Such implementations would enable OLTP systems to record and store data together with the executions of business rules, allowing both OLTP and OLAP systems to query data with the business rules used to capture them, thereby ensuring that information recalled via OLAP systems preserves the contexts of transactions as captured by the respective OLTP system.
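The UBIRQ model itself is defined in the paper; the sketch below only illustrates the underlying idea of storing facts together with the business rules used to capture them, using a hypothetical SQLite schema and rule format:

```python
import json
import sqlite3

# Hypothetical schema: every fact row carries the id and version of the
# business rule under which it was captured, so an OLAP-side query can
# recall the transaction together with its original context.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE business_rule (
    rule_id    TEXT,
    version    INTEGER,
    definition TEXT,                 -- the rule itself, stored with the data
    PRIMARY KEY (rule_id, version)
);
CREATE TABLE fact (
    fact_id      INTEGER PRIMARY KEY,
    payload      TEXT,               -- the captured transaction data
    rule_id      TEXT,
    rule_version INTEGER,
    FOREIGN KEY (rule_id, rule_version) REFERENCES business_rule (rule_id, version)
);
""")
conn.execute("INSERT INTO business_rule VALUES ('discount', 1, 'price > 100 => 5% off')")
conn.execute(
    "INSERT INTO fact (payload, rule_id, rule_version) VALUES (?, 'discount', 1)",
    (json.dumps({"order": 42, "price": 120}),),
)

# Recall each fact with the rule that gave it meaning at capture time.
query = """SELECT f.payload, r.definition
           FROM fact f
           JOIN business_rule r ON f.rule_id = r.rule_id AND f.rule_version = r.version"""
for payload, rule in conn.execute(query):
    print(payload, "captured under:", rule)
```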

Relevance:

30.00%

Abstract:

Two-dimensional flood inundation modelling is a widely used tool to aid flood risk management. In urban areas, where asset value and population density are greatest, the model spatial resolution required to represent flows through a typical street network (i.e. < 10 m) often results in impractical computational cost at the whole-city scale. Explicit diffusive storage cell models become very inefficient at such high resolutions, relative to shallow water models, because the stable time step in such schemes scales quadratically with resolution. This paper presents the calibration and evaluation of a recently developed formulation of the LISFLOOD-FP model, in which stability is controlled by the Courant-Friedrichs-Lewy condition for the shallow water equations, such that the stable time step instead scales linearly with resolution. The case study is based on observations during the summer 2007 floods in Tewkesbury, UK. Aerial photography is available for model evaluation on three separate days from the 24th to the 31st of July. The model covered a 3.6 km by 2 km domain and was calibrated using gauge data from high flows during the previous month. The new formulation was benchmarked against the original version of the model at 20 m and 40 m resolutions, demonstrating equally accurate performance given the available validation data but with 67x faster computation. The July event was then simulated at the 2 m resolution of the available airborne LiDAR DEM. This resulted in a significantly more accurate simulation of the drying dynamics compared with the coarse-resolution models, although estimates of peak inundation depth were similar.
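The linear scaling follows directly from the Courant-Friedrichs-Lewy condition, dt <= alpha * dx / sqrt(g * h): halving the grid spacing halves the stable time step, rather than quartering it as in a diffusive scheme. A minimal sketch (the alpha value and flow depth here are illustrative, not the paper's calibrated values):

```python
import math

def stable_time_step(dx: float, depth: float, alpha: float = 0.7) -> float:
    """Largest stable time step under the CFL condition for the shallow water
    equations: dt <= alpha * dx / sqrt(g * h)."""
    g = 9.81  # gravitational acceleration (m s^-2)
    return alpha * dx / math.sqrt(g * depth)

for dx in (40.0, 20.0, 2.0):  # the resolutions used in the case study (m)
    print(f"dx = {dx:5.1f} m  ->  dt <= {stable_time_step(dx, depth=5.0):.2f} s")
```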

Relevance:

30.00%

Abstract:

This review summarizes the recent discovery of the cupin superfamily (from the Latin term "cupa," a small barrel) of functionally diverse proteins that initially were limited to several higher plant proteins such as seed storage proteins, germin (an oxalate oxidase), germin-like proteins, and auxin-binding protein. Knowledge of the three-dimensional structure of two vicilins, seed proteins with a characteristic beta-barrel core, led to the identification of a small number of conserved residues and thence to the discovery of several microbial proteins which share these key amino acids. In particular, there is a highly conserved pattern of two histidine-containing motifs with a varied intermotif spacing. This cupin signature is found as a central component of many microbial proteins including certain types of phosphomannose isomerase, polyketide synthase, epimerase, and dioxygenase. In addition, the signature has been identified within the N-terminal effector domain in a subgroup of bacterial AraC transcription factors. As well as these single-domain cupins, this survey has identified other classes of two-domain bicupins, including bacterial gentisate 1,2-dioxygenases and 1-hydroxy-2-naphthoate dioxygenases, fungal oxalate decarboxylases, and legume sucrose-binding proteins. Cupin evolution is discussed from the perspective of the structure-function relationships, using data from the genomes of several prokaryotes, especially Bacillus subtilis. Many of these functions involve aspects of sugar metabolism and cell wall synthesis and are concerned with responses to abiotic stress such as heat, desiccation, or starvation. Particular emphasis is also given to the oxalate-degrading enzymes from microbes, their biological significance, and their value in a range of medical and other applications.
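As a purely illustrative sketch of how such a bipartite signature can be scanned for, the pattern below renders the two histidine-containing motifs as a simplified regular expression with a variable inter-motif spacer; the exact cupin consensus should be taken from the review itself, not from this approximation:

```python
import re

# Simplified, illustrative rendering of the bipartite cupin signature: two
# histidine-containing motifs separated by a spacer of variable length.
# These patterns are approximations for demonstration only.
MOTIF_1 = r"G.{5}H.H.{3,4}E.{6}G"   # first motif: two conserved histidines
MOTIF_2 = r"G.{5}P.G.{2}H.{3}N"     # second motif: the third histidine
SIGNATURE = re.compile(MOTIF_1 + r".{15,50}" + MOTIF_2)  # varied intermotif spacing

def has_cupin_like_signature(sequence: str) -> bool:
    """Scan a protein sequence (one-letter amino acid codes) for the signature."""
    return SIGNATURE.search(sequence) is not None
```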

Relevance:

30.00%

Abstract:

The aim of this study was to evaluate the survivability of Bifidobacterium breve NCIMB 702257 in three malt-based media supplemented with cysteine and yeast extract, and to determine the protective effect of these growth factors. A number of parameterised mathematical models were used to predict the kinetics of viability and total acidity during storage at different temperatures. Results demonstrated a good fit between the models and the experimental data. The Arrhenius equations showed only reasonable fits, and the polynomial plots contained a large area without data between 4 and 25 degrees C. In addition, it was shown that cysteine promotes growth and acid production by bifidobacteria, but does not extend survivability. On the other hand, increasing the yeast extract content of the fermentation media enhances the survivability of B. breve. To our knowledge, this is the first study to address the modelling of the survivability of probiotic bacteria in a cereal-based fermentation medium at different temperatures, introducing a more quantitative approach to the study of the shelf-life of a probiotic product.
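An Arrhenius-type model of the kind mentioned relates the rate of viability loss to storage temperature via k = A exp(-Ea / RT). A minimal sketch with invented parameter values (not the values fitted in the study):

```python
import math

R = 8.314  # universal gas constant (J mol^-1 K^-1)

def inactivation_rate(temp_c: float, a: float = 1e12, ea: float = 8.0e4) -> float:
    """Arrhenius-type temperature dependence of the viability-loss rate:
    k = A * exp(-Ea / (R * T)).  A (per day) and Ea are invented values."""
    return a * math.exp(-ea / (R * (temp_c + 273.15)))

def viable_count(n0: float, temp_c: float, days: float) -> float:
    """First-order decay of viable counts during storage: N(t) = N0 exp(-k t)."""
    return n0 * math.exp(-inactivation_rate(temp_c) * days)

for temp in (4.0, 25.0):  # the storage temperatures bracketing the data gap
    print(f"{temp:4.1f} degC: {viable_count(1e9, temp, days=28):.3g} cfu/ml after 28 days")
```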

Relevance:

30.00%

Abstract:

The iRODS system, created by the San Diego Supercomputer Center, is a rule-oriented data management system that allows the user to create sets of rules to define how data are to be managed. Each rule corresponds to a particular action or operation (such as checksumming a file) and the system is flexible enough to allow the user to create new rules for new types of operations. The iRODS system can interface to any storage system (provided an iRODS driver is built for that system) and relies on its metadata catalogue to provide a virtual file system that can handle files of any size and type. However, some storage systems (such as tape systems) do not handle small files efficiently and prefer small files to be packaged (or "bundled") into larger units. We have developed a system that can bundle small data files of any type into larger units: mounted collections. The system can create collection families and contains its own extensible metadata, including metadata on which family a collection belongs to. The mounted collection system can work standalone and is being incorporated into the iRODS system to enhance the system's flexibility in handling small files. In this paper we describe the motivation for creating a mounted collection system, its architecture and how it has been incorporated into the iRODS system. We describe the different technologies used to create the mounted collection system and provide some performance numbers.
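A minimal sketch of the bundling idea, assuming a tar archive as the container and a JSON sidecar for the collection-level metadata (both choices are illustrative; the paper describes its own technologies):

```python
import json
import tarfile
from pathlib import Path
from typing import List

def bundle_collection(files: List[Path], bundle: Path, family: str) -> None:
    """Package many small files into a single 'mounted collection' so that a
    tape store only ever sees one large object.  Collection-level metadata,
    including the family the collection belongs to, travels inside the bundle.
    (Illustrative only; not the actual mounted-collection implementation.)"""
    metadata = {
        "family": family,
        "members": [{"name": f.name, "size": f.stat().st_size} for f in files],
    }
    with tarfile.open(bundle, "w") as tar:
        meta_path = bundle.with_suffix(".meta.json")
        meta_path.write_text(json.dumps(metadata, indent=2))
        tar.add(meta_path, arcname="collection.meta.json")  # metadata first
        for f in files:
            tar.add(f, arcname=f.name)                      # then the members
```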

Relevance:

30.00%

Abstract:

This paper describes a prototype grid infrastructure, called the eMinerals minigrid, for molecular simulation scientists, which is based on an integration of shared compute and data resources. We describe the key components, namely the use of Condor pools; Linux/Unix clusters with PBS and IBM's LoadLeveler job handling tools; the use of Globus for security handling; the use of Condor-G tools for wrapping Globus job submit commands; Condor's DAGMan tool for handling workflow; the Storage Resource Broker for handling data; and the CCLRC dataportal and associated tools for both archiving data with metadata and making data available to other workers.
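As a sketch of how the DAGMan workflow layer fits over Condor-G, the snippet below generates a simple linear DAG description; the stage names and submit files are hypothetical:

```python
# Minimal sketch of the workflow layer: emit a Condor DAGMan description
# in which each node is a submit file (here, hypothetically, a Condor-G
# wrapper around a Globus job submission).  Stage names are invented.
stages = ["stage_in", "simulate", "archive"]  # fetch data, run job, store results

with open("eminerals.dag", "w") as dag:
    for stage in stages:
        dag.write(f"JOB {stage} {stage}.sub\n")          # one node per submit file
    for parent, child in zip(stages, stages[1:]):
        dag.write(f"PARENT {parent} CHILD {child}\n")    # linear dependency chain
```

The generated file would be handed to condor_submit_dag, which releases each node's job once its parents have completed.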

Relevance:

30.00%

Abstract:

Burst timing synchronisation is maintained in a digital data decoder during multiple-burst reception in a TDMA system. The data within a multiple burst are streamed into memory storage, and data corresponding to the first burst in the series are selected on the basis of a current timing estimate derived from a synchronisation burst. Selections of data corresponding to other bursts in the series are modified in accordance with updated timing estimates derived from previously processed bursts.
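A loose, self-contained illustration of the update scheme described, as a first-order tracking loop; the gain and the measured offsets are invented, and the actual decoder's error detector is not specified here:

```python
def update_timing_estimate(estimate: float, measured_offset: float,
                           gain: float = 0.2) -> float:
    """First-order tracking loop: nudge the current timing estimate toward the
    offset measured on the burst just processed.  The gain value is invented
    and trades responsiveness against noise sensitivity."""
    return estimate + gain * measured_offset

# Hypothetical timing errors measured on successive bursts (symbol periods).
measured_offsets = [0.30, 0.24, 0.18, 0.15, 0.11]
estimate = 0.0  # initial estimate derived from the synchronisation burst
for burst, offset in enumerate(measured_offsets, start=1):
    estimate = update_timing_estimate(estimate, offset)
    print(f"burst {burst}: timing estimate = {estimate:+.3f}")
```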

Relevance:

30.00%

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data, which is used to populate the second component, a data warehouse that contains important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. The latter is an important and challenging aspect of P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.

Relevance:

30.00%

Abstract:

Reduced flexibility of low-carbon generation could pose new challenges for future energy systems. Both demand response and distributed storage may have a role to play in supporting future system balancing. This paper reviews how these technically different, but functionally similar, approaches compare and compete with one another. Household survey data are used to test the effectiveness of price signals in delivering demand responses for appliances with a high degree of agency. The underlying unit of storage for different demand response options is discussed, with particular focus on the ability to enhance demand-side flexibility in the residential sector. We conclude that a broad range of options, with different modes of storage, may need to be considered if residential demand flexibility is to be maximised.

Relevance:

30.00%

Abstract:

What are the precise brain regions supporting the short-term retention of verbal information? A previous functional magnetic resonance imaging (fMRI) study suggested that they may be topographically variable across individuals, occurring, in most, in regions posterior to prefrontal cortex (PFC), and that detection of these regions may be best suited to a single-subject (SS) approach to fMRI analysis (Feredoes and Postle, 2007). In contrast, other studies using spatially normalized group-averaged (SNGA) analyses have localized storage-related activity to PFC. To evaluate the necessity of the regions identified by these two methods, we applied repetitive transcranial magnetic stimulation (rTMS) to SS- and SNGA-identified regions throughout the retention period of a delayed letter-recognition task. Results indicated that rTMS targeting SS analysis-identified regions of left perisylvian and sensorimotor cortex impaired performance, whereas rTMS targeting the SNGA-identified region of left caudal PFC had no effect on performance. Our results support the view that the short-term retention of verbal information can be supported by regions associated with acoustic, lexical, phonological, and speech-based representation of information. They also suggest that the brain bases of some cognitive functions may be better detected by SS than by SNGA approaches to fMRI data analysis.

Relevance:

30.00%

Abstract:

During the transition from endo-dormancy to eco-dormancy and subsequent growth, the onion bulb undergoes the transition from sink organ to source, to sustain cell division in the meristematic tissue. The mechanisms controlling these processes are not fully understood. Here, a detailed analysis of whole onion bulb physiological, biochemical and transcriptional changes in response to sprouting is reported, enabling a better understanding of the mechanisms regulating post-harvest onion sprout development. Biochemical and physiological analyses were conducted on different cultivars ('Wellington', 'Sherpa' and 'Red Baron') grown at different sites over 3 years, cured at different temperatures (20, 24 and 28 degrees C) and stored under different regimes (1, 3, 6 and 6 1 degrees C). In addition, the first onion oligonucleotide microarray was developed to determine differential gene expression in onion during curing and storage, so that transcriptional changes could support the biochemical and physiological analyses. There were greater transcriptional differences between samples taken at harvest and before sprouting than between samples taken before and after sprouting, with some significant changes occurring during the relatively short curing period. These changes are likely to represent the transition from endo-dormancy to sprout suppression, and suggest that endo-dormancy is a relatively short period ending just after curing. Principal component analysis of the biochemical and physiological data identified the ratio of monosaccharides (fructose and glucose) to disaccharide (sucrose), along with the concentration of zeatin riboside, as important factors in discriminating between sprouting and pre-sprouting bulbs. These detailed analyses provide novel insights into key regulatory triggers of sprout dormancy release in onion bulbs and offer the potential for developing biochemical or transcriptional markers for sprout initiation. Evidence presented herein also suggests that curing at 20 degrees C has no detrimental effect on bulb storage life and quality, producing a considerable saving in energy and costs.
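As an illustration of the principal component analysis step, the sketch below separates hypothetical sprouting and pre-sprouting bulbs on the two variables the abstract highlights; all numbers are invented:

```python
import numpy as np

# Hypothetical per-bulb measurements: [monosaccharide:disaccharide ratio,
# zeatin riboside concentration] -- the two discriminating variables the
# abstract reports.  All values are invented for illustration.
pre_sprouting = np.array([[1.1, 0.20], [1.3, 0.25], [1.2, 0.22]])
sprouting     = np.array([[2.4, 0.55], [2.7, 0.60], [2.5, 0.52]])
X = np.vstack([pre_sprouting, sprouting])

# PCA via singular value decomposition of the centred data matrix.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt.T  # bulb coordinates in principal-component space

print("PC1 scores (pre-sprouting, then sprouting):", np.round(scores[:, 0], 3))
# Clear separation along PC1 mirrors the discrimination reported above.
```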