891 results for pacs: information storage and retrieval
Abstract:
The long-term stability, high accuracy, all-weather capability, high vertical resolution, and global coverage of Global Navigation Satellite System (GNSS) radio occultation (RO) suggest it as a promising tool for global monitoring of atmospheric temperature change. With the aim of investigating and quantifying how well a GNSS RO observing system can detect climate trends, we are currently performing a (climate) observing system simulation experiment over the 25-year period 2001 to 2025, which involves quasi-realistic modeling of the neutral atmosphere and the ionosphere. We carried out two climate simulations with the general circulation model MAECHAM5 (Middle Atmosphere European Centre/Hamburg Model Version 5) of the MPI-M Hamburg, covering the period 2001–2025: one control run with natural variability only and one run also including anthropogenic forcings due to greenhouse gases, sulfate aerosols, and tropospheric ozone. On this basis, we perform quasi-realistic simulations of RO observables for a small GNSS receiver constellation (six satellites), state-of-the-art data processing for atmospheric profile retrieval, and a statistical analysis of temperature trends in both the “observed” climatology and the “true” climatology. Here we describe the setup of the experiment and results from a test bed study conducted to obtain a basic set of realistic estimates of observational errors (instrument- and retrieval-processing-related errors) and sampling errors (due to spatial-temporal undersampling). The test bed results, obtained for a typical summer season and compared to the climatic 2001–2025 trends from the MAECHAM5 simulation including anthropogenic forcing, were found encouraging for performing the full 25-year experiment.
They indicated that observational and sampling errors (both contributing about 0.2 K) are consistent with recent estimates of these errors from real RO data, and that they should be sufficiently small for monitoring expected temperature trends in the global atmosphere over the next 10 to 20 years in most regions of the upper troposphere and lower stratosphere (UTLS). Inspection of the MAECHAM5 trends in different RO-accessible atmospheric parameters (microwave refractivity and pressure/geopotential height in addition to temperature) indicates complementary climate change sensitivity in different regions of the UTLS, so that optimized climate monitoring should combine information from all climatic key variables retrievable from GNSS RO data.
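The error magnitudes quoted in this abstract make the detection problem concrete: with a combined observational-plus-sampling uncertainty of roughly 0.2 K, a multi-decadal UTLS temperature trend can be estimated by ordinary least squares and compared against its standard error. The sketch below is illustrative only; the trend value, record length and noise level are assumptions, not results from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: 25 annual-mean UTLS temperature anomalies (2001-2025)
# with an assumed trend of 0.3 K/decade, plus a combined observational and
# sampling error of ~0.2 K (one sigma), the magnitude quoted in the abstract.
years = np.arange(2001, 2026)
true_trend = 0.03   # K/yr (0.3 K/decade, illustrative value)
error_sigma = 0.2   # K, observational + sampling error combined
anomalies = true_trend * (years - years[0]) + rng.normal(0.0, error_sigma, years.size)

# Ordinary least-squares trend estimate and its standard error
t = years - years.mean()                      # centred time axis
slope = np.sum(t * anomalies) / np.sum(t * t)
residuals = anomalies - anomalies.mean() - slope * t
se = np.sqrt(np.sum(residuals**2) / (years.size - 2) / np.sum(t * t))

print(f"estimated trend: {slope*10:.2f} +/- {se*10:.2f} K/decade")
```

With 25 annual samples and 0.2 K noise, the standard error of the trend is well below the assumed signal, which is the sense in which the errors are "sufficiently small" for trend monitoring.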
Abstract:
The accurate prediction of storms is vital to the oil and gas sector for the management of its operations. An overview of research exploring the prediction of storms by ensemble prediction systems is presented, and its application to the oil and gas sector is discussed. The analysis method used requires larger amounts of data storage and computer processing time than other, more conventional analysis methods. To overcome these difficulties, eScience techniques have been utilised. These techniques potentially have applications in the oil and gas sector, helping to incorporate environmental data into its information systems.
Abstract:
Most priming studies have been conducted on commercial seed lots of unspecified uniformity and maturity, and subsequent seed longevity has been reported to both increase and decrease. Here a seed lot of Digitalis purpurea L. with relatively uniform maturity and known history was used to analyse the effects of priming on seed longevity in air-dry storage. Seeds collected close to natural dispersal and dried at 15 % relative humidity (RH), 15 degrees C, were placed into experimental storage (60 % RH, 45 degrees C) for 14 or 28 d, primed for 48 h at 0, -1, -2, -5, -10 or -15 MPa, re-equilibrated (47 % RH, 20 degrees C) and then returned to storage. Further seed samples were primed for 2 or 48 h at -1 MPa and either dried at 15 % RH, 15 degrees C or immediately re-equilibrated for experimental storage. Finally, some seeds were given up to three cycles of experimental storage and priming (48 h at -1 MPa). Priming at -1 MPa had a variable effect on subsequent survival during experimental storage. The shortest lived seeds in the control population showed slightly increased life spans; the longer lived seeds showed reduced life spans. In contrast, seeds first stored for 14 or 28 d before priming had substantially increased life spans. The increase tended to be greatest in the shortest lived fraction of the seed population. Both the period of rehydration and the subsequent drying conditions had significant effects on longevity. Interrupting air-dry storage with additional cycles of priming also increased longevity. The extent of prior deterioration and the post-priming desiccation environment affect the benefits of priming to the subsequent survival of mature seeds. Rehydration-dehydration treatments may have potential as an adjunct or alternative to the regeneration of seed accessions maintained in gene banks for plant biodiversity conservation or plant breeding.
Abstract:
Seeds of 15 species of Brassicaceae were stored hermetically in a genebank (at -5 degrees C to -10 degrees C with c. 3% moisture content) for 40 years. Samples were withdrawn at intervals for germination tests. Many accessions showed an increase in ability to germinate over this period, due to loss of dormancy. Nevertheless, some dormancy remained after 40 years' storage and was broken by pre-applied gibberellic acid. The poorest seed survival occurred in Hormathophylla spinosa; even in this accession the ability to germinate declined by only 7% between 1966 and 2006. Comparison of seeds from 1966 stored for 40 years with those collected anew in 2006 from the original sampling sites, where possible, showed few differences other than a tendency (7 of 9 accessions) for the latter to show greater dormancy. These results for hermetic storage at sub-zero temperatures and low moisture contents confirm that long-term seed storage can provide a successful technology for ex situ plant biodiversity conservation.
Abstract:
The aims of this study were to explore the environmental factors that determine the distribution of plant communities in temporary rock pools and provide a quantitative analysis of vegetation-environment relationships for five study sites on the island of Gavdos, southwest of Crete, Greece. Data from 99 rock pools were collected and analysed using Two-Way Indicator Species Analysis (TWINSPAN), Detrended Correspondence Analysis (DCA) and Canonical Correspondence Analysis (CCA) to identify the principal communities and environmental gradients that are linked to community distribution. A total of 46 species belonging to 21 families were recorded within the study area. The dominant families were Labiatae, Gramineae and Compositae while therophytes and chamaephytes were the most frequent life forms. The samples were classified into six community types using TWINSPAN, which were also corroborated by CCA analysis. The principal gradients for vegetation distribution, identified by CCA, were associated with water storage and water retention ability, as expressed by pool perimeter and water depth. Generalised Additive Models (GAMs) were employed to identify responses of four dominant rock pool species to water depth. The resulting species response curves showed niche differentiation in the cases of Callitriche pulchra and Tillaea vaillantii and revealed competition between Zannichellia pedunculata and Chara vulgaris. The use of classification in combination with ordination techniques resulted in a good discrimination between plant communities. Generalised Additive Models are a powerful tool in investigating species response curves to environmental gradients. The methodology adopted can be employed for improving baseline information on plant community ecology and distribution in Mediterranean ephemeral pools.
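The species response curves described in this abstract are unimodal along the water-depth gradient. As a minimal stand-in for a GAM fit (which would require a dedicated library), a Gaussian response curve can be estimated by fitting a quadratic to log-frequencies, since a Gaussian response is quadratic on the log scale. The depth and frequency values below are invented for illustration and are not data from the study.

```python
import numpy as np

# Hypothetical data (invented for illustration): occurrence frequency of a
# rock-pool species at sampled points along a water-depth gradient (cm).
depth = np.array([2, 5, 8, 12, 16, 20, 25, 30], dtype=float)
freq = np.array([0.05, 0.20, 0.55, 0.80, 0.75, 0.45, 0.15, 0.05])

# A Gaussian response curve f(x) = h * exp(-(x - u)^2 / (2 t^2)) becomes a
# quadratic on the log scale: log f = c + b*x + a*x^2, with the optimum at
# u = -b / (2a) and the tolerance t = sqrt(-1 / (2a)).
a, b, c = np.polyfit(depth, np.log(freq), 2)
optimum = -b / (2 * a)
tolerance = np.sqrt(-1 / (2 * a))

print(f"optimum depth ~{optimum:.1f} cm, tolerance ~{tolerance:.1f} cm")
```

Comparing fitted optima and tolerances across species is one simple way to express the niche differentiation the abstract reports (e.g. between Callitriche pulchra and Tillaea vaillantii).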
Abstract:
The aim of this study was to determine the support and information needs of older and disabled older people in the UK. Following an initial literature survey, an examination of data on enquiries made by older people to information providers, and a series of focus groups, a questionnaire was developed for a nationwide survey. Over 1630 questionnaires were completed by disabled older clients of Day Care Centres and less frail older members of social clubs. Findings showed that there is a serious shortfall in the number of older people getting the practical support that they need, and the information that enables access to this support, compared to the number that actually need help. Substantial percentages of the survey respondents experienced difficulty with everyday tasks and with accessing the information they needed. Implications for formal sources of support and information are discussed.
Abstract:
There are a number of challenges associated with managing knowledge and information in construction organizations delivering major capital assets. These include the ever-increasing volume of information, the loss of people to retirement or competitors, the continuously changing nature of information, the lack of methods for eliciting useful knowledge, the development of new information technologies, and changes in management and innovation practices. Existing tools and methodologies for valuing intangible assets in fields such as engineering, project management, finance and accounting do not fully address the issues associated with the valuation of information and knowledge. Information is rarely recorded in a way that allows a document to be valued, either when produced or when subsequently retrieved and re-used. In addition, there is a wealth of tacit personal knowledge which, if codified into documentary information, may prove very valuable to operators of the finished asset or to future designers. This paper addresses the problem of information overload and identifies the differences between data, information and knowledge. An exploratory study was conducted with a leading construction consultant, examining three perspectives (business, project management and document management) through structured interviews, focusing specifically on how to value information in practical terms. Major challenges in information management are identified. A through-life Information Evaluation Methodology (IEM) is presented to reduce information overload and to make information more valuable in the future.
Abstract:
Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. 
Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges. Copyright © 2009 Royal Meteorological Society
Abstract:
Much consideration is rightly given to the design of metadata models to describe data. At the other end of the data-delivery spectrum, much thought has also been given to the design of geospatial delivery interfaces such as the Open Geospatial Consortium standards: Web Coverage Service (WCS), Web Map Service (WMS) and Web Feature Service (WFS). Our recent experience with the Climate Science Modelling Language shows that an implementation gap exists where many challenges remain unsolved. Bridging this gap requires transposing information and data from one world view of geospatial climate data to another. Some of the issues include: the loss of information in mapping to a common information model, the need to create ‘views’ onto file-based storage, and the need to map onto an appropriate delivery interface (as with the choice between WFS and WCS for feature types with coverage-valued properties). Here we summarise the approaches we have taken in facing up to these problems.
Abstract:
Undeniably, anticipation plays a crucial role in cognition. By what means, to what extent, and what it achieves remain open questions. In a recent BBS target article, Clark (in press) depicts an integrative model of the brain that builds on hierarchical Bayesian models of neural processing (Rao and Ballard, 1999; Friston, 2005; Brown et al., 2011), and their most recent formulation using the free-energy principle borrowed from thermodynamics (Feldman and Friston, 2010; Friston, 2010; Friston et al., 2010). Hierarchical generative models of cognition, such as those described by Clark, presuppose the manipulation of representations and internal models of the world, in as much detail as is perceptually available. Perhaps surprisingly, Clark acknowledges the existence of a “virtual version of the sensory data” (p. 4), but with no reference to some of the historical debates that shaped cognitive science, related to the storage, manipulation, and retrieval of representations in a cognitive system (Shanahan, 1997), or accounting for the emergence of intentionality within such a system (Searle, 1980; Preston and Bishop, 2002). Instead of demonstrating how this Bayesian framework responds to these foundational questions, Clark describes the structure and the functional properties of an action-oriented, multi-level system that is meant to combine perception, learning, and experience (Niedenthal, 2007).
Abstract:
Three decades of ongoing executive concern over how to achieve successful alignment between business and information technology show the complexity of such a vital process. Most of the challenges of alignment are related to knowledge and organisational change, and several researchers have introduced mechanisms to address some of these challenges. However, these mechanisms pay less attention to multi-level effects, which results in a limited understanding of alignment across levels. We therefore reviewed these challenges from a multi-level learning perspective and found that business and IT alignment is related to the balance of exploitation and exploration strategies with the intellectual content of individual, group and organisational levels.
Abstract:
Smart healthcare is a complex domain for systems integration because of the human and technical factors and the heterogeneous data sources involved. As part of the smart city, it is an area in which clinical functions require smart collaboration among multiple systems for effective communication between departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many integration and interoperability challenges, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to analyse the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an on-going project at a local hospital for our case study. The project aims to achieve data sharing and interoperability among Radiology Information Systems (RIS), Electronic Patient Records (EPR), and Picture Archiving and Communication Systems (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation (including publications and internal working papers), one year of non-participant observation, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists and a secretary. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also develop a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.
Abstract:
The more information is available, and the more predictable are events, the better forecasts ought to be. In this paper forecasts by bookmakers, prediction markets and tipsters are evaluated for a range of events with varying degrees of predictability and information availability. All three types of forecast represent different structures of information processing and as such would be expected to perform differently. By and large, events that are more predictable, and for which more information is available, do tend to be forecast better.
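One standard way to make the forecast comparison in this abstract concrete is a proper scoring rule such as the Brier score (the mean squared error of probability forecasts against binary outcomes; lower is better). The outcomes and probabilities below are invented for illustration; they are not data from the study, and the three source names merely echo the forecast types it mentions.

```python
import numpy as np

# Invented example: eight events (1 = occurred, 0 = did not) and probability
# forecasts from three hypothetical sources.
outcomes = np.array([1, 0, 1, 1, 0, 1, 0, 0])
forecasts = {
    "bookmaker":         np.array([0.8, 0.3, 0.7, 0.6, 0.2, 0.9, 0.4, 0.1]),
    "prediction_market": np.array([0.7, 0.2, 0.8, 0.7, 0.3, 0.8, 0.3, 0.2]),
    "tipster":           np.array([0.6, 0.5, 0.6, 0.5, 0.4, 0.7, 0.5, 0.4]),
}

def brier(p, y):
    """Mean squared error of probability forecasts against 0/1 outcomes."""
    return float(np.mean((p - y) ** 2))

scores = {name: brier(p, outcomes) for name, p in forecasts.items()}
for name, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: {s:.3f}")
```

In this made-up example the vaguer tipster probabilities score worse than the sharper bookmaker and market probabilities, matching the intuition that better-informed, more decisive forecasts are rewarded by a proper score.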
Abstract:
This paper is about the use of natural language to communicate with computers. Most research that has pursued this goal considers only requests expressed in English. One way to facilitate the use of several languages in natural language systems is to use an interlingua: an intermediary representation for natural language information that can be processed by machines. We propose converting natural language requests into an interlingua, the Universal Networking Language (UNL), and executing these requests using software components. To achieve this goal, we propose OntoMap, an ontology-based architecture that performs the semantic mapping between UNL sentences and software components. OntoMap also performs component search and retrieval based on semantic information formalized in ontologies and rules.