875 results for ENVIRONMENT DATA
Abstract:
ISO19156 Observations and Measurements (O&M) provides a standardised framework for organising information about the collection of environmental data. Here we describe the implementation of a specialisation of O&M for environmental data, the Metadata Objects for Linking Environmental Sciences (MOLES3). MOLES3 provides support for organising information about data, and for user navigation around data holdings. The implementation described here, "CEDA-MOLES", also supports data management functions for the Centre for Environmental Data Archival (CEDA). The previous iteration of MOLES (MOLES2) saw active use over five years before being replaced by CEDA-MOLES in late 2014. During that period important lessons were learnt about both the information needed and how to design and maintain the necessary information systems. In this paper we review the problems encountered in MOLES2; how and why CEDA-MOLES was developed and engineered; the migration of information holdings from MOLES2 to CEDA-MOLES; and, finally, provide an early assessment of MOLES3 (as implemented in CEDA-MOLES) and its limitations. Key drivers for the MOLES3 development included the need for improved data provenance, for further structured information to support ISO19115 discovery metadata export (for EU INSPIRE compliance), and for appropriate fixed landing pages for Digital Object Identifiers (DOIs) in the presence of evolving datasets. Key lessons learned included the importance of minimising information structure in free-text fields, and the necessity of supporting as much agility in the information infrastructure as possible without compromising maintainability, both for those using the systems internally and externally (e.g. citing into the information infrastructure) and for those responsible for the systems themselves. The migration itself needed to ensure continuity of service and traceability of archived assets.
Abstract:
Sea surface temperature (SST) data are often provided as gridded products, typically at resolutions of order 0.05 degrees from satellite observations, to reduce data volume at the request of data users and to facilitate comparison against other products or models. Sampling uncertainty is introduced in gridded products where the surface of the ocean within a grid cell cannot be fully observed because of cloud cover. In this paper we parameterise uncertainties in SST as a function of the percentage of clear-sky pixels available and the SST variability in that subsample. This parameterisation is developed from Advanced Along Track Scanning Radiometer (AATSR) data, but is applicable to all gridded L3U SST products at resolutions of 0.05-0.1 degrees, irrespective of instrument and retrieval algorithm, provided that instrument noise propagated into the SST is accounted for. Using related methods, we also calculate a sampling uncertainty of ~0.04 K in Global Area Coverage (GAC) Advanced Very High Resolution Radiometer (AVHRR) products.
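To make the idea concrete, here is a minimal Python sketch of estimating a grid-cell sampling uncertainty from the clear-sky subsample only. It uses the textbook finite-population standard error of the mean as a stand-in; the paper instead fits an empirical parameterisation to AATSR data, so the formula and names below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def sampling_uncertainty(cell_sst, clear_mask):
    """Illustrative sampling uncertainty of a grid-cell mean SST when
    only the clear-sky pixels are observed.

    Uses the finite-population standard error of the subsample mean,
    sigma_clear * sqrt(1/n_clear - 1/N), as a first-order proxy for
    the uncertainty the paper parameterises from clear-sky fraction
    and subsample variability.
    """
    cell_sst = np.asarray(cell_sst, dtype=float)
    n_total = cell_sst.size
    clear = cell_sst[clear_mask]
    n_clear = clear.size
    if n_clear < 2:
        return float("nan")  # variability cannot be estimated from <2 pixels
    sigma = clear.std(ddof=1)  # SST variability in the clear subsample
    return sigma * np.sqrt(1.0 / n_clear - 1.0 / n_total)

# Illustrative 20x20-pixel cell: the fewer clear pixels, the larger
# the sampling uncertainty; a fully observed cell has none.
rng = np.random.default_rng(0)
sst = 290.0 + 0.2 * rng.standard_normal(400)
all_clear = np.ones(400, dtype=bool)
few_clear = np.zeros(400, dtype=bool)
few_clear[:20] = True
```

As expected from the sqrt term, the uncertainty vanishes when every pixel is clear and grows as cloud cover removes pixels, mirroring the clear-sky-percentage dependence described in the abstract.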
Abstract:
Searching for and mapping the physical extent of unmarked graves using geophysical techniques has proven difficult in many cases. The success of individual geophysical techniques for detecting graves varies on a site-by-site basis. Significantly, detection of graves often results from measured contrasts that are linked to the background soils rather than to the type of archaeological feature associated with the grave. It is evident that investigation of buried remains should be considered within a 3D space, as the burial environment can vary considerably through the grave. In this paper, we demonstrate the need for a multi-method survey strategy to investigate unmarked graves, as applied at a "planned" but unmarked pauper's cemetery. The outcome of this case study provides new insights into the strategy required at such sites. Perhaps the most significant conclusion is that unmarked graves are best understood in terms of characterization rather than identification. We argue for a methodological approach that, while following the current trend of using multiple techniques, is fundamentally dependent on a structured approach to the analysis of the data. The ramifications of this case study illustrate the necessity of an integrated strategy to provide a more holistic understanding of unmarked graves, which may help in the management of these unseen but important aspects of our heritage. It is concluded that the search for graves remains a matter of current debate, and one that will be resolved by methodological rather than technique-based arguments.
Abstract:
Buildings consume a large amount of energy, both in their use and in their production. Retrofitting aims to reduce this energy consumption. However, there are concerns that retrofitting can negatively affect the internal environment, including poor thermal comfort and health issues. This research investigates the impact of retrofitting the façade of existing traditional buildings on the indoor environment and occupant thermal comfort. A case-study building located at the University of Reading has been monitored experimentally and modelled using IES software, with monitored values as input conditions for the model. The proposed façade-related retrofit options have been simulated to provide information on their effect on the indoor environment. The findings show a positive impact on the internal environment: the data show a 16.2% improvement in thermal comfort after retrofit is simulated, together with a 21.6% reduction in energy consumption relative to the existing building.
Abstract:
Housing Associations (HAs) contribute circa 20% of the UK's housing supply. HAs are, however, under increasing pressure as a result of funding cuts and rent reductions. Due to this increased pressure, a number of processes are currently being reviewed by HAs, especially how they manage and learn from defects. Learning from defects is considered a useful approach to achieving defect reduction within the UK housebuilding industry. This paper contributes to our understanding of how HAs learn from defects by undertaking an initial round-table discussion with key HA stakeholders, as part of an ongoing collaborative research project with the National House Building Council (NHBC) to better understand how house builders and HAs learn from defects to reduce their prevalence. The initial discussion shows that defect information runs through a number of groups, both internal and external to an HA, during both the defects management process and the organizational learning (OL) process. Furthermore, HAs are reliant on capturing and recording defect data as the foundation for the OL process. During the OL process, defect data analysis is the primary enabler for recognizing a need for a change to organizational routines. When a need for change has been recognized, new options are typically pursued to design out defects via updates to an HA's Employer's Requirements. Proposed solutions are selected by a review board and committed to organizational routine. After implementing a change, both structured and unstructured feedback is sought to establish the change's success. The findings from the HA discussion demonstrate that OL can achieve defect reduction within the house building sector in the UK. The paper concludes by outlining a potential 'learning from defects model' for the housebuilding industry and describing future work.
Abstract:
Wireless Sensor Networks (WSNs) have been an active research topic in recent years. The services offered by a WSN can be classified into three major categories: monitoring, alerting, and information on demand. WSNs have been used for a variety of applications related to the environment (agriculture, water and forest fire detection), the military, buildings, health (elderly people and home monitoring), disaster relief, and area or industrial monitoring. In most WSNs, tasks such as processing the sensed data, making decisions and generating emergency messages are carried out by a remote server, hence the need for efficient means of transferring data across the network. Because of the range of applications and types of WSN, different kinds of MAC and routing protocols are needed in order to guarantee delivery of data from the source nodes to the server (or sink). In order to minimize energy consumption and increase performance in areas such as reliability of data delivery, extensive research has been conducted and documented in the literature on designing energy-efficient protocols for each individual layer. The most common way to conserve energy in WSNs is for the MAC layer to put the transceiver and the processor of the sensor node into a low-power sleep state when they are not being used, thereby reducing the energy wasted on collisions, overhearing and idle listening. As a result of this strategy for saving energy, the routing protocols need new solutions that take into account the sleep state of some nodes, and which also extend the lifetime of the entire network by distributing energy usage between nodes over time. This suggests that a combined MAC and routing protocol could significantly improve WSNs, because interaction between the MAC and network layers allows nodes to be active at the same time to handle data transmission.
In the research presented in this thesis, a cross-layer protocol based on MAC and routing protocols was designed in order to improve the capability of WSNs for a range of different applications. Simulation results, based on a range of realistic scenarios, show that these new protocols improve WSNs by reducing their energy consumption as well as enabling them to support mobile nodes, where necessary. A number of conference and journal papers have been published to disseminate these results for a range of applications.
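The energy argument behind MAC-layer sleep scheduling can be sketched with a first-order duty-cycle calculation. The power figures below are illustrative assumptions typical of low-power radios, not measurements from the thesis:

```python
def avg_radio_power(p_active_mw, p_sleep_mw, duty_cycle):
    """First-order average radio power under duty cycling: the node is
    awake for a fraction `duty_cycle` of the time and in the low-power
    sleep state otherwise."""
    return duty_cycle * p_active_mw + (1.0 - duty_cycle) * p_sleep_mw

# Illustrative figures: 60 mW awake, 3 uW asleep, awake 1% of the time.
always_on = avg_radio_power(60.0, 0.003, 1.0)
duty_cycled = avg_radio_power(60.0, 0.003, 0.01)
```

At a 1% duty cycle the average draw falls by roughly two orders of magnitude, which is exactly why the routing layer then needs new solutions: any neighbour a packet must traverse is usually asleep.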
Abstract:
Introduction Human immunodeficiency virus (HIV) is a serious disease which can be associated with various activity limitations and participation restrictions. The aim of this paper was to describe how HIV affects the functioning and health of people within different environmental contexts, particularly with regard to access to medication. Method Four cross-sectional studies, three in South Africa and one in Brazil, had applied the International Classification of Functioning, Disability and Health (ICF) as a classification instrument to participants living with HIV. Each group was at a different stage of the disease. Only two groups had had continuing access to antiretroviral therapy. The existence of these descriptive sets enabled comparison of the disability experienced by people living with HIV at different stages of the disease and with differing access to antiretroviral therapy. Results Common problems experienced in all groups related to weight maintenance, with two-thirds of the sample reporting problems in this area. Mental functions presented the most problems in all groups, with sleep (50%, 92/185), energy and drive (45%, 83/185), and emotional functions (49%, 90/185) being the most affected. In those on long-term therapy, body image affected 93% (39/42) and was a major problem. The other groups reported pain as a problem, and those with limited access to treatment also reported mobility problems. Cardiopulmonary functions were affected in all groups. Conclusion Functional problems occurred in the areas of impairment and activity limitation in people at advanced stages of HIV, and more limitations occurred in the area of participation for those on antiretroviral treatment. The ICF provided a useful framework within which to describe the functioning of those with HIV and the impact of the environment. Given the wide spectrum of problems found, consideration could be given to a number of ICF core sets that are relevant to the different stages of HIV disease. 
(C) 2010 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
Abstract:
The environment where galaxies are found heavily influences their evolution. Close groupings, like those in the cores of galaxy clusters or compact groups, evolve in ways far more dramatic than their isolated counterparts. We have conducted a multi-wavelength study of Hickson Compact Group 7 (HCG 7), consisting of four giant galaxies: three spirals and one lenticular. We use Hubble Space Telescope (HST) imaging to identify and characterize the young and old star cluster populations. We find young massive clusters (YMCs) mostly in the three spirals, while the lenticular features a large, unimodal population of globular clusters (GCs) but no detectable clusters with ages less than a few Gyr. The spatial and approximate age distributions of the ~300 YMCs and ~150 GCs thus hint at a regular star formation history in the group over a Hubble time. While at first glance the HST data show the galaxies as undisturbed, our deep ground-based, wide-field imaging that extends the HST coverage reveals faint signatures of stellar material in the intragroup medium (IGM). We do not, however, detect the IGM in H I or Chandra X-ray observations, signatures that would be expected to arise from major mergers. Despite this fact, we find that the H I gas content of the individual galaxies and of the group as a whole is a third of the expected abundance. The appearance of quiescence is challenged by spectroscopy that reveals an intense ionization continuum in one galaxy nucleus, and post-burst characteristics in another. Our spectroscopic survey of dwarf galaxy members yields a single dwarf elliptical galaxy in an apparent stellar tidal feature. Based on all this information, we suggest an evolutionary scenario for HCG 7, whereby the galaxies convert most of their available gas into stars without the influence of major mergers and ultimately end in a dry merger. As the conditions governing compact groups are reminiscent of galaxies at intermediate redshift, we propose that HCGs are appropriate for studying galaxy evolution at z ~ 1-2.
Abstract:
The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as its programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using eXtensible Markup Language (XML) technology. Communication between clients and servers uses remote procedure calls (RPC) based on XML (XML-RPC technology). The integration of Java, XML and XML-RPC technologies makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides a simple graphical user interface (GUI). The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
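The client-server pattern described above can be sketched with XML-RPC in a few lines. Python's standard library is used here for brevity (the TCABR system itself is written in Java), and the method name and payload are hypothetical, not the actual TCABR API:

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Hypothetical signal-retrieval method; names and shot numbers are
# illustrative only.
def get_signal(shot, name):
    return {"shot": shot, "name": name, "data": [0.0, 1.0, 2.0]}

# Server side: expose the function over XML-RPC on an ephemeral port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(get_signal, "get_signal")
host, port = server.server_address
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: every laboratory retrieves data with the same call,
# regardless of operating system or architecture.
client = ServerProxy(f"http://{host}:{port}")
reply = client.get_signal(4210, "plasma_current")
server.shutdown()
```

Because the arguments and return value are marshalled as XML, any XML-RPC client library (Java, Python, or otherwise) can make the same call, which is what gives the collaboration a common access layer.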
Abstract:
Lycopodiopsis derbyi Renault was analyzed on the basis of compressed silicified stems from four Guadalupian outcrops of the Parana Basin (Corumbatai Formation) in the State of Sao Paulo, Southern Brazil. Dichotomous stems have been recorded, and three different branch regions related to apoxogenesis are described. The most proximal region has larger, clearly rhomboidal leaf cushions with protruding upper edges; the intermediate, transitional region also has rhombic leaf cushions, but they are smaller and less elongated than those lower on the same axis; finally, the most distal region reveals only incipient cushions, with inconspicuous infrafoliar bladders; interspersed microphylls were still attached. A well-preserved branch representative of this most distal region was sectioned; it has a siphonostelic cylinder similar to that previously described for L. derbyi. The cortex, however, shows new traits, such as a short portion of elongated cells between the periderm and the external cortex (or leaf cushion tissue). The stems were apparently silicified prior to their final burial but were probably not transported over long distances. Their final burial may have taken place during storm events, which were common during the deposition of the Corumbatai Formation. These stems are commonly deformed by compression, mainly because the internal cortical portions, consisting of thin-walled tissue, rapidly decayed prior to silicification and are therefore not preserved. The common alkalinity of a shallow marine environment such as that in which the Corumbatai Formation was deposited would mobilize the silica and favor petrifaction. Based on the new data, an emended diagnosis is proposed, and a modification of the identification key published by Thomas and Meyen in 1984 for Upper Paleozoic Lycopsida is suggested. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
This paper discusses two pilot studies carried out to explore the possibility of the fan community of manga (Japanese comics), in which fan translators translate the original Japanese manga into English (a practice called scanlation), functioning as an informal learning environment for Japanese language learning and translator training. The two pilot studies consist of a) a comparison of the original Japanese version with the scanlation and the official translation, and b) a comparison of the original Japanese version with two different versions of scanlation, to assess the translators' level of Japanese and the overall translation quality. The results show that the scanlation versions contained a number of inaccuracies which would prevent them from being treated as professional translations. Some of these errors are clearly caused by the translator's insufficient understanding of Japanese. However, the pilot studies also revealed some interesting features of fan translation, such as the treatment of cultural references. The two pilot studies indicate that it is desirable to conduct further studies with more data, in order to confirm the present results and to examine the possible relationship between the types of translation errors found in scanlation and the particular type of Japanese (informal, conversational) that could be learned from manga.
Abstract:
Background qtl.outbred is an extendible interface in the statistical environment R for combining quantitative trait loci (QTL) mapping tools. It is built as an umbrella package that enables outbred genotype probabilities to be calculated and/or imported into the software package R/qtl. Findings Using qtl.outbred, the genotype probabilities from outbred line cross data can be calculated by interfacing with a new and efficient algorithm developed for analyzing arbitrarily large datasets (included in the package), or imported from other sources such as the web-based tool GridQTL. Conclusion qtl.outbred will improve the speed of calculating probabilities and the ability to analyse large future datasets. This package enables the user to analyse outbred line cross data accurately, with effort similar to that required for inbred line cross data.
Abstract:
This thesis consists of a summary and four self-contained papers. Paper [I] Following the 1987 report by The World Commission on Environment and Development, the genuine saving has come to play a key role in the context of sustainable development, and the World Bank regularly publishes numbers for genuine saving on a national basis. However, these numbers are typically calculated as if the tax system is non-distortionary. This paper presents an analogue to genuine saving in a second best economy, where the government raises revenue by means of distortionary taxation. We show how the social cost of public debt, which depends on the marginal excess burden, ought to be reflected in the genuine saving. We also illustrate by presenting calculations for Greece, Japan, Portugal, U.K., U.S. and OECD average, showing that the numbers published by the World Bank are likely to be biased and may even give incorrect information as to whether the economy is locally sustainable. Paper [II] This paper examines the relationships among per capita CO2 emissions, per capita GDP and international trade based on panel data spanning the period 1960-2008 for 150 countries. A distinction is also made between OECD and Non-OECD countries to capture the differences of this relationship between developed and developing economies. We apply panel unit root and cointegration tests, and estimate a panel error correction model. The results from the error correction model suggest that there are long-term relationships between the variables for the whole sample and for Non-OECD countries. Finally, Granger causality tests show that there is bi-directional short-term causality between per capita GDP and international trade for the whole sample and between per capita GDP and CO2 emissions for OECD countries. 
Paper [III] Fundamental questions in economics are why some regions are richer than others, why their growth rates differ, whether their growth rates tend to converge, and what key factors contribute to explaining economic growth. This paper deals with average income growth, net migration, and changes in unemployment rates at the municipal level in Sweden. The aim is to explore in depth the effects of possible underlying determinants, with a particular focus on local policy variables. The analysis is based on a three-equation model. Our results show, among other things, that increases in local public expenditure and the income tax rate have negative effects on subsequent income growth. In addition, the results show conditional convergence, i.e. the average income among municipal residents tends to grow more rapidly in relatively poor local jurisdictions than in initially "richer" jurisdictions, conditional on the other explanatory variables. Paper [IV] This paper explores the relationship between income growth and income inequality using data at the municipal level in Sweden for the period 1992-2007. We estimate a fixed effects panel data growth model, where within-municipality income inequality is one of the explanatory variables. Different inequality measures (the Gini coefficient, top income shares, and measures of inequality in the lower and upper parts of the income distribution) are examined. We find a positive and significant relationship between income growth and income inequality measured as the Gini coefficient and top income shares, respectively. In addition, while inequality in the upper part of the income distribution is positively associated with the income growth rate, inequality in the lower part of the income distribution seems to be negatively related to income growth.
Our findings also suggest that increased income inequality enhances growth more in municipalities with a high level of average income than in municipalities with a low level of average income.
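As a concrete reference point for the inequality measures in Paper [IV], the Gini coefficient can be computed from a vector of incomes with the standard sorted-rank formula. This is a textbook formula, not code from the thesis, and the income figures are illustrative:

```python
import numpy as np

def gini(incomes):
    """Gini coefficient via the sorted-rank formula:
    G = 2*sum(i * x_i) / (n * sum(x)) - (n + 1) / n,
    with x sorted ascending and ranks i = 1..n.
    Returns 0 for perfect equality."""
    x = np.sort(np.asarray(incomes, dtype=float))
    n = x.size
    ranks = np.arange(1, n + 1)
    return 2.0 * np.sum(ranks * x) / (n * x.sum()) - (n + 1.0) / n
```

Perfect equality gives G = 0, while concentrating all income in one resident of an n-person municipality gives G = (n - 1) / n, so within-municipality inequality is comparable across municipalities of similar size.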
Abstract:
Background There is emerging evidence that the physical environment is important for health, quality of life and care, but there is a lack of valid instruments to assess health care environments. The Sheffield Care Environment Assessment Matrix (SCEAM), developed in the United Kingdom, provides a comprehensive assessment of the physical environment of residential care facilities for older people. This paper reports on the translation and adaptation of SCEAM for use in Swedish residential care facilities for older people, including information on its validity and reliability. Methods SCEAM was translated into Swedish and back-translated into English, and assessed for its relevance by experts using content validity index (CVI) together with qualitative data. After modification, the validity assessments were repeated and followed by test-retest and inter-rater reliability tests in six units within a Swedish residential care facility that varied in terms of their environmental characteristics. Results Translation and back translation identified linguistic and semantic related issues. The results of the first content validity analysis showed that more than one third of the items had item-CVI (I-CVI) values less than the critical value of 0.78. After modifying the instrument, the second content validation analysis resulted in I-CVI scores above 0.78, the suggested criteria for excellent content validity. Test-retest reliability showed high stability (96% and 95% for two independent raters respectively), and inter-rater reliability demonstrated high levels of agreement (95% and 94% on two separate rating occasions). Kappa values were very good for test-retest (κ= 0.903 and 0.869) and inter-rater reliability (κ= 0.851 and 0.832). Conclusions Adapting an instrument to a domestic context is a complex and time-consuming process, requiring an understanding of the culture where the instrument was developed and where it is to be used. 
A team including the instrument's developers, translators, and researchers is necessary to ensure a valid translation and adaptation. This study showed preliminary validity and reliability evidence for the Swedish version (S-SCEAM) when used in a Swedish context. Further, we believe that the S-SCEAM has improved compared to the original instrument and suggest that it can be used as a foundation for future developments of the SCEAM model.
Abstract:
The open provenance architecture (OPA) approach to the challenge was distinct in several regards. In particular, it is based on an open, well-defined data model and architecture, allowing different components of the challenge workflow to independently record documentation, and allowing the workflow to be executed in any environment. Another noticeable feature is that we distinguish between the data recorded about what has occurred, process documentation, and the provenance of a data item, which is all that caused the data item to be as it is and is obtained as the result of a query over process documentation. This distinction allows us to tailor the system to separately best address the requirements of recording and querying documentation. Other notable features include the explicit recording of causal relationships between both events and data items, an interaction-based world model, intensional definition of data items in queries rather than reliance on explicit naming mechanisms, and styling of documentation to support non-functional application requirements such as reducing storage costs or ensuring privacy of data. In this paper we describe how each of these features aids us in answering the challenge provenance queries.