53 results for On-site observations


Relevance:

80.00%

Publisher:

Abstract:

In the decade since OceanObs '99, great advances have been made in the field of ocean data dissemination. The use of Internet technologies has transformed the landscape: users can now find, evaluate and access data rapidly and securely using only a web browser. This paper describes the current state of the art in dissemination methods for ocean data, focussing particularly on ocean observations from in situ and remote sensing platforms. We discuss current efforts being made to improve the consistency of delivered data and to increase the potential for automated integration of diverse datasets. An important recent development is the adoption of open standards from the Geographic Information Systems community; we discuss the current impact of these new technologies and their future potential. We conclude that new approaches will indeed be necessary to exchange data more effectively and forge links between communities, but these approaches must be evaluated critically through practical tests, and existing ocean data exchange technologies must be used to their best advantage. Investment in key technology components, cross-community pilot projects and the enhancement of end-user software tools will be required in order to assess and demonstrate the value of any new technology.


Ethnographic methodologies developed in social anthropology and sociology hold considerable promise for addressing practical, problem-based research concerned with the construction site. The extended researcher-engagement characteristic of ethnography reveals rich insights, yet is infrequently used to understand how workplace realities are lived out on construction sites. Moreover, studies that do employ these methods are rarely reported within construction research journals. This paper argues that recent innovations in ethnographic methodologies offer new routes to: posing questions; understanding workplace socialities (i.e. the qualities of the social relationships that develop on construction sites); learning about forms, uses and communication of knowledge on construction sites; and turning these into meaningful recommendations. This argument is supported by examples from an interdisciplinary ethnography concerning migrant workers and communications on UK construction sites. The presented research seeks to understand how construction workers communicate with managers and each other and how they stay safe on site, with the objective of informing site health-and-safety strategies and the production and evaluation of training and other materials.


BACKGROUND: Herpes zoster is caused by the reactivation of varicella-zoster virus (VZV) from sensory neurons. The commonest complication following zoster is chronic pain termed postherpetic neuralgia (PHN). OBJECTIVES: To investigate the dynamics of VZV viraemia and viral load following the resolution of zoster and their relationship to PHN development. STUDY DESIGN: Blood samples were collected at baseline, 1 month, 3 months and 6 months in a prospective study of 63 patients with active zoster. Quantification of VZV DNA in whole blood was performed using a real-time PCR assay. RESULTS: During acute zoster, all patients had detectable VZV DNA in their blood. VZV DNA remained detectable in the blood of 91% of patients at 6 months, although levels declined significantly (p<0.0001). A history of prodromal symptoms (p=0.005) and severity of pain at baseline (p=0.038), as well as taking antivirals (p=0.046) and being immunocompromised (p=0.043), were associated with longer time to recovery from PHN. Viral DNA loads were consistently higher in patients with risk factors for PHN, and higher viral DNA loads over time were associated with longer time to recovery (p=0.058 overall and p=0.038 in the immunocompetent). CONCLUSIONS: Based on these observations we hypothesise that VZV replication persists following acute shingles and that higher viral DNA loads contribute to the risk of PHN.


Increased tidal levels and storm surges related to climate change are projected to result in extremely adverse effects on coastal regions. Predictions of such extreme and small-scale events, however, are exceedingly challenging, even for relatively short time horizons. Here we use data from observations, ERA-40 reanalysis, climate scenario simulations, and a simple feature model to find that the frequency of extreme storm surge events affecting Venice is projected to decrease by about 30% by the end of the twenty-first century. In addition, through a trend assessment based on tidal observations we found a reduction in extreme tidal levels. Extrapolating the current +17 cm/century sea level trend, our results suggest that the frequency of extreme tides in Venice might largely remain unaltered under the projected twenty-first century climate simulations.


In this study, we systematically compare a wide range of observational and numerical precipitation datasets for Central Asia. Data considered include two re-analyses, three datasets based on direct observations, and the output of a regional climate model simulation driven by a global re-analysis. These are validated and intercompared with respect to their ability to represent the Central Asian precipitation climate. In each of the datasets, we consider the mean spatial distribution and the seasonal cycle of precipitation, the amplitude of interannual variability, the representation of individual yearly anomalies, the precipitation sensitivity (i.e. the response to wet and dry conditions), and the temporal homogeneity of precipitation. Additionally, we carried out part of these analyses for datasets available in real time. The mutual agreement between the observations is used as an indication of how far these data can be used for validating precipitation data from other sources. In particular, we show that the observations usually agree qualitatively on anomalies in individual years while it is not always possible to use them for the quantitative validation of the amplitude of interannual variability. The regional climate model is capable of improving the spatial distribution of precipitation. At the same time, it strongly underestimates summer precipitation and its variability, while interannual variations are well represented during the other seasons, in particular in the Central Asian mountains during winter and spring.
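Two of the agreement measures considered in such an intercomparison (agreement on individual yearly anomalies, and agreement on the amplitude of interannual variability) can be sketched for a pair of co-located yearly series. This is a minimal illustration of the distinction the study draws, not the project's actual validation code:

```python
import numpy as np

def interannual_agreement(a, b):
    """For two aligned yearly precipitation series, return (anomaly
    correlation, ratio of interannual standard deviations).  Datasets can
    agree qualitatively on anomalies (high correlation) while still
    disagreeing on the amplitude of variability (ratio far from 1)."""
    aa = np.asarray(a, float) - np.mean(a)
    bb = np.asarray(b, float) - np.mean(b)
    corr = float(np.corrcoef(aa, bb)[0, 1])
    amp_ratio = float(np.std(aa, ddof=1) / np.std(bb, ddof=1))
    return corr, amp_ratio
```

A series that is a scaled copy of another has anomaly correlation 1 but an amplitude ratio equal to the scale factor, which is exactly the situation where qualitative validation succeeds and quantitative validation fails.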


Almost all the electricity currently produced in the UK is generated as part of a centralised power system designed around large fossil fuel or nuclear power stations. This power system is robust and reliable but the efficiency of power generation is low, resulting in large quantities of waste heat. The principal aim of this paper is to investigate an alternative concept: energy production by small-scale generators in close proximity to the energy users, integrated into microgrids. Microgrids—de-centralised electricity generation combined with on-site production of heat—bear the promise of substantial environmental benefits, brought about by a higher energy efficiency and by facilitating the integration of renewable sources such as photovoltaic arrays or wind turbines. By virtue of a good match between generation and load, microgrids have a low impact on the electricity network, despite a potentially significant level of generation by intermittent energy sources. The paper discusses the technical and economic issues associated with this novel concept, giving an overview of the generator technologies, the current regulatory framework in the UK, and the barriers that have to be overcome if microgrids are to make a major contribution to the UK energy supply. The focus of this study is a microgrid of domestic users powered by small Combined Heat and Power generators and photovoltaics. Focusing on the energy balance between generation and load, it is found that the optimum combination of generators in the microgrid (consisting of around a 1.4 kWp PV array per household and 45% household ownership of micro-CHP generators) will maintain energy balance on a yearly basis if supplemented by energy storage of 2.7 kWh per household. We find that there is no fundamental technological reason why microgrids cannot contribute an appreciable part of the UK energy demand.
Indeed, an estimate of cost indicates that the microgrids considered in this study would supply electricity at a cost comparable with the present electricity supply if the current support mechanisms for photovoltaics were maintained. Combining photovoltaics and micro-CHP and a small battery requirement gives a microgrid that is independent of the national electricity network. In the short term, this has particular benefits for remote communities but more wide-ranging possibilities open up in the medium to long term. Microgrids could meet the need to replace current generation nuclear and coal fired power stations, greatly reducing the demand on the transmission and distribution network.
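The yearly energy-balance reasoning can be illustrated with a toy calculation. The 1.4 kWp PV array and 45% micro-CHP ownership figures are the paper's; the per-household demand, PV yield and micro-CHP output below are illustrative assumptions, not values from the paper:

```python
# Illustrative annual energy balance for a microgrid of N households.
# All three constants are assumed round numbers for the sketch.
PV_YIELD = 900        # assumed annual UK PV yield, kWh per kWp installed
CHP_OUTPUT = 4500     # assumed annual electrical output of one micro-CHP unit, kWh
DEMAND = 3300         # assumed annual electricity demand per household, kWh

def annual_balance(n_households, pv_kwp=1.4, chp_fraction=0.45):
    """Net annual generation minus demand for the whole microgrid (kWh)."""
    generation = n_households * (pv_kwp * PV_YIELD + chp_fraction * CHP_OUTPUT)
    return generation - n_households * DEMAND
```

With these assumed figures a 100-household microgrid comes within a few percent of annual balance, which is the sense in which a small per-household store (2.7 kWh in the paper) can close the remaining gap: the storage covers diurnal mismatch, not a bulk annual deficit.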


In numerical weather prediction (NWP), data assimilation (DA) methods are used to combine available observations with numerical model estimates. This is done by minimising measures of error on both observations and model estimates, with more weight given to data that can be more trusted. For any DA method an estimate of the initial forecast error covariance matrix is required. For convective-scale data assimilation, however, the properties of the error covariances are not well understood. An effective way to investigate covariance properties in the presence of convection is to use an ensemble-based method, for which an estimate of the error covariance is readily available at each time step. In this work, we investigate the performance of the ensemble square root filter (EnSRF) in the presence of cloud growth, applied to an idealised 1D convective column model of the atmosphere. We show that the EnSRF performs well in capturing cloud growth, but the ensemble does not cope well with discontinuities introduced into the system by parameterised rain. The state estimates lose accuracy, and more importantly the ensemble is unable to capture the spread (variance) of the estimates correctly. We also find, counter-intuitively, that by reducing the spatial frequency of observations and/or the accuracy of the observations, the ensemble is able to capture the states and their variability successfully across all regimes.
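The EnSRF update step can be sketched for a single scalar observation. This is the generic serial form with the Whitaker–Hamill deterministic square-root factor, not the authors' 1D convective column model; the state layout (one column per ensemble member) is an assumption of the sketch:

```python
import numpy as np

def ensrf_update(ens, obs_value, obs_index, obs_var):
    """Serial EnSRF update for one scalar observation of state component
    `obs_index`.  `ens` is the forecast ensemble, shape (n_state, n_members)."""
    mean = ens.mean(axis=1)
    pert = ens - mean[:, None]              # ensemble perturbations
    n = ens.shape[1]
    hx = pert[obs_index]                    # perturbations in observation space
    hpht = hx @ hx / (n - 1)                # forecast variance at the obs point
    gain = (pert @ hx) / (n - 1) / (hpht + obs_var)   # Kalman gain
    mean_a = mean + gain * (obs_value - mean[obs_index])
    # deterministic square-root factor: no perturbed observations needed
    alpha = 1.0 / (1.0 + np.sqrt(obs_var / (hpht + obs_var)))
    pert_a = pert - alpha * np.outer(gain, hx)
    return mean_a[:, None] + pert_a
```

The scaling by `alpha` makes the analysis ensemble variance at the observed point match the exact Kalman value hpht·R/(hpht+R), which is why the square root filter gives a readily available, consistent covariance estimate at each step.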


The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided, and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of all the models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all the performance metrics studied.
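The variance ratio method named above, one of the two published reference models, can be sketched in a few lines; the array layout (aligned concurrent records, then a long-term reference series) is assumed for the illustration:

```python
import numpy as np

def variance_ratio_mcp(site, ref, ref_longterm):
    """Variance-ratio MCP: choose the linear relation that reproduces the
    mean AND the standard deviation of the concurrent site data, then
    apply it to the long-term reference record."""
    slope = np.std(site, ddof=1) / np.std(ref, ddof=1)
    offset = np.mean(site) - slope * np.mean(ref)
    return offset + slope * np.asarray(ref_longterm, float)
```

Unlike an ordinary least-squares regression, this fit preserves the variance of the predicted series by construction, which matters when the long-term prediction is summarised by Weibull scale and shape factors.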


Accurate decadal climate predictions could be used to inform adaptation actions to a changing climate. The skill of such predictions from initialised dynamical global climate models (GCMs) may be assessed by comparing with predictions from statistical models which are based solely on historical observations. This paper presents two benchmark statistical models for predicting both the radiatively forced trend and internal variability of annual mean sea surface temperatures (SSTs) on a decadal timescale, based on the gridded observation data set HadISST. For both statistical models, the trend related to radiative forcing is modelled using a linear regression of the SST time series at each grid box on the time series of equivalent global mean atmospheric CO2 concentration. The residual internal variability is then modelled by (1) a first-order autoregressive model (AR1) and (2) a constructed analogue model (CA). From the verification of 46 retrospective forecasts with start years from 1960 to 2005, the correlation coefficient for anomaly forecasts using trend with AR1 is greater than 0.7 over parts of the extra-tropical North Atlantic, the Indian Ocean and the western Pacific. This is primarily related to the prediction of the forced trend. More importantly, both CA and AR1 give skillful predictions of the internal variability of SSTs in the subpolar gyre region over the far North Atlantic for lead times of 2 to 5 years, with correlation coefficients greater than 0.5. For the subpolar gyre and parts of the South Atlantic, CA is superior to AR1 for lead times of 6 to 9 years. These statistical forecasts are also compared with ensemble mean retrospective forecasts by DePreSys, an initialised GCM.
DePreSys is found to outperform the statistical models over large parts of the North Atlantic for lead times of 2 to 5 years and 6 to 9 years; however, trend with AR1 is generally superior to DePreSys in the North Atlantic Current region, while trend with CA is superior to DePreSys in parts of the South Atlantic for lead times of 6 to 9 years. These findings encourage further development of benchmark statistical decadal prediction models, and of methods to combine different predictions.
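The "trend with AR1" benchmark described above can be sketched for one grid box. The series are assumed to be aligned annual arrays, and `co2_future` is the equivalent CO2 concentration at the forecast's verification year; this is an illustration of the model structure, not the paper's code:

```python
import numpy as np

def fit_trend(sst, co2):
    """Forced trend: linear regression of a grid-box SST series on
    equivalent CO2 concentration; returns (intercept, slope)."""
    slope, intercept = np.polyfit(co2, sst, 1)
    return intercept, slope

def fit_ar1(resid):
    """AR1 damping coefficient, estimated as the lag-1 regression slope
    of the detrended residuals."""
    return np.polyfit(resid[:-1], resid[1:], 1)[0]

def trend_ar1_forecast(sst, co2, co2_future, lead):
    """Extrapolate the forced trend to co2_future and damp the last
    observed internal anomaly by phi**lead."""
    a, b = fit_trend(sst, co2)
    resid = np.asarray(sst, float) - (a + b * np.asarray(co2, float))
    phi = fit_ar1(resid)
    return a + b * co2_future + phi ** lead * resid[-1]
```

The damping term is what gives the model skill at lead times of 2 to 5 years where the last observed anomaly still carries information, while at longer leads the forecast collapses onto the forced trend.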


Diffuse pollution, and the contribution from agriculture in particular, has become increasingly important as pollution from point sources has been addressed by wastewater treatment. Land management approaches, such as construction of field wetlands, provide one group of mitigation options available to farmers. Although field wetlands are widely used for diffuse pollution control in temperate environments worldwide, there is a shortage of evidence for the effectiveness and viability of these mitigation options in the UK. The Mitigation Options for Phosphorus and Sediment Project aims to make recommendations regarding the design and effectiveness of field wetlands for diffuse pollution control in UK landscapes. Ten wetlands have been built on four farms in Cumbria and Leicestershire. This paper focuses on sediment retention within the wetlands, estimated from annual sediment surveys in the first two years, and discusses establishment costs. It is clear that the wetlands are effective in trapping a substantial amount of sediment. Estimates of annual sediment retention suggest higher trapping rates at sandy sites (0.5–6 t ha⁻¹ yr⁻¹), compared to silty sites (0.02–0.4 t ha⁻¹ yr⁻¹) and clay sites (0.01–0.07 t ha⁻¹ yr⁻¹). Establishment costs for the wetlands ranged from £280 to £3100 and depended more on site-specific factors, such as fencing and gateways on livestock farms, than on wetland size or design. Wetlands with lower trapping rates would also have lower maintenance costs, as dredging would be required less frequently. The results indicate that field wetlands show promise for inclusion in agri-environment schemes, particularly if capital payments can be provided for establishment, to encourage uptake of these multi-functional features.


Data analysis based on station observations reveals that many meteorological variables averaged over the Tibetan Plateau (TP) are closely correlated, and their trends during the past decades are well correlated with the rainfall trend of the Asian summer monsoon. However, such correlation does not necessarily imply causality. Further diagnosis confirms the existence of a weakening trend in TP thermal forcing, characterized by weakened surface sensible heat flux in spring and summer during the past decades. This weakening trend is associated with decreasing summer precipitation over northern South Asia and North China and increasing precipitation over northwestern China, South China, and Korea. An atmospheric general circulation model, the HadAM3, is employed to elucidate the causality between the weakening TP forcing and the change in the Asian summer monsoon rainfall. Results demonstrate that a weakening in surface sensible heating over the TP results in reduced summer precipitation in the plateau region and a reduction in the associated latent heat release in summer. These changes in turn result in the weakening of the near-surface cyclonic circulation surrounding the plateau and the subtropical anticyclone over the subtropical western North Pacific, similar to the results obtained from the idealized TP experiment in Part I of this study. The southerly that normally dominates East Asia, ranging from the South China Sea to North China, weakens, resulting in a weaker equilibrated Sverdrup balance between positive vorticity generation and latent heat release. Consequently, the convergence of water vapor transport is confined to South China, forming a unique anomaly pattern in monsoon rainfall, the so-called “south wet and north dry.” Because the weakening trend in TP thermal forcing is associated with global warming, the present results provide an effective means for assessing projections of regional climate over Asia in the context of global warming.


The necessity and benefits for establishing the international Earth-system Prediction Initiative (EPI) are discussed by scientists associated with the World Meteorological Organization (WMO) World Weather Research Programme (WWRP), World Climate Research Programme (WCRP), International Geosphere–Biosphere Programme (IGBP), Global Climate Observing System (GCOS), and natural-hazards and socioeconomic communities. The proposed initiative will provide research and services to accelerate advances in weather, climate, and Earth system prediction and the use of this information by global societies. It will build upon the WMO, the Group on Earth Observations (GEO), the Global Earth Observation System of Systems (GEOSS) and the International Council for Science (ICSU) to coordinate the effort across the weather, climate, Earth system, natural-hazards, and socioeconomic disciplines. It will require (i) advanced high-performance computing facilities, supporting a worldwide network of research and operational modeling centers, and early warning systems; (ii) science, technology, and education projects to enhance knowledge, awareness, and utilization of weather, climate, environmental, and socioeconomic information; (iii) investments in maintaining existing and developing new observational capabilities; and (iv) infrastructure to transition achievements into operational products and services.


In today's global economic conditions, improving the productivity of the construction industry is becoming more pressing than ever. Several factors impact the efficiency of construction operatives, but motivation is among the most important. Since low productivity is one of the significant challenges facing the construction industry in the State of Kuwait, the objective of this case study is to identify, explore, and rank the relative importance of the factors perceived to impact the motivational level of master craftsmen involved in primary construction trades. To achieve this objective, a structured questionnaire survey comprising 23 factors, which were shortlisted based on relevant previous research on motivation, the input of local industry experts, and numerous interviews with skilled operatives, was distributed to a large number of master craftsmen. Using the “Relative Importance Index” technique, the following prominent factors are identified: (1) payment delay; (2) rework; (3) lack of a financial incentive scheme; (4) the extent of change orders during execution; (5) incompetent supervisors; (6) delays in responding to Requests For Information (RFI); (7) overcrowding and operatives interface; (8) unrealistic scheduling and performance expectations; (9) shortage of materials on site; and (10) drawings quality level. The findings can be used to provide industry practitioners with guidance for focusing on, acting upon, and controlling the critical factors influencing the performance of master craftsmen, and hence assist in achieving efficient utilization of the workforce and a reasonable level of competitiveness and cost-effective operation.
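The Relative Importance Index named above is conventionally computed as RII = ΣW / (A × N), where W are the Likert ratings a factor received, A is the highest possible rating and N the number of respondents. A minimal sketch, with hypothetical ratings rather than the study's data:

```python
def relative_importance_index(ratings, scale_max=5):
    """RII = sum of ratings / (highest rating x number of respondents).
    Values lie in (0, 1]; higher means more important."""
    return sum(ratings) / (scale_max * len(ratings))

def rank_factors(responses, scale_max=5):
    """Rank factor names by descending RII.  `responses` maps a factor
    name to the list of Likert ratings it received."""
    rii = {name: relative_importance_index(r, scale_max)
           for name, r in responses.items()}
    return sorted(rii, key=rii.get, reverse=True)
```

For example, a factor rated [5, 5, 4] by three respondents has RII 14/15 ≈ 0.93 and would rank above one rated [3, 4, 4].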


Alverata: a typeface design for Europe. This typeface is a response to the extraordinarily diverse forms of letters of the Latin alphabet in manuscripts and inscriptions of the Romanesque period (c. 1000–1200). While the Romanesque did provide inspiration for architectural lettering in the nineteenth century, these letterforms have not until now been systematically considered and redrawn as a working typeface. The defining characteristic of the Romanesque letterform is variety: within an individual inscription or written text, letters such as A, C, E and G might appear with different forms at each appearance. Some of these forms relate to earlier Roman inscriptional forms and are therefore familiar to us, but others are highly geometric and resemble insular and uncial forms. The research underlying the typeface involved the collection of a large number of references for lettering of this period, from library research and direct on-site investigation. This investigation traced the wide dispersal of the Romanesque lettering tradition across the whole of Europe. The variety of letter widths and weights encountered, as well as variant shapes for individual letters, offered both direct models and stylistic inspiration for the characters and for the widths and weight variants of the typeface. The ability of the OpenType format to handle multiple stylistic variants of any one character has been exploited to reflect the multiplicity of forms available to stonecutters and scribes of the period. To make a typeface that functions in a contemporary environment, a lower case has been added, and formal and informal variants supported. The pan-European nature of the Romanesque design tradition has inspired a pan-European approach to the character set of the typeface, allowing for text composition in all European languages, and the typeface has been extended into Greek and Cyrillic, so that the broadest representation of European languages can be achieved.


With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating meteorological observations from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified also for the conventional observations. We have a fairly good coverage of surface observations 8 times a day, and several upper air stations are making radiosonde and radiowind observations 4 times a day. If we have a 3-hour step in the analysis-forecasting cycle instead of the 12 hours that is applied most often, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and the observations even during strong transient motion would fall within a horizontal mesh of 500 km × 500 km.
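The grid-point interpolation described here was classically performed by successive correction. A minimal single pass of the Cressman scheme, a standard objective-analysis method of the period, can be sketched as follows; the flat arrays and nearest-grid-point first guess are simplifying assumptions of the illustration, not any weather service's operational analysis:

```python
import numpy as np

def cressman_pass(grid_pts, bg, obs_pts, obs_vals, radius):
    """One pass of Cressman successive correction: spread observation-
    minus-background increments onto the analysis grid with weights
    (R^2 - d^2) / (R^2 + d^2) inside the influence radius R."""
    grid_pts = np.asarray(grid_pts, float)
    obs_pts = np.asarray(obs_pts, float)
    bg = np.asarray(bg, float)
    # first guess at each observation: value of the nearest grid point
    dists = np.linalg.norm(grid_pts[:, None, :] - obs_pts[None, :, :], axis=2)
    innov = np.asarray(obs_vals, float) - bg[np.argmin(dists, axis=0)]
    analysis = bg.copy()
    for i in range(len(grid_pts)):
        d2 = dists[i] ** 2
        w = np.where(d2 < radius ** 2,
                     (radius ** 2 - d2) / (radius ** 2 + d2), 0.0)
        if w.sum() > 0.0:
            analysis[i] += np.sum(w * innov) / w.sum()
    return analysis
```

Because grid points outside every observation's influence radius keep the background (i.e. the previous forecast) unchanged, this scheme already embodies the analysis-forecasting cycle discussed above: shortening the cycle simply feeds fresher backgrounds into each pass.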