989 results for Climatic data simulation


Relevance:

90.00%

Publisher:

Abstract:

This research uses the multivariate geochemical dataset generated by the Tellus project to investigate transformation methods that maintain the integrity of geochemical data and the inherently constrained behaviour of its multivariate relationships. The widely used normal score transform is compared with a stepwise conditional transform technique. The Tellus Project, managed by GSNI and funded by the Department of Enterprise, Trade and Development and the EU's Building Sustainable Prosperity Fund, is the most comprehensive geological mapping project ever undertaken in Northern Ireland. Previous studies have demonstrated spatial variability in the Tellus data, but geostatistical analysis and interpretation of the datasets require a methodology that reproduces the inherently complex multivariate relations. Previous investigations of the Tellus geochemical data have used Gaussian-based techniques; however, earth science variables are rarely Gaussian, so transformation of the data is integral to the approach. In particular, the stepwise conditional transform is investigated and developed for the Tellus geochemical datasets. The transform is applied to four variables in a bivariate nested fashion owing to the limited availability of data. Simulation of the transformed variables is then carried out, along with back transformation to original units. Results show that the stepwise transform successfully reproduces both the univariate statistics and the complex bivariate relations exhibited by the data.
Greater fidelity to multivariate relationships will improve uncertainty models, which are required for subsequent geological, environmental and economic inferences.
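As a concrete illustration of the kind of transform discussed above, the quantile-based normal score transform can be sketched as follows (a generic Python sketch; the synthetic data and plotting positions are illustrative assumptions, not the Tellus workflow):

```python
import numpy as np
from scipy import stats

def normal_score_transform(x):
    """Map data to standard normal scores via their empirical ranks.

    Minimal sketch: ranks are converted to plotting positions
    (r - 0.5)/n and mapped through the inverse normal CDF.
    """
    x = np.asarray(x, dtype=float)
    ranks = stats.rankdata(x)          # average ranks for ties
    p = (ranks - 0.5) / len(x)         # plotting positions in (0, 1)
    return stats.norm.ppf(p)           # standard normal scores

# Example: skewed, geochemical-like concentrations
rng = np.random.default_rng(0)
conc = rng.lognormal(mean=1.0, sigma=0.8, size=1000)
y = normal_score_transform(conc)
print(round(float(y.mean()), 2), round(float(y.std()), 2))  # near 0 and 1
```

The back transformation used before reporting results in original units is the inverse mapping through the same empirical quantiles.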

Relevance:

90.00%

Publisher:

Abstract:

Darwin's climate is hot and humid, and as a result residential air-conditioner use is high. Although this technology allows occupants to achieve thermal comfort, its use contributes directly to increased greenhouse gas emissions. More environmentally friendly ways of achieving residential thermal comfort in this climate need to be investigated; one method is to improve a home's passive design. The aim of this research was to increase the thermal comfort of typical Darwin homes without the use of air conditioning. Temperature data from two houses (lightweight elevated and concrete) were recorded over a nine-day period and used to validate a TRNSYS simulation model of each house. Simulations were run using these validated models and three months of climatic data (January–March) to evaluate various passive design strategies, whose success was analysed using the PMV and PPD indicators. As a single strategy, increased ventilation and air velocity raised the level of thermal comfort for occupants of both houses by far the most. Although increased shading and insulation were also beneficial, Darwin's overnight temperature and humidity remain too high for these passive strategies alone to reduce indoor levels significantly without air conditioning.
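The PPD indicator used in such comfort analyses can be computed from PMV with the standard ISO 7730 relation. The sketch below applies it to hypothetical PMV values, not the study's TRNSYS outputs:

```python
import math

def ppd_from_pmv(pmv):
    """Predicted Percentage Dissatisfied from PMV (ISO 7730 relation)."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

# Illustrative values only: a hot-humid afternoon before and after
# adding ventilation (assumed PMV values, not the study's data).
for label, pmv in [("baseline", 2.0), ("with ventilation", 1.0)]:
    print(f"{label}: PMV={pmv:+.1f} -> PPD={ppd_from_pmv(pmv):.0f}%")
```

Note that PPD bottoms out at 5% when PMV is zero: even nominally neutral conditions leave some occupants dissatisfied.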

Relevance:

90.00%

Publisher:

Abstract:

Light-frame wood buildings are widely built in the United States (U.S.), and natural hazards cause huge losses to light-frame wood construction. This study proposes methodologies and a framework to evaluate the performance and risk of light-frame wood construction. Performance-based engineering (PBE) aims to ensure that a building achieves the desired performance objectives when subjected to hazard loads. In this study, the collapse risk of a typical one-story light-frame wood building is determined using the Incremental Dynamic Analysis method. The collapse risks of buildings at four sites in the Eastern, Western, and Central regions of the U.S. are evaluated. Various sources of uncertainty are considered in the assessment so that their influence on the collapse risk of light-frame wood construction can be evaluated. The collapse risks of the same building subjected to maximum considered earthquakes in different seismic zones are found to be non-uniform. In certain areas of the U.S., snow accumulation is significant, causes large economic losses, and threatens life safety, yet limited work has investigated the snow hazard in combination with the seismic hazard. A Filtered Poisson Process (FPP) model is developed in this study, overcoming the shortcomings of the typically used Bernoulli model. The FPP model is validated by comparing simulation results to weather records obtained from the National Climatic Data Center. The FPP model is applied in the proposed framework to assess the risk of a light-frame wood building subjected to combined snow and earthquake loads. Snow accumulation has a significant influence on the seismic losses of the building, and the Bernoulli snow model underestimates the seismic loss of buildings in areas with snow accumulation. An object-oriented framework is proposed in this study to perform risk assessment for light-frame wood construction.
For homeowners and stakeholders, risk expressed as economic losses is much easier to understand than engineering parameters (e.g., inter-story drift). The proposed framework is used in two applications. One is to assess the loss of a building subjected to mainshock–aftershock sequences, where aftershock and downtime costs are found to be important factors in the seismic losses. The framework is also applied to a wood building in the state of Washington to assess its loss under combined earthquake and snow loads. The proposed framework proves to be an appropriate tool for risk assessment of buildings subjected to multiple hazards. Limitations and future work are also identified.
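The filtered Poisson idea behind such a snow model can be sketched as follows: storm arrivals form a Poisson process, and each random deposit is filtered through a decay (melt) response between events. All parameter values below are illustrative assumptions, not the study's calibration against National Climatic Data Center records:

```python
import numpy as np

def simulate_snow_load(rate_per_day=0.1, mean_deposit=0.5,
                       melt_per_day=0.05, days=120, seed=1):
    """Filtered-Poisson-style ground snow load sketch (assumed parameters).

    Storms arrive as a Poisson process; each adds an exponentially
    distributed load (kPa); the load decays linearly (melt) between storms.
    """
    rng = np.random.default_rng(seed)
    load = np.zeros(days)
    current = 0.0
    for d in range(days):
        n_storms = rng.poisson(rate_per_day)                 # storms today
        current += rng.exponential(mean_deposit, n_storms).sum()
        current = max(current - melt_per_day, 0.0)           # melt, floor at 0
        load[d] = current
    return load

season = simulate_snow_load()
print(f"peak load {season.max():.2f} kPa on day {int(season.argmax())}")
```

Unlike a Bernoulli snow model, which only flags whether snow is present, this process produces a time-varying load that can accumulate across storms, which is why it can coincide with an earthquake at a nonzero value.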

Relevance:

90.00%

Publisher:

Abstract:

One-dimensional dynamic computer simulation was employed to investigate the separation and migration order change of ketoconazole enantiomers at low pH in the presence of increasing amounts of (2-hydroxypropyl)-β-cyclodextrin (OHP-β-CD). The 1:1 interaction of ketoconazole with the neutral cyclodextrin was simulated under real experimental conditions and with varying input parameters for complex mobilities and complexation constants. Simulation results obtained with experimentally determined apparent ionic mobilities, complex mobilities, and complexation constants were found to compare well with the calculated separation selectivity and with experimental data. The simulations revealed that the migration order of the ketoconazole enantiomers at low OHP-β-CD concentrations (i.e. below the migration order inversion) is essentially determined by the difference in complexation constants, and at high OHP-β-CD concentrations (i.e. above the inversion) by the difference in complex mobilities. Furthermore, simulations with complex mobilities set to zero provided data that mimic migration order and separation with the chiral selector immobilized. For the studied CEC configuration, no migration order inversion is predicted, and separations are shown to be quicker, with reduced electrophoretic transport, in comparison to migration in free solution. These data illustrate that dynamic computer simulation is a valuable tool to study the electrokinetic migration and separation of enantiomers in the presence of a complexing agent.
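The 1:1 complexation model underlying such simulations gives an effective mobility mu_eff = (mu_free + mu_c*K*C) / (1 + K*C). The sketch below shows how a migration order inversion arises when binding strength and complex mobility pull in opposite directions; all parameter values are hypothetical, not the measured ketoconazole constants:

```python
import numpy as np

def effective_mobility(mu_free, mu_complex, K, C):
    """1:1 complexation: mu_eff = (mu_free + mu_c*K*C) / (1 + K*C)."""
    return (mu_free + mu_complex * K * C) / (1.0 + K * C)

# Hypothetical enantiomer parameters: enantiomer A binds more strongly
# (larger K, so it is slowed more at low C) but its complex migrates
# faster (larger mu_complex, so it wins at high C) -> order inversion.
C = np.linspace(0.0, 30.0, 301)                     # selector conc., mM
mu_a = effective_mobility(20.0, 8.0, K=0.50, C=C)   # strong binder, fast complex
mu_b = effective_mobility(20.0, 4.0, K=0.20, C=C)   # weak binder, slow complex
inversion = C[np.argmax(mu_a > mu_b)]               # first C where order flips
print(f"migration order inverts near {inversion:.1f} mM")
```

At low C the K difference dominates (A trails B); at high C both mobilities approach their complex mobilities, so A overtakes B, reproducing the two regimes described in the abstract.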

Relevance:

90.00%

Publisher:

Abstract:

The use of hindcast climatic data is widespread across many applications. However, this approach needs the support of a validation process so that its drawbacks, and therefore the confidence levels it warrants, can be assessed. In this work, the strategy relies on an hourly wind database resulting from a dynamical downscaling experiment with a spatial resolution of 10 km, covering the Iberian Peninsula (IP), driven by the ERA40 reanalysis (1959–2001) extended with European Centre for Medium-Range Weather Forecasts (ECMWF) analyses (2002–2007), and comprises two main steps. First, the skill of the simulation is evaluated by comparison with a quality-tested observational database (Lorente-Plazas et al., 2014) at local and regional scales. The results show that the model is able to portray the main features of the wind over the IP: annual cycles, wind roses, and spatial and temporal variability, as well as the response to different circulation types. In addition, the simulation adds significant value with respect to the driving conditions, especially in regions with complex orography. However, some problems are evident, the major drawback being a systematic overestimation of the wind speed, mainly attributed to a misrepresentation of frictional forces. Model skill is also lower along the Mediterranean coast and over the Pyrenees. In a second phase, the high spatio-temporal resolution of the pseudo-real wind database is used to explore the limitations of the observational database. It is shown that missing values do not affect the characterisation of the wind climate over the IP, and that the length of the observational period (6 years) is sufficient for most regions, with only a few exceptions. The spatial distribution of the observational sampling, however, should be enhanced to correctly capture all IP wind regimes, particularly in some mountainous areas.
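A validation step like the one described can be sketched with standard skill scores (bias, RMSE, linear correlation). The series below are synthetic stand-ins for the observational and downscaled wind-speed databases, with the model given a deliberate positive bias to mirror the reported overestimation:

```python
import numpy as np

def wind_skill(obs, model):
    """Basic validation scores for a simulated wind-speed series against
    observations (a generic sketch, not the paper's exact metric set)."""
    obs, model = np.asarray(obs, float), np.asarray(model, float)
    bias = float(np.mean(model - obs))                    # mean error, m/s
    rmse = float(np.sqrt(np.mean((model - obs) ** 2)))    # spread of errors
    corr = float(np.corrcoef(obs, model)[0, 1])           # temporal agreement
    return {"bias": bias, "rmse": rmse, "corr": corr}

# Synthetic hourly wind speeds: model overestimates, as reported for the IP runs
rng = np.random.default_rng(2)
obs = rng.gamma(shape=2.0, scale=3.0, size=500)           # observed, m/s
model = 1.15 * obs + rng.normal(0.0, 1.0, size=500)       # biased simulation
scores = wind_skill(obs, model)
print({k: round(v, 2) for k, v in scores.items()})
```

A positive bias with high correlation is exactly the signature described in the abstract: the model follows the observed variability but systematically overestimates the speed.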

Relevance:

90.00%

Publisher:

Abstract:

The subject of this thesis is the soil moisture regime of peninsular Spain, determined from climatic data following the methodology of the United States soil taxonomy (Soil Survey Staff, 1975, 1994). This methodology leaves some points undefined, which this work aims to resolve. The research consisted of classifying the soil moisture regimes of peninsular Spain and mapping them. Several methods of determining evapotranspiration and several models for estimating the moisture regime were considered. Numerical classification of the regimes at 467 localities allowed them to be grouped into classes with natural subdivisions. Contrasting this information with vegetation-series cartography, by means of a raster geographic information system, served to refine the maps. The results show that a modified model resolves the ambiguities and allows the groups to be adapted to natural conditions. SUMMARY: The soil moisture regime defined by the Soil Taxonomy (Soil Survey Staff, 1975, 1994) has been determined with Newhall's simulation model from climatic data. This classification presents some difficulties, such as gaps and overlaps in its definitions, which we have tried to resolve. The soil moisture regimes have been determined by different methods and the results have been classified and mapped. We have compared different methods of evapotranspiration estimation. A simple modification of Newhall's model matches the natural conditions of Spain better when compared with the potential vegetation. A raster geographical information system has been used to overlay the information layers. As a result of the numerical classification of the soil moisture regimes of 467 sites, the regimes have been grouped into classes adapted to the natural conditions of Spain.
We have compared the results with the potential vegetation map in order to tune the soil moisture regime boundaries. We propose a new soil moisture regime classification divided into two categories, adapted to Spanish natural conditions.
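A minimal monthly bucket water balance conveys the kind of computation a moisture-regime model performs from climatic data. This is a simplified stand-in: the actual Newhall model tracks moisture in depth slices of a moisture control section, and all monthly values below are illustrative assumptions:

```python
def soil_moisture_balance(precip, pet, capacity=100.0):
    """Minimal monthly bucket water balance (not the full Newhall model).

    precip, pet: monthly totals in mm; capacity: available water, mm.
    Returns the end-of-month soil moisture storage for each month.
    """
    storage, out = capacity, []
    for p, e in zip(precip, pet):
        # add rain, subtract evapotranspiration, clamp to [0, capacity]
        storage = min(max(storage + p - e, 0.0), capacity)
        out.append(storage)
    return out

# Illustrative Mediterranean-type climate (assumed values, mm/month, Jan-Dec)
precip = [60, 55, 50, 45, 35, 15, 5, 8, 30, 55, 65, 70]
pet    = [15, 20, 40, 60, 90, 130, 150, 140, 95, 55, 25, 15]
storage = soil_moisture_balance(precip, pet)
dry_months = sum(1 for s in storage if s == 0.0)
print(f"months with an empty moisture store: {dry_months}")
```

A long run of empty-store months like this is the kind of evidence a classification would use to assign a xeric-type regime; the thesis's modified model refines exactly such boundary decisions.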

Relevance:

90.00%

Publisher:

Abstract:

In the last few years there has been heightened interest in data treatment and analysis with the aim of discovering hidden knowledge and eliciting relationships and patterns within data. Data mining techniques (also known as Knowledge Discovery in Databases) have been applied over a wide range of fields such as marketing, investment, fraud detection, manufacturing, telecommunications and health. In this study, well-known data mining techniques such as artificial neural networks (ANN), genetic programming (GP), forward-selection linear regression (LR) and k-means clustering are proposed to the health and sports community as aids to resistance training prescription. Appropriate resistance training prescription is effective for developing fitness and health and for enhancing general quality of life. Resistance exercise intensity is commonly prescribed as a percentage of the one repetition maximum (1RM). The 1RM, or dynamic muscular strength, is operationally defined as the heaviest load that can be moved over a specific range of motion, one time, with correct technique. The safety of 1RM assessment has been questioned, as such a maximal effort may lead to muscular injury. Prediction equations help to estimate the 1RM from submaximal loads in order to avoid, or at least reduce, the associated risks. We built different models from data on 30 men who performed up to 5 sets to exhaustion at different percentages of the 1RM in the bench press, until reaching their actual 1RM. A comparison of existing prediction equations is also carried out. The LR model seems to outperform the ANN and GP models for 1RM prediction in the range between 1 and 10 repetitions.
At 75% of the 1RM some subjects (n = 5) could perform 13 repetitions with proper technique in the bench press, whilst other subjects (n = 20) performed significantly more repetitions (p < 0.05) at 70% than at 75% of their actual 1RM. Rating of perceived exertion (RPE) seems not to be a good predictor of the 1RM when all sets are performed to exhaustion, as no significant differences (p < 0.05) were found in the RPE at 75%, 80% and 90% of the 1RM. Also, years of experience and weekly hours of strength training are better correlated with the 1RM (p < 0.05) than body weight. The O'Connor et al. equation seems to arise naturally from the data gathered and appears to be the most accurate of the 1RM prediction equations proposed in the literature and used in this study. Epley's 1RM prediction equation is reproduced by means of data simulation from the literature equations. Finally, future lines of research are proposed relating to 1RM prediction by means of genetic algorithms, neural networks and clustering techniques. RESUMEN (translated): In recent years there has been growing interest in data treatment and analysis with the aim of discovering relationships, patterns and hidden knowledge in data. Data mining techniques (also called Knowledge Discovery in Databases) have been applied consistently across a broad spectrum of areas such as marketing, investment, fraud detection, industrial production, telecommunications and health. In this study, well-known data mining techniques such as artificial neural networks (ANN), genetic programming (GP), linear regression with forward selection (LR) and the k-means clustering technique are proposed to the sport and health community with the aim of helping with resistance training prescription.
Appropriate resistance training prescription is effective not only for improving general fitness, but also for improving health and increasing quality of life. Intensity in a resistance exercise is generally prescribed as a percentage of the one repetition maximum. The 1RM, or dynamic muscular strength, is operationally defined as the heaviest load that can be moved over a specific range of motion, once, with correct technique. The safety of 1RM testing has been questioned because the great effort required can lead to serious muscular injury. Prediction equations can help to tackle the problem of predicting the 1RM from submaximal loads and are used with the aim of eliminating, or at least reducing, the associated risks. In this study, different models were built from data collected on 30 men who performed up to 5 sets to exhaustion in the bench press at different percentages of the 1RM, until reaching their actual 1RM. A comparison of some previously proposed prediction equations is also presented. The LR model seems to outperform the ANN and GP models for 1RM prediction between 1 and 10 repetitions. At 75% of the 1RM some subjects (n = 5) could perform 13 repetitions with proper technique in the bench press, while others (n = 20) performed significantly (p < 0.05) more repetitions at 70% than at 75% of their 1RM. The rating of perceived exertion (RPE) seems not to be a good predictor of the 1RM when all sets are performed to exhaustion, since no significant differences (p < 0.05) were found in the RPE at 75%, 80% and 90% of the 1RM. In addition, years of experience and weekly hours of strength training are more strongly correlated with the 1RM (p < 0.05) than body weight.
The O'Connor et al. equation seems to arise from the data collected and appears to be the most accurate of the 1RM prediction equations proposed in the literature and used in this study. Epley's 1RM prediction equation is reproduced by data simulation from previously proposed 1RM equations. Finally, future lines of research are proposed relating to 1RM prediction by means of genetic algorithms, neural networks and clustering techniques.
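The two named prediction equations are standard in the strength-training literature and easy to state: Epley estimates 1RM = w(1 + r/30) and O'Connor et al. estimate 1RM = w(1 + 0.025r), where w is the load lifted and r the repetitions performed to exhaustion. A minimal sketch:

```python
def epley_1rm(weight, reps):
    """Epley equation: 1RM = w * (1 + r/30)."""
    return weight * (1.0 + reps / 30.0)

def oconnor_1rm(weight, reps):
    """O'Connor et al. equation: 1RM = w * (1 + 0.025*r)."""
    return weight * (1.0 + 0.025 * reps)

# Example: an 80 kg bench press performed for 8 repetitions to exhaustion
w, r = 80.0, 8
print(f"Epley:    {epley_1rm(w, r):.1f} kg")
print(f"O'Connor: {oconnor_1rm(w, r):.1f} kg")
```

The two estimates diverge as repetitions increase (1/30 per rep versus 0.025 per rep), which is one reason such equations are usually recommended only for low repetition ranges.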

Relevance:

90.00%

Publisher:

Abstract:

Vernacular architecture has demonstrated excellent environmental adaptation through its empirical development and improvement by generations of user-builders. Nowadays, the sustainability of vernacular architecture is the aim of several research projects, which should apply the same method in order to be comparable. Hence, we propose a research method comprising several steps. Vernacular architecture is analysed through its geographical, lithological, economic, cultural and social influences, as well as its materials and constructive systems. All this information is then combined with the natural landscape (topography and vegetation) and the climatic data (temperature, wind, rain and sun exposure). In addition, the use of bioclimatic charts, such as Olgyay's or Givoni's, reveals the necessities and strategies in urban and building design. These needs are satisfied in vernacular architecture by different energy conservation mechanisms, some of which are shown through examples in this paper.
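A bioclimatic-chart check like the one described can be sketched programmatically. The rectangular bounds below are a crude illustrative stand-in for the Givoni comfort polygon, not its actual limits on the psychrometric chart:

```python
def in_comfort_zone(temp_c, rel_humidity):
    """Very rough rectangular stand-in for a comfort zone check.

    The real Givoni zone is a polygon on the psychrometric chart;
    these bounds are illustrative assumptions only.
    """
    return 21.0 <= temp_c <= 27.0 and 20.0 <= rel_humidity <= 80.0

# Hypothetical hourly climate records as (temperature degC, relative humidity %)
hours = [(19.0, 85.0), (23.5, 55.0), (26.0, 60.0), (31.0, 40.0)]
comfortable = [in_comfort_zone(t, rh) for t, rh in hours]
print(f"{sum(comfortable)}/{len(hours)} hours inside the comfort zone")
```

Hours falling outside the zone point to the design strategies (shading, ventilation, thermal mass) that the chart method associates with each out-of-zone region.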

Relevance:

90.00%

Publisher:

Abstract:

Collection of the monthly climatological reports of the United States by state or region, with monthly and annual national summaries.

Relevance:

90.00%

Publisher:

Abstract:

Twenty-eight microfiches (11 x 15 cm.) in pocket mounted on cover p. [3]. Header title: Historical climate network--temperature and precipitation data plots.

Relevance:

90.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

90.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

90.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

90.00%

Publisher:

Abstract:

No. 13 called: Climatological data, annual summary. Minnesota.