914 results for Semi-Empirical Methods


Relevance:

90.00%

Publisher:

Abstract:

Overtopping is defined as the transport of a significant volume of water over the crest of a structure. It is therefore the phenomenon that, in general, determines a breakwater's crest level, depending on the admissible overtopping volume in view of the structure's functional and structural constraints. In general, the overtopping volume a breakwater can tolerate from the standpoint of structural integrity is far greater than the volume permissible from the standpoint of functionality. On the other hand, designing a breakwater for a very low or zero probability of overtopping would lead to designs incompatible with other considerations, such as aesthetics or economics. Wave overtopping of the crown walls of maritime works can be studied in several ways. The most common are physical model tests and empirical or semi-empirical formulations; less common are prototype instrumentation, artificial neural networks and numerical models. Physical model tests are the most accurate and reliable tool for the case-specific study of overtopping, given the complexity of the process and the multitude of physical phenomena and parameters involved, especially when large models are used and wind is generated. Physical models reveal the hydraulic and structural behaviour of the breakwater, identifying possible design faults before construction and allowing alternatives to be evaluated, with the consequent savings in construction costs through improvements to the initial design. They do, however, suffer from the margins of error associated with scale and model effects. Empirical or semi-empirical formulations, obtained from laboratory tests, have the drawback that their use is limited by the applicability of the formulas: they depend not only on environmental conditions (wave height, wave period and water level) but also on the model's characteristics, and are valid only within the range of conditions and structural typologies reproduced in the underlying tests. The objective of this doctoral thesis is to contrast the overtopping formulations developed by different authors for various breakwater typologies. To this end, the existing formulations for estimating the overtopping rate on sloping and vertical breakwaters were first compiled and analysed. These formulations were then contrasted with the results of a series of tests performed at the Centre for Port and Coastal Studies (Centro de Estudios de Puertos y Costas) of CEDEX. The neural network tool NN-OVERTOPPING2, developed in the European overtopping project CLASH ("Crest Level Assessment of Coastal Structures by Full Scale Monitoring, Neural Network Prediction and Hazard Analysis on Permissible Wave Overtopping"), was then applied to the selected sloping-breakwater tests, contrasting the measured overtopping rates with this neural-network-based method. The influence of wind on overtopping was subsequently analysed through a series of reduced-scale physical model tests, generating waves with and without wind, on the vertical section of the Levante Breakwater (Málaga). Finally, a critical analysis of the contrast of each formulation against the selected tests is presented, leading to the conclusions of the thesis.
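The abstract does not reproduce the formulations it contrasts, but many of the semi-empirical overtopping formulas it refers to (the EurOtop/CLASH family among them) share a common exponential form. The Python sketch below illustrates that generic form only; the coefficients `a` and `b` are placeholders, since the fitted values depend on structure type, wave conditions and the range of the underlying tests.

```python
import math

def overtopping_rate(Hm0, Rc, a=0.2, b=2.6, g=9.81):
    """Mean overtopping discharge q (m^3/s per metre of crest) from the
    generic exponential form shared by many semi-empirical formulas:
        q / sqrt(g * Hm0**3) = a * exp(-b * Rc / Hm0)
    Hm0: spectral significant wave height at the toe (m)
    Rc:  crest freeboard (m)
    a, b: empirical coefficients (placeholder values, not fitted ones)
    """
    return a * math.exp(-b * Rc / Hm0) * math.sqrt(g * Hm0 ** 3)

# Example: Hm0 = 3 m, freeboard Rc = 4 m
print(f"q = {overtopping_rate(3.0, 4.0):.4f} m^3/s per m")
```

The dimensionless freeboard Rc/Hm0 drives the discharge exponentially, which is why the crest level is the key design variable discussed in the abstract.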

Relevance:

90.00%

Publisher:

Abstract:

In general, a planing hull is designed to reach high speeds. This performance attribute is directly related to the size of the vessel and the power installed in its propulsion plant. Traditionally, during the design of a vessel, performance analyses are carried out using results from existing vessels, taken from systematic series or from vessels already developed by the shipyard and/or designer. In addition, performance attributes can be determined through empirical and/or statistical methods, in which the vessel is represented by its main geometric parameters, or from tests on reduced-scale models or prototypes. In the specific case of planing hulls, the cost of reduced-scale tests is very high relative to the design cost, so most designers forgo experimental tests of new vessels under development. Over the years, the Savitsky method has been widely used to estimate the installed power of a planing hull. The method uses a set of semi-empirical equations to determine the forces acting on the hull, from which the equilibrium running attitude and the propulsive force required to sail at a given speed can be determined. The Savitsky method is widely used in the early design stages, when the hull geometry has not yet been fully defined, because it needs only the main geometric characteristics of the vessel to estimate the forces. As the design progresses, more detailed performance estimates are required: structural design, for example, requires an estimate of the pressure field acting on the bottom of the hull, which the Savitsky method cannot provide. The computational method implemented in this dissertation aims to determine the flow characteristics and the pressure field acting on the hull of a planing vessel sailing in calm water. The flow is determined by solving a boundary value problem in which the wetted surface of the hull is treated as a slender body. Thanks to slender-body theory, the problem can be treated section by section, where the boundary conditions are enforced through a vortex distribution.
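The Savitsky equations themselves are not given in the abstract; as a hedged illustration of the method's semi-empirical character, the sketch below implements the two lift-coefficient relations usually quoted for it (flat-plate lift and deadrise correction). The example inputs are illustrative only.

```python
import math

def savitsky_CL0(tau_deg, lam, Cv):
    """Savitsky (1964) lift coefficient for a flat planing surface:
        CL0 = tau^1.1 * (0.0120*lam^0.5 + 0.0055*lam^2.5 / Cv^2)
    tau_deg: trim angle (degrees); lam: mean wetted length/beam ratio;
    Cv: speed coefficient V / sqrt(g*b), with b the beam."""
    return tau_deg ** 1.1 * (0.0120 * lam ** 0.5 + 0.0055 * lam ** 2.5 / Cv ** 2)

def savitsky_CL_beta(CL0, beta_deg):
    """Deadrise correction: CLb = CL0 - 0.0065*beta*CL0^0.60 (beta in degrees)."""
    return CL0 - 0.0065 * beta_deg * CL0 ** 0.60

# Example: trim 4 deg, wetted length/beam ratio 3, Cv = 2.5, deadrise 15 deg
CL0 = savitsky_CL0(4.0, 3.0, 2.5)
print(f"CL_beta = {savitsky_CL_beta(CL0, 15.0):.4f}")
```

In the full method these relations are iterated with the drag and moment equations until the running trim balances weight, thrust and hydrodynamic lift.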

Relevance:

90.00%

Publisher:

Abstract:

The Free Core Nutation (FCN) is a free mode of the Earth's rotation caused by the different material characteristics of the Earth's core and mantle. These cause the rotational axes of the two layers to diverge slightly from each other, resulting in a wobble of the Earth's rotation axis comparable to the nutations. In this paper we focus on estimating empirical FCN models from the observed nutations derived from VLBI sessions between 1993 and 2013. Assuming a fixed value for the oscillation period, the time-variable amplitudes and phases are estimated by means of multiple sliding-window analyses. The effects of using different a priori Earth Rotation Parameters (ERP) in the derivation of the models are also addressed. The optimal choice of the fundamental parameters of the model, namely the window width and the step size of its shift, is sought through a thorough experimental analysis using real data. These analyses lead to a model with a temporal resolution higher than that of currently available models, with the sliding window reduced to 400 days and a day-by-day shift. It is shown that this new model improves the accuracy of the modelling of the observed Earth rotation. Moreover, according to our computations, empirical models determined with USNO Finals as a priori ERP present a slightly lower Weighted Root Mean Square (WRMS) of residuals than those using IERS 08 C04 over the whole period of VLBI observations. The model is also validated through comparisons with other recognised models, with a satisfactory level of agreement among them; our estimates give rise to the lowest residuals and appear to reproduce the FCN signal in more detail.
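The sliding-window estimation described in the abstract can be sketched as a windowed least-squares fit of sine and cosine amplitudes at a fixed FCN period. The sketch below assumes a retrograde period of -430.21 solar days (a commonly used value; the paper's exact constant is not stated in the abstract) and hypothetical input arrays `t`, `dX`, `dY` of VLBI-derived celestial pole offsets.

```python
import numpy as np

P_FCN = -430.21  # assumed FCN period in solar days (negative: retrograde)

def fit_fcn_window(t, dX, dY, t0=0.0):
    """Least-squares fit of constant FCN amplitude components (Ac, As)
    plus offsets (bX, bY) to celestial pole offsets dX, dY in one window:
        dX = Ac*cos(phi) - As*sin(phi) + bX
        dY = Ac*sin(phi) + As*cos(phi) + bY
    with phi = 2*pi*(t - t0)/P_FCN."""
    phi = 2.0 * np.pi * (t - t0) / P_FCN
    c, s = np.cos(phi), np.sin(phi)
    one, zero = np.ones_like(c), np.zeros_like(c)
    # stack the dX equations on top of the dY equations in one linear system
    A = np.block([[c[:, None], -s[:, None], one[:, None], zero[:, None]],
                  [s[:, None],  c[:, None], zero[:, None], one[:, None]]])
    y = np.concatenate([dX, dY])
    (Ac, As, bX, bY), *_ = np.linalg.lstsq(A, y, rcond=None)
    return Ac, As, np.hypot(Ac, As)  # components and amplitude magnitude

def sliding_fcn(t, dX, dY, width=400.0, step=1.0):
    """Slide a `width`-day window in `step`-day shifts; the 400-day window
    and day-by-day shift mirror the resolution discussed in the abstract."""
    centers, amps = [], []
    start = t.min()
    while start + width <= t.max():
        m = (t >= start) & (t < start + width)
        if m.sum() > 10:  # require enough sessions in the window
            centers.append(start + width / 2.0)
            amps.append(fit_fcn_window(t[m], dX[m], dY[m])[2])
        start += step
    return np.array(centers), np.array(amps)
```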

Relevance:

90.00%

Publisher:

Abstract:

Purpose – The purpose of the paper was to conduct an empirical investigation exploring the impact of project management maturity models (PMMMs) on improving project performance. Design/methodology/approach – The investigation used a cross-case analysis involving over 90 individuals in seven organisations. Findings – The findings of the empirical investigation indicate that PMMMs demonstrate very high levels of variability in individuals' assessments of project management maturity. Furthermore, at higher levels of maturity, the type of performance improvement adopted following their application is related to the type of PMMM used in the assessment. The paradox of the unreliability of PMMMs and their widespread acceptance is resolved by calling upon the "wisdom of crowds" phenomenon, which has implications for the use of maturity model assessments in other arenas. Research limitations/implications – The investigation has the usual issues associated with case research, but the steps taken in the cross-case construction and analysis have improved the overall robustness and extendibility of the findings. Practical implications – The tendency for PMMMs to shape improvements based on their own inherent structure needs further understanding. Originality/value – The use of empirical methods to investigate the link between project maturity models and extant changes in project management performance is highly novel, and the resulting findings have added resonance.

Relevance:

90.00%

Publisher:

Abstract:

This paper presents the main achievements of the author's PhD dissertation. The work is dedicated to mathematical and semi-empirical approaches applied to the case of Bulgarian wildland fires. After the introductory explanations, brief summaries are extracted from every chapter to cover the main results obtained. The methods used are described in brief and the main outcomes are listed. ACM Computing Classification System (1998): D.1.3, D.2.0, K.5.1.

Relevance:

90.00%

Publisher:

Abstract:

Following the development of the first exchange-rate target zone model at the end of the eighties, dozens of papers analysed theoretical and empirical aspects of currency bands. This paper reviews different empirical methods for analysing the credibility of the band, laying special emphasis on the most widely used method, the so-called drift-adjustment method. Papers applying that method claim that while forecasting a freely floating currency is hopeless, predicting the exchange rate's position within the future band can be done successfully. This paper shows that the results obtained in applications to EMS and Nordic currencies are not specific to target zone data: application to the US dollar, and indeed to most unit-root processes, leads to qualitatively the same results. The paper explores the sources of this apparent puzzle and presents a model of target zones, built on the main observed features of such regimes, in which the exchange rate within the band is not necessarily predictable, since the process may follow chaotic dynamics in the period before a devaluation.
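The drift-adjustment method referenced in the abstract is, in essence, a regression-based forecast of the within-band exchange rate whose fitted drift is subtracted from the interest differential to estimate the expected rate of devaluation. The sketch below is a minimal single-regressor version under that reading; published specifications often include further regressors, so this is a sketch of the idea, not the paper's exact procedure.

```python
import numpy as np

def drift_adjustment(x, i_diff, horizon):
    """Minimal sketch of the drift-adjustment method.
    x:       within-band position of the (log) exchange rate, per period
    i_diff:  domestic-minus-foreign interest differential for the same horizon
    horizon: forecast horizon h in observations (h >= 1)
    Regress the realized h-step within-band change on the current position,
    then subtract the fitted drift from the interest differential."""
    dx = x[horizon:] - x[:-horizon]               # realized h-step change
    X = np.column_stack([np.ones(len(dx)), x[:-horizon]])
    beta, *_ = np.linalg.lstsq(X, dx, rcond=None)
    drift = X @ beta                              # E[x_{t+h} - x_t | x_t]
    expected_devaluation = i_diff[:-horizon] - drift
    return beta, expected_devaluation
```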

Relevance:

90.00%

Publisher:

Abstract:

Piles are one of the most important foundation solutions adopted for buildings: they transmit the loads from the structure to deeper, more resistant soil layers. The interaction between the foundation element and the soil is a very important variable, and mastering it is indispensable for determining the strength of the assembly and for establishing design criteria for each application of the pile. In this research, analyses were performed using load tests on precast concrete piles and SPT soil investigations, and the ultimate load capacity of the foundation was obtained through load-settlement curve extrapolation methods as well as semi-empirical and theoretical methods. Comparisons were then made between the different methods for two soil types, one of granular behaviour and the other cohesive. To obtain the soil parameters used in the methods, empirical correlations with the standard penetration number (NSPT) were established. The load-settlement curves of the piles were also analysed. From the comparisons, the semi-empirical method of Décourt-Quaresma was indicated as the most reliable for estimating bearing capacity in both granular and cohesive soils, while among the extrapolation methods studied, the Van der Veen method is recommended as the most appropriate for predicting the ultimate load.
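As a hedged illustration of the extrapolation step, the sketch below fits the Van der Veen load-settlement law to a load test and extrapolates the ultimate load. The data values are invented for the example, not taken from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def van_der_veen(s, Q_ult, a):
    """Van der Veen (1953) load-settlement law: Q = Q_ult * (1 - exp(-a*s))."""
    return Q_ult * (1.0 - np.exp(-a * s))

# Hypothetical load-test data: settlement s (mm) vs applied load Q (kN)
s = np.array([1.0, 2.0, 4.0, 6.0, 9.0, 13.0, 18.0])
Q = np.array([310., 520., 790., 950., 1100., 1210., 1280.])

(Q_ult, a), _ = curve_fit(van_der_veen, s, Q, p0=[Q.max() * 1.5, 0.1])
print(f"extrapolated ultimate load ~ {Q_ult:.0f} kN (a = {a:.3f} 1/mm)")
```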

Relevance:

90.00%

Publisher:

Abstract:

This work, the final project of a master's degree in Construction Engineering, studied the behaviour of flexible, multi-propped earth-retaining structures (with different types of support) in two types of homogeneous soil. Classical theories developed for rigid earth-retaining structures, such as Rankine's, were used, together with the semi-empirical theories of Terzaghi & Peck, which culminated in the Terzaghi & Peck diagrams. Although the Terzaghi & Peck diagrams are earth pressure diagrams intended for flexible earth-retaining structures, they have some important limitations, such as being applicable only to certain homogeneous soil profiles, whether or not a water table is present, and not providing the earth pressure distribution in the passive (embedded) zone. Since finite element models nowadays allow engineering problems to be simulated far more rigorously, this work focused on analysing a practical case in different soils and with different types of support. The case was studied first with analytical methods based on the classical theories and then with numerical methods (using different analysis programs), and the results obtained with the different methods were compared. The structures were initially pre-dimensioned using the classical methods: the Terzaghi & Peck earth pressure diagrams for the active zone (the excavated side), Rankine's theory for the earth pressures on the embedded part of the wall (a diaphragm wall), and the software Ftool to obtain the design parameters of the earth-retaining structures under study. The automatic analysis program CYPE 2015 k and the finite element program PLAXIS Introductory 2010 were then used; these programs can simulate the staged construction of the wall. To study the influence of several parameters on the behaviour of the wall, two distinct soils were considered, a soft clayey soil and a dense sandy soil, as well as two distinct types of support, active anchors and passive struts. Several parameters of the retaining structure were analysed: horizontal earth pressures, horizontal displacements, axial force, shear force and bending moment.
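As a small illustration of the classical tools named in the abstract, the sketch below computes Rankine's active earth pressure coefficient and the rectangular Terzaghi & Peck apparent pressure ordinate for a braced cut in sand; the input values are illustrative only.

```python
import math

def rankine_Ka(phi_deg):
    """Rankine active earth pressure coefficient: Ka = tan^2(45 - phi/2)."""
    return math.tan(math.radians(45.0 - phi_deg / 2.0)) ** 2

def terzaghi_peck_sand(phi_deg, gamma, H):
    """Terzaghi & Peck apparent (rectangular) pressure ordinate for a braced
    cut in sand: sigma = 0.65 * Ka * gamma * H (kPa, for gamma in kN/m^3
    and excavation depth H in m)."""
    return 0.65 * rankine_Ka(phi_deg) * gamma * H

# Example: dense sand, phi = 35 deg, gamma = 19 kN/m^3, 8 m deep excavation
print(f"Ka = {rankine_Ka(35):.3f}")
print(f"apparent pressure = {terzaghi_peck_sand(35, 19.0, 8.0):.1f} kPa")
```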

Relevance:

80.00%

Publisher:

Abstract:

Information on climate variations is essential for research on many subjects, such as the performance of buildings and agricultural production. However, recorded meteorological data are often incomplete: the number of recorded locations may be limited, and the number of recorded climatic variables and the time intervals can also be inadequate. Therefore, the hourly data for key weather parameters required by many building simulation programmes are typically not readily available. To overcome this gap in measured information, several empirical methods and weather data generators have been developed. They generally employ statistical analysis techniques to model the variations of individual climatic variables, while the possible interactions between different weather parameters are largely ignored. Based on a statistical analysis of 10 years of historical hourly climatic data for all capital cities in Australia, this paper reports strong correlations between several specific weather variables. It is found that there are strong linear correlations between the hourly variations of global solar irradiation (GSI) and dry bulb temperature (DBT), and between the hourly variations of DBT and relative humidity (RH). With an increase in GSI, DBT generally increases, while RH tends to decrease. However, no such clear correlation is found between DBT and atmospheric pressure (P), or between DBT and wind speed. These findings will be useful for research and practice in building performance simulation.
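The correlation analysis can be reproduced in outline as a first-difference Pearson correlation between two hourly series. The sketch below uses synthetic one-day series as stand-ins for the GSI and DBT records; it demonstrates the computation, not the paper's data.

```python
import numpy as np

def hourly_variation_correlation(a, b):
    """Pearson correlation between the hour-to-hour variations (first
    differences) of two hourly weather series, e.g. GSI vs DBT."""
    da, db = np.diff(a), np.diff(b)
    return np.corrcoef(da, db)[0, 1]

# Hypothetical single day of hourly data: solar irradiation and temperature
hours = np.arange(24)
gsi = np.clip(np.sin((hours - 6) / 12 * np.pi), 0, None) * 900   # W/m^2
dbt = 18 + 8 * np.clip(np.sin((hours - 8) / 12 * np.pi), -0.3, None)  # deg C

print(f"r(dGSI, dDBT) = {hourly_variation_correlation(gsi, dbt):.2f}")
```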

Relevance:

80.00%

Publisher:

Abstract:

Experiments were undertaken to study the drying kinetics of moist cylindrical food particulates during fluidised bed drying. Cylindrical particles were prepared from green beans with three different length:diameter ratios, 3:1, 2:1 and 1:1. A batch fluidised bed dryer connected to a heat pump system was used for the experimentation; the heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated with non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. Numerous mathematical models can be used to calculate drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression from experimental data. Empirical models are commonly used for various food materials due to their simpler approach; however, accuracy problems limit their application. Some limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation. One such method is the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at the three drying temperatures.
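The modified quasi-stationary formulation is not given in the abstract, so the sketch below fits the simpler Newton/Lewis thin-layer model MR = exp(-k*t) to hypothetical moisture-ratio data as a stand-in, to show how a drying constant is regressed from non-dimensional moisture measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def lewis_model(t, k):
    """Newton/Lewis thin-layer drying model: MR = exp(-k*t)."""
    return np.exp(-k * t)

# Hypothetical non-dimensional moisture ratios at 50 C (t in minutes)
t = np.array([0., 10., 20., 40., 60., 90., 120.])
MR = np.array([1.00, 0.78, 0.61, 0.38, 0.24, 0.12, 0.06])

(k,), _ = curve_fit(lewis_model, t, MR, p0=[0.01])
print(f"drying constant k ~ {k:.4f} 1/min")
```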

Relevance:

80.00%

Publisher:

Abstract:

Objective: To summarise the extent to which narrative text fields in administrative health data are used to gather information about the event resulting in presentation to a health care provider for treatment of an injury, and to highlight best-practice approaches to interrogating narrative text for injury surveillance purposes. Design: Systematic review. Data sources: Electronic databases searched included CINAHL, Google Scholar, Medline, Proquest, PubMed and PubMed Central. Snowballing strategies were employed by searching the bibliographies of retrieved references to identify relevant associated articles. Selection criteria: Papers were selected if the study used a health-related database and if the study objectives were to (a) use text fields to identify injury cases or to extract additional information on injury circumstances not available from coded data, (b) use text fields to assess the accuracy of coded data fields for injury-related cases, or (c) describe methods or approaches for extracting injury information from text fields. Methods: The papers identified through the search were independently screened by two authors for inclusion, resulting in 41 papers selected for review. Due to heterogeneity between studies, meta-analysis was not performed. Results: The majority of the papers reviewed (28) focused on describing injury epidemiology trends using coded data with text fields as a supplement; these studies demonstrated the value of text data for providing information more specific than the codes, enabling case selection and providing circumstantial information. Caveats were expressed about the consistency and completeness of the recorded text, which lead to underestimates when using these data. Four coding validation papers were reviewed, showing the utility of text data for validating and checking the accuracy of coded data. Seven studies (9 papers) described methods for interrogating injury text fields for systematic extraction of information, using a combination of manual and semi-automated methods to refine and develop algorithms for extracting and classifying coded data from text. Quality assurance approaches to assessing the robustness of the text extraction methods were discussed in only 8 of the epidemiology papers and 1 of the coding validation papers; all of the text interrogation methodology papers described systematic approaches to ensuring the quality of the approach. Conclusions: Manual review and coding approaches, text search methods, and statistical tools have been used to extract data from narrative text and translate it into usable, detailed injury event information. These techniques can be, and have been, applied to administrative datasets to identify specific injury types and to add value to previously coded injury datasets. Only a few studies thoroughly described their text mining methods, and fewer than half of the reviewed studies used or described quality assurance methods for ensuring the robustness of the approach. New techniques using semi-automated computerised approaches and Bayesian or clustering statistical methods offer the potential to further develop and standardise the analysis of narrative text for injury surveillance.
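A minimal example of the keyword-based case identification the reviewed studies describe: a regular expression flags fall-related records in a free-text narrative field. The pattern and the records are hypothetical; production systems combine such rules with manual review and statistical classifiers, as the review notes.

```python
import re

# Hypothetical keyword pattern for identifying fall-related injury cases
# in a free-text narrative field (terms are illustrative only)
FALL_PATTERN = re.compile(r"\b(fell|fall|slipped|tripped)\b", re.IGNORECASE)

def flag_fall_cases(records):
    """Return the records whose narrative text matches the fall pattern.
    Each record is a dict with a 'narrative' free-text field."""
    return [r for r in records if FALL_PATTERN.search(r.get("narrative", ""))]

records = [
    {"id": 1, "narrative": "Pt slipped on wet floor, injured left wrist"},
    {"id": 2, "narrative": "Burn to right hand from stove"},
    {"id": 3, "narrative": "FELL from ladder while cleaning gutters"},
]
print([r["id"] for r in flag_fall_cases(records)])  # -> [1, 3]
```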

Relevance:

80.00%

Publisher:

Abstract:

Experiments were undertaken to study the drying kinetics of differently shaped moist food particulates during heat pump assisted fluidised bed drying. Three geometrical shapes, parallelepipeds, cylinders and spheres, were prepared from potatoes (aspect ratio = 1:1, 2:1, 3:1), cut beans (length:diameter = 1:1, 2:1, 3:1) and peas, respectively. A batch fluidised bed dryer connected to a heat pump system was used for the experimentation; the heat pump and fluid bed combination was used to increase overall energy efficiency and achieve higher drying rates. Drying kinetics were evaluated with non-dimensional moisture at three drying temperatures of 30, 40 and 50 °C. Due to the complex hydrodynamics of fluidised beds, drying kinetics are dryer- or material-specific. Numerous mathematical models can be used to calculate drying kinetics, ranging from analytical models with simplified assumptions to empirical models built by regression from experimental data. Empirical models are commonly used for various food materials due to their simpler approach; however, accuracy problems limit their application. Some limitations of empirical models can be reduced by using semi-empirical models based on the heat and mass transfer of the drying operation. One such method is the quasi-stationary approach. In this study, a modified quasi-stationary approach was used to model the drying kinetics of the cylindrical food particles at the three drying temperatures.

Relevance:

80.00%

Publisher:

Abstract:

Experimentally observed optical and photoelectrical spectra of nitrogen-contaminated (unintentionally doped) nano-crystalline CVD diamond films are simulated using the semi-empirical adiabatic General Skettrup Model (GSM), which presumes dominant contributions of defect states from sp3-coordinated intra-granular carbon atoms to the intra-band single-electron spectrum N(E) of the material. This picture disagrees with the common viewpoint that the N(E) spectrum of the gap states in diamond powders and polycrystalline CVD films mainly originates from π and π* bonds of sp2-coordinated carbon atoms, which are distributed nearly uniformly over the outer surfaces and/or interfaces of the diamond grains. The GSM also predicts a strong effect of granular morphology on the density of intra-band defect states in polycrystalline diamonds.

Relevance:

80.00%

Publisher:

Abstract:

The transformation of ethylene oxide (EO), propylene oxide (PO) and 1-butylene oxide (1-BuO) by human glutathione transferase theta (hGSTT1-1) was studied comparatively using 'conjugator' (GSTT1+ individuals) erythrocyte lysates. The relative sequence of velocity of enzymic transformation was PO > EO >> 1-BuO. The faster transformation of PO compared with EO was corroborated in studies with human and rat GSTT1-1 (hGSTT1-1 and rGSTT1-1, respectively) expressed by Salmonella typhimurium TA1535. This sequence of reactivities of homologous epoxides towards GSTT1-1 contrasts with the sequence observed for homologous alkyl halides (methyl bromide, MeBr; ethyl bromide, EtBr; n-propyl bromide, PrBr), where the relative sequence MeBr >> EtBr > PrBr is observed. The higher reactivity of propylene oxide towards GSTT1-1 compared with ethylene oxide is consistent with its higher chemical reactivity. This is corroborated by experimental data on the acid-catalysed hydrolysis of a number of aliphatic epoxides, including ethylene oxide and propylene oxide, and is consistent with semi-empirical molecular orbital modelling.

Relevance:

80.00%

Publisher:

Abstract:

Three new procedures are proposed in the context of estimating virial coefficients and summing the partial virial series for hard discs and hard spheres. They are based on the parametrised Euler transformation, a novel resummation identity, and the ε-convergence method, respectively. A comparison with other estimates (molecular dynamics, graph theory and empirical methods) reveals satisfactory agreement.
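The parametrisation used by the authors is not given in the abstract; as a baseline illustration of this family of series-acceleration methods, the sketch below applies the plain (unparametrised) Euler transformation to an alternating series, using exact rational arithmetic.

```python
from fractions import Fraction

def euler_transform_sum(a, terms):
    """Sum the alternating series  S = sum_{n>=0} (-1)**n * a(n)  via the
    plain Euler transformation:
        S = sum_{k>=0} (-1)**k * Delta^k a_0 / 2**(k+1),
    where Delta is the forward difference, Delta a_n = a_{n+1} - a_n."""
    row = [Fraction(a(n)) for n in range(terms)]
    s = Fraction(0)
    for k in range(terms):
        s += (-1) ** k * row[0] / Fraction(2 ** (k + 1))
        row = [row[i + 1] - row[i] for i in range(len(row) - 1)]
    return s

# Example: ln 2 = 1 - 1/2 + 1/3 - ... ; 12 transformed terms suffice
approx = euler_transform_sum(lambda n: Fraction(1, n + 1), 12)
print(float(approx))  # ~0.69313, vs ln 2 = 0.693147...
```

Because each transformed term is roughly halved, the transform converges geometrically even when the original series converges only logarithmically, which is the appeal of such resummation for slowly convergent virial series.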