978 results for "Log-normal degree distribution"
Abstract:
This thesis concerns the use of leader attachment models (Leader Progression Model, LPM) for estimating the striking distance, which, together with lightning-density data collected by the SIPAM lightning detection system, is used to estimate the flashover rate of electricity transmission lines in the Amazon. The leader progression model developed in this thesis, named ModSalto, estimates the striking distance by: 1) integrating the linear electric-charge density of the leader, proportional to the prospective first-stroke current Ip, to determine the electric field produced by the descending leader at the structure under study (lightning rods, edges, conductors, etc.); 2) integrating the image distribution of the descending leader, using the power-of-points (point-discharge) effect as a stimulus and field-intensification factor for streamers at the sharp parts of the structures under study, to time the moment of the attachment process. The trigger for the descending leader is hypothesized to be the random ejection of packets of electric charge in turbulent domains inside the clouds, which receive energy through the cascade processes of the general turbulence; and the behaviour of the descending leader should obey the Lorentz force equation in the space of crossed fields (the electric field due to the clouds above and the Earth's magnetic field), which forces the leader charges to develop cycloidal motions that may explain the tortuous nature of the descending leader's path. In order to establish a consistent lightning-density dataset, the data collected by the SIPAM LLS from October 2006 to July 2008 in the Amazon region, about 3 million events, are reanalysed and compared with data from instrumented towers to demonstrate their quality and usability.

NASA SRTM terrain-elevation data are used to derive a formula for the attraction radius (Ra) of structures liable to be struck by lightning, and to derive attraction-area formulas used to quantify the number of strikes likely to hit a given structure, based on the ground flash density (flashes/km²/year) in the study area.
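The strike-count estimate described above reduces to multiplying the ground flash density by an attraction area built from the attraction radius. A minimal sketch, assuming a hypothetical power-law Ra = a·h^b whose coefficients are illustrative placeholders, not the SRTM-fitted formula developed in the thesis:

```python
import math

def attraction_radius(h, a=14.0, b=0.6):
    """Hypothetical power-law attraction radius Ra = a * h**b (metres).

    The coefficients are illustrative placeholders, not the SRTM-derived
    formula developed in the thesis."""
    return a * h ** b

def expected_strikes(height_m, length_m, ng_per_km2_yr):
    """Expected strikes/year on a horizontal line of given height and length.

    Attraction area = strip of width 2*Ra along the line plus two half-disc
    ends, converted from m^2 to km^2 before multiplying by the density."""
    ra = attraction_radius(height_m)
    area_m2 = 2.0 * ra * length_m + math.pi * ra ** 2
    return ng_per_km2_yr * area_m2 / 1e6

# 30 m line height, 10 km span, 8 flashes/km^2/year (Amazonian order of magnitude)
n = expected_strikes(30.0, 10_000.0, 8.0)
```

With these illustrative numbers the line collects on the order of tens of strikes per year, which is why the flashover rate is so sensitive to the attraction-radius formula chosen.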
Abstract:
In this study, the Normal, Log-Normal, Exponential, Gamma and Weibull theoretical probability distributions were analysed to estimate annual maximum and minimum discharges of the Paraguai River, using data from a stream-gauging station located in the city of Cáceres, Mato Grosso (MT). The Kolmogorov-Smirnov and chi-square tests were used to check the fit of the estimated probabilities to the observed frequencies. The series comprises annual maximum and minimum discharges from 1966 to 2003, excluding data with gaps. According to the Kolmogorov-Smirnov test, the best fits to the annual maximum and minimum discharges were given by the Gamma and Weibull distributions, respectively. Keywords: streamflow forecasting, goodness-of-fit test.
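The selection procedure described above (fit each candidate distribution, then rank by the Kolmogorov-Smirnov statistic) can be sketched with SciPy. The gauge series below is synthetic, since the Cáceres data are not reproduced in the abstract:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Synthetic stand-in for the 1966-2003 annual maximum discharges (m^3/s);
# the actual Cáceres gauge series is not reproduced in the abstract.
flows = rng.gamma(shape=8.0, scale=250.0, size=38)

candidates = {
    "normal":      stats.norm,
    "log-normal":  stats.lognorm,
    "exponential": stats.expon,
    "gamma":       stats.gamma,
    "weibull":     stats.weibull_min,
}
results = {}
for name, dist in candidates.items():
    params = dist.fit(flows)                     # maximum-likelihood fit
    # One-sample KS statistic against the fitted CDF.  Note: fitting and
    # testing on the same sample biases the KS p-value (Lilliefors effect),
    # so the statistic is used here only to rank the candidates.
    results[name] = stats.kstest(flows, dist.cdf, args=params).statistic

best = min(results, key=results.get)             # smallest KS distance wins
```

Because the parameters are estimated from the same sample being tested, the tabulated KS critical values are optimistic; the ranking is still a reasonable way to shortlist distributions, as done in the paper.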
Abstract:
In this paper, we propose a random intercept Poisson model in which the random effect is assumed to follow a generalized log-gamma (GLG) distribution. This random effect accommodates (or captures) the overdispersion in the counts and induces within-cluster correlation. We derive the first two moments for the marginal distribution as well as the intraclass correlation. Even though numerical integration methods are, in general, required for deriving the marginal models, we obtain the multivariate negative binomial model from a particular parameter setting of the hierarchical model. An iterative process is derived for obtaining the maximum likelihood estimates for the parameters in the multivariate negative binomial model. Residual analysis is proposed and two applications with real data are given for illustration.
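The special case the paper recovers, a gamma (rather than general GLG) random intercept yielding a negative binomial marginal with within-cluster correlation, is easy to verify by simulation; all parameter values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(11)
# Random-intercept Poisson: all counts in a cluster share one multiplicative
# random effect u.  A gamma-distributed u (one limiting case of the GLG
# family) makes the marginal negative binomial; values are illustrative.
n_clusters, cluster_size = 5_000, 4
base_rate, k = 3.0, 2.0                      # Poisson rate; gamma shape

u = rng.gamma(k, 1.0 / k, n_clusters)        # random effect with mean 1
counts = rng.poisson(base_rate * u[:, None], (n_clusters, cluster_size))

# Overdispersion: marginal variance mu + mu^2/k exceeds the Poisson mean mu
mu_hat, var_hat = counts.mean(), counts.var()

# Intraclass correlation induced by the shared random effect,
# theoretically (mu^2/k) / (mu + mu^2/k) = 0.6 for these values
icc = np.corrcoef(counts[:, 0], counts[:, 1])[0, 1]
```

The simulated mean stays near the Poisson rate while the variance roughly matches mu + mu²/k, and the within-cluster correlation is far from zero, which is exactly the behaviour the hierarchical model is designed to capture.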
Abstract:
For virtually all hospitals, utilization rates are a critical managerial indicator of efficiency and are determined in part by turnover time. Turnover time is defined as the time elapsed between surgeries, during which the operating room is cleaned and prepared for the next surgery. Lengthier turnover times result in lower utilization rates, thereby hindering hospitals' ability to maximize the number of patients that can be attended to. In this thesis, we analyze operating room data from a two-year period provided by Evangelical Community Hospital in Lewisburg, Pennsylvania, to understand the variability of the turnover process. From the recorded data provided, we derive our best estimate of turnover time. Recognizing the importance of being able to properly model turnover times in order to improve the accuracy of scheduling, we seek to fit distributions to the set of turnover times. We find that log-normal and log-logistic distributions are well suited to turnover times, although further research must validate this finding. We propose that the choice of distribution depends on the hospital and, as a result, a hospital must choose whether to use the log-normal or the log-logistic distribution. Next, we use statistical tests to identify variables that may potentially influence turnover time. We find that there does not appear to be a correlation between surgery time and turnover time across doctors. However, there are statistically significant differences between the mean turnover times across doctors. The final component of our research entails analyzing and explaining the benefits of introducing control charts as a quality-control mechanism for monitoring turnover times in hospitals. Although widely instituted in other industries, control charts are not widely adopted in healthcare environments, despite their potential benefits. A major component of our work is the development of control charts to monitor the stability of turnover times.
These charts can be easily instituted in hospitals to reduce the variability of turnover times. Overall, our analysis uses operations research techniques to analyze turnover times and identify manners for improvement in lowering the mean turnover time and the variability in turnover times. We provide valuable insight into a component of the surgery process that has received little attention but can significantly affect utilization rates in hospitals. Most critically, an ability to more accurately predict turnover times and a better understanding of the sources of variability can result in improved scheduling and heightened hospital staff and patient satisfaction. We hope that our findings can apply to many other hospital settings.
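The two ingredients of the analysis above, a parametric fit to turnover times and a control chart to monitor their stability, can be sketched as follows. The data are synthetic (the hospital's records are not public), and the individuals chart is computed on the log scale, where log-normally distributed times are approximately normal:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic turnover times in minutes; the hospital's records are confidential.
turnover = rng.lognormal(mean=3.3, sigma=0.35, size=200)

# Log-normal fit (loc pinned to 0 so shape/scale stay interpretable)
shape, loc, scale = stats.lognorm.fit(turnover, floc=0)

# Individuals (X) control chart on the log scale, where the data are ~normal
logs = np.log(turnover)
center = logs.mean()
mr_bar = np.abs(np.diff(logs)).mean()        # average moving range
sigma_hat = mr_bar / 1.128                   # d2 constant for subgroups of 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
out_of_control = int(np.sum((logs > ucl) | (logs < lcl)))
```

Charting on the log scale is one simple way to reconcile the skewed log-normal/log-logistic fits with the normality assumption behind standard Shewhart limits; a hospital could equally chart the raw times with probability-based limits.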
Abstract:
A variety of occupational hazards are indigenous to academic and research institutions, ranging from traditional life safety concerns, such as fire safety and fall protection, to specialized occupational hygiene issues such as exposure to carcinogenic chemicals, radiation sources, and infectious microorganisms. Institutional health and safety programs are constantly challenged to establish and maintain adequate protective measures for this wide array of hazards. A unique subset of academic and research institutions are classified as historically Black universities, which provide educational opportunities primarily to minority populations. State-funded minority schools receive fewer resources than their non-minority counterparts, resulting in a reduced ability to provide certain programs and services. Comprehensive health and safety services for these institutions may be one of the services compromised, resulting in uncontrolled exposures to various workplace hazards. Such a result would also be contrary to the national health status objectives to improve preventive health care measures for minority populations. To determine if differences exist, a cross-sectional survey was performed to evaluate the relative status of health and safety programs present within minority and non-minority state-funded academic and research institutions. Data were obtained from direct mail questionnaires, supplemented by data from publicly available sources. Parameters for comparison included reported numbers of full and part-time health and safety staff, reported OSHA 200 log (or equivalent) values, and reported workers compensation experience modifiers. The relative impact of institutional minority status, institution size, and OSHA regulatory environment was also assessed.
Additional health and safety program descriptors were solicited in an attempt to develop a preliminary profile of the hazards present in this unique work setting. Survey forms were distributed to 24 minority and 51 non-minority institutions. A total of 72% of the questionnaires were returned, with 58% of the minority and 78% of the non-minority institutions participating. The mean number of reported full-time health and safety staff for the responding minority institutions was determined to be 1.14, compared to 3.12 for the responding non-minority institutions. Data distribution variances were stabilized using log-normal transformations, and although subsequent analysis indicated statistically significant differences, the differences were found to be predicted by institution size only, and not by minority status or OSHA regulatory environment. Similar results were noted for estimated full-time-equivalent health and safety staffing levels. Significant differences were not noted between reported OSHA 200 log (or equivalent) data, and a lack of information provided on workers compensation experience modifiers prevented comparisons of insurance premium expenditures. Other health and safety program descriptive information obtained served to validate the study's presupposition that the inclusion criteria would encompass those organizations with occupational risks from all four major hazard categories. Worker medical surveillance programs appeared to exist at most institutions, but the specific tests completed were not readily identifiable. The results of this study serve as a preliminary description of the health and safety programs for a unique set of workplaces that have not previously been investigated.
Numerous opportunities for further research are noted, including efforts to quantify the relative amount of each hazard present, further definition of the programs reported to be in place, determination of other means to measure health outcomes on campuses, and comparisons among other culturally diverse workplaces.
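The central statistical move above, a log transformation to stabilise variance followed by association tests for size versus minority status, can be illustrated with a toy dataset. All numbers here are made up; the survey responses are not public:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic survey: staffing grows with institution size (in 1000s of
# students) but not with minority status -- mirroring the study's finding.
# Every value below is invented for illustration only.
n = 75
size = rng.uniform(1.0, 30.0, n)
minority = rng.integers(0, 2, n).astype(float)
staff = np.exp(0.2 + 0.08 * size + rng.normal(0.0, 0.4, n))

# Log transformation stabilises the variance of the skewed staffing counts
y = np.log(staff)

r_size, p_size = stats.pearsonr(size, y)     # size predicts staffing
r_min, p_min = stats.pearsonr(minority, y)   # minority status does not
```

On the raw scale the staffing counts are right-skewed with variance growing with the mean; after the log transformation, ordinary correlation and regression tests behave as intended, which is why the study transformed before testing.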
Abstract:
We measured condensation particle (CP) concentrations and particle size distributions at the coastal Antarctic station Neumayer (70°39'S, 8°15'W) during two summer campaigns (20 January to 26 March 2012 and 1 February to 30 April 2014) and during the polar night between 12 August and 27 September 2014, in the particle diameter (Dp) range from 2.94 nm to 60.4 nm (2012) and from 6.26 nm to 212.9 nm (2014). During the two summer campaigns we identified a total of 44 new particle formation (NPF) events. For 10 NPF events, particle growth rates could be determined, at around 0.90±0.46 nm/h (mean ± std; range: 0.4 nm/h to 1.9 nm/h). With the exception of one case, particle growth was generally restricted to the nucleation mode (Dp < 25 nm), and the duration of NPF events was typically around 6.0±1.5 h (mean ± std; range: 4 h to 9 h). For the most part, therefore, particles did not grow to the sizes required to act as cloud condensation nuclei. NPF during summer usually occurred in the afternoon, consistent with local photochemistry. During winter, two NPF events were detected, though with no ascertainable particle growth. A simple estimate indicated that, apart from sulfuric acid, other low-volatility precursor vapours were required to explain the derived growth rates.
Abstract:
This final-year project describes and analyses a comprehensive study of the vibrations produced by surface blasting carried out in the "Third Set of Locks" project executed for the expansion of the Panama Canal. A total of 53 records were collected, generated by 7 seismographs monitoring 10 production blasts carried out in 2010. The vibratory phenomenon has two fundamental parameters, the peak particle velocity (PPV) and the dominant frequency, which characterise how damaging it can be to civil structures; the aim is therefore to characterise and, above all, predict them, which allows their proper control. Accordingly, the study consists of two parts: the first describes the behaviour of the ground by estimating the attenuation law for the peak particle velocity using ordinary least-squares regression; the second details a verifiable procedure for predicting the dominant frequency and the pseudo-velocity response spectrum (PVRS) based on the theory of Newmark & Hall. The following were obtained: (i) the ground attenuation law for different confidence levels, (ii) blast design tools based on the charge-distance relationship, (iii) the demonstration that the PPV values fit a log-normal distribution, (iv) a map of PPV isolines for the study area, (v) a detailed, validated technique for predicting the dominant frequency and the response spectrum, (vi) mathematical formulations of the amplification factors for displacement, velocity and acceleration, and (vii) a map of amplification isolines for the study area. The results provide useful information for use in the design and control of subsequent blasting in the project.
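The attenuation law is a power law in the scaled distance, so ordinary least squares applies after a log-log transform. A sketch with synthetic records; the project's 53 field measurements are not reproduced, and the coefficients K and alpha are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic blast records: charge per delay Q (kg), distance D (m), PPV (mm/s).
# K and alpha below are invented, not the project's fitted values.
n = 53
Q = rng.uniform(50.0, 500.0, n)
D = rng.uniform(100.0, 1500.0, n)
sd = D / np.sqrt(Q)                          # square-root scaled distance
true_K, true_alpha = 700.0, 1.6
ppv = true_K * sd ** -true_alpha * rng.lognormal(0.0, 0.3, n)

# PPV = K * SD^-alpha is linear in log-log space:
#   log PPV = log K - alpha * log SD   ->  ordinary least squares
A = np.column_stack([np.ones(n), np.log(sd)])
coef, *_ = np.linalg.lstsq(A, np.log(ppv), rcond=None)
K_hat, alpha_hat = float(np.exp(coef[0])), float(-coef[1])
```

Because the residuals in log space are normal when PPV is log-normal (result iii of the study), shifting the fitted intercept upward by a multiple of the residual standard deviation yields the attenuation law at a chosen reliability level, as in result (i).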
Abstract:
The main objective of this research is to design a Model for Operational Risk Management (MORM) according to the guidelines of the Basel II and III Accords of the Basel Committee on Banking Supervision of the Bank for International Settlements (BCBS-BIS). A study of this subject is considered important because operational risks (OpR) are largely responsible for the recent world financial crises and because of the difficulty of detecting them in organisations. A management model has been designed that is divided into two lines of influence. The first adopts the holistic paradigm, in which there are considered to be multiple ways of perceiving a cyclical process, together with the tools to observe, know and understand the object or subject perceived. The second line is the totalising paradigm, in which both qualitative and quantitative data are obtained, the two being complementary. In addition, this work presents the design of a qualitative OpR software tool, designed to determine the root of risks in organisations and their Operational Value at Risk (OpVaR) based on the basic indicator approach.

Applying the holistic cycle to the case study, the following research design was obtained: non-experimental, univariate, descriptive cross-sectional, contemporary, retrospective, mixed-source, qualitative (phenomenological and ethnographic) and quantitative (descriptive and analytical). Decision making and data collection were conducted in two phases in the study unit. The first phase considered the whole of the Corpoelec-EDELCA company, with a statistical universe of 4271 people, a population of 2390 people and a sampling unit of 87 people. The process was repeated in a second phase for the Simón Bolívar hydroelectric power plant, for which a second statistical universe of 300 workers, a population of 191 people and a sample of 58 professionals were determined. Primary and secondary sources were used to gather information. For the primary information, direct observations were made, two surveys were designed to identify the areas and processes with the highest risk levels, and a questionnaire combined with another (ad hoc) survey was designed to establish estimates of the frequency and severity of operational losses. The secondary information was extracted from the databases of Corpoelec-EDELCA, the IEA, the World Bank, the BCBS-BIS, the UPM and UC Berkeley, among others. The operational loss frequency and severity distributions were established as the independent variables and OpVaR as the dependent variable. No monitoring or control of the variables under analysis was carried out, since they were considered for a specific instant and were determined only to establish the existence and point valuation of the OpR in the study unit.

The qualitative analysis proposed in the MORM revealed that, in the research unit, 67% of the detected OpR come from two main sources: processes (32%) and external events (35%). Additionally, the validation of the MORM at Corpoelec-EDELCA showed that 63% of the OpR in the organisation come from three main categories, with external fraud being the most regular source and that with the greatest loss severity. Risk exposure was determined by adapting the concept of OpVaR, generally used for time series, which in this case study is novel in being applied to qualitative data transformed with a Likert scale. The possibility of using probability distributions typical of quantitative data for loss frequency and loss severity distributions built from qualitative data was analysed. For 64% of the OpR studied, the frequency behaved like a Poisson probability distribution, and in 55% of the cases the log-normal was the most common distribution for loss severity, from which it was concluded that the approaches suggested by the BCBS-BIS for time series are applicable to qualitative data. Once the loss frequency and severity distributions had been obtained, they were convolved using the Monte Carlo method, yielding the loss distribution approach (LDA) for each OpR. The OpVaR was derived, as the BCBS-BIS suggests, from the 99.9th or 99th percentile of each LDA, showing that the OpR behave much as in the financial system, the most dangerous being those of low frequency and high impact, because of the difficulty of detecting and monitoring them. Finally, it is considered that the MORM will allow market players and their stakeholders to know the status of their entities effectively, reliably and efficiently, which will reduce the uncertainty of their investments and enable them to establish a new management culture in their organisations.
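The Monte Carlo convolution of a Poisson frequency with a log-normal severity, and the OpVaR read-off at the 99.9th percentile, can be sketched as follows; the parameters λ, μ and σ are illustrative, not the fitted Corpoelec-EDELCA values:

```python
import numpy as np

rng = np.random.default_rng(1)
# Loss distribution approach (LDA): convolve a Poisson frequency with a
# log-normal severity by Monte Carlo, then read OpVaR off a high quantile.
# lam, mu and sigma are illustrative, not fitted values from the study.
lam, mu, sigma = 4.0, 9.0, 1.2               # events/year; log-severity params
n_sims = 100_000

counts = rng.poisson(lam, n_sims)            # number of losses in each year
annual_loss = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])

expected_loss = annual_loss.mean()
op_var_999 = np.quantile(annual_loss, 0.999) # OpVaR at the 99.9th percentile
```

The heavy right tail of the log-normal severity makes the 99.9th percentile many times the expected loss, which is the "low frequency, high impact" behaviour the study flags as most dangerous.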
Abstract:
En la presente Tesis se ha llevado a cabo el contraste y desarrollo de metodologías que permitan mejorar el cálculo de las avenidas de proyecto y extrema empleadas en el cálculo de la seguridad hidrológica de las presas. En primer lugar se ha abordado el tema del cálculo de las leyes de frecuencia de caudales máximos y su extrapolación a altos periodos de retorno. Esta cuestión es de gran relevancia, ya que la adopción de estándares de seguridad hidrológica para las presas cada vez más exigentes, implica la utilización de periodos de retorno de diseño muy elevados cuya estimación conlleva una gran incertidumbre. Es importante, en consecuencia incorporar al cálculo de los caudales de diseño todas la técnicas disponibles para reducir dicha incertidumbre. Asimismo, es importante hacer una buena selección del modelo estadístico (función de distribución y procedimiento de ajuste) de tal forma que se garantice tanto su capacidad para describir el comportamiento de la muestra, como para predecir de manera robusta los cuantiles de alto periodo de retorno. De esta forma, se han realizado estudios a escala nacional con el objetivo de determinar el esquema de regionalización que ofrece mejores resultados para las características hidrológicas de las cuencas españolas, respecto a los caudales máximos anuales, teniendo en cuenta el numero de datos disponibles. La metodología utilizada parte de la identificación de regiones homogéneas, cuyos límites se han determinado teniendo en cuenta las características fisiográficas y climáticas de las cuencas, y la variabilidad de sus estadísticos, comprobando posteriormente su homogeneidad. A continuación, se ha seleccionado el modelo estadístico de caudales máximos anuales con un mejor comportamiento en las distintas zonas de la España peninsular, tanto para describir los datos de la muestra como para extrapolar a los periodos de retorno más altos. 
El proceso de selección se ha basado, entre otras cosas, en la generación sintética de series de datos mediante simulaciones de Monte Carlo, y el análisis estadístico del conjunto de resultados obtenido a partir del ajuste de funciones de distribución a estas series bajo distintas hipótesis. Posteriormente, se ha abordado el tema de la relación caudal-volumen y la definición de los hidrogramas de diseño en base a la misma, cuestión que puede ser de gran importancia en el caso de presas con grandes volúmenes de embalse. Sin embargo, los procedimientos de cálculo hidrológico aplicados habitualmente no tienen en cuenta la dependencia estadística entre ambas variables. En esta Tesis se ha desarrollado un procedimiento para caracterizar dicha dependencia estadística de una manera sencilla y robusta, representando la función de distribución conjunta del caudal punta y el volumen en base a la función de distribución marginal del caudal punta y la función de distribución condicionada del volumen respecto al caudal. Esta última se determina mediante una función de distribución log-normal, aplicando un procedimiento de ajuste regional. Se propone su aplicación práctica a través de un procedimiento de cálculo probabilístico basado en la generación estocástica de un número elevado de hidrogramas. La aplicación a la seguridad hidrológica de las presas de este procedimiento requiere interpretar correctamente el concepto de periodo de retorno aplicado a variables hidrológicas bivariadas. Para ello, se realiza una propuesta de interpretación de dicho concepto. El periodo de retorno se entiende como el inverso de la probabilidad de superar un determinado nivel de embalse. Al relacionar este periodo de retorno con las variables hidrológicas, el hidrograma de diseño de la presa deja de ser un único hidrograma para convertirse en una familia de hidrogramas que generan un mismo nivel máximo en el embalse, representados mediante una curva en el plano caudal volumen. 
Esta familia de hidrogramas de diseño depende de la propia presa a diseñar, variando las curvas caudal-volumen en función, por ejemplo, del volumen de embalse o la longitud del aliviadero. El procedimiento propuesto se ilustra mediante su aplicación a dos casos de estudio. Finalmente, se ha abordado el tema del cálculo de las avenidas estacionales, cuestión fundamental a la hora de establecer la explotación de la presa, y que puede serlo también para estudiar la seguridad hidrológica de presas existentes. Sin embargo, el cálculo de estas avenidas es complejo y no está del todo claro hoy en día, y los procedimientos de cálculo habitualmente utilizados pueden presentar ciertos problemas. El cálculo en base al método estadístico de series parciales, o de máximos sobre un umbral, puede ser una alternativa válida que permite resolver esos problemas en aquellos casos en que la generación de las avenidas en las distintas estaciones se deba a un mismo tipo de evento. Se ha realizado un estudio con objeto de verificar si es adecuada en España la hipótesis de homogeneidad estadística de los datos de caudal de avenida correspondientes a distintas estaciones del año. Asimismo, se han analizado los periodos estacionales para los que es más apropiado realizar el estudio, cuestión de gran relevancia para garantizar que los resultados sean correctos, y se ha desarrollado un procedimiento sencillo para determinar el umbral de selección de los datos de tal manera que se garantice su independencia, una de las principales dificultades en la aplicación práctica de la técnica de las series parciales. Por otra parte, la aplicación practica de las leyes de frecuencia estacionales requiere interpretar correctamente el concepto de periodo de retorno para el caso estacional. Se propone un criterio para determinar los periodos de retorno estacionales de forma coherente con el periodo de retorno anual y con una distribución adecuada de la probabilidad entre las distintas estaciones. 
Por último, se expone un procedimiento para el cálculo de los caudales estacionales, ilustrándolo mediante su aplicación a un caso de estudio. The compare and develop of a methodology in order to improve the extreme flow estimation for dam hydrologic security has been developed. First, the work has been focused on the adjustment of maximum peak flows distribution functions from which to extrapolate values for high return periods. This has become a major issue as the adoption of stricter standards on dam hydrologic security involves estimation of high design return periods which entails great uncertainty. Accordingly, it is important to incorporate all available techniques for the estimation of design peak flows in order to reduce this uncertainty. Selection of the statistical model (distribution function and adjustment method) is also important since its ability to describe the sample and to make solid predictions for high return periods quantiles must be guaranteed. In order to provide practical application of previous methodologies, studies have been developed on a national scale with the aim of determining a regionalization scheme which features best results in terms of annual maximum peak flows for hydrologic characteristics of Spanish basins taking into account the length of available data. Applied methodology starts with the delimitation of regions taking into account basin’s physiographic and climatic characteristics and the variability of their statistical properties, and continues with their homogeneity testing. Then, a statistical model for maximum annual peak flows is selected with the best behaviour for the different regions in peninsular Spain in terms of describing sample data and making solid predictions for high return periods. This selection has been based, among others, on synthetic data series generation using Monte Carlo simulations and statistical analysis of results from distribution functions adjustment following different hypothesis. 
Secondly, the work focused on the analysis of the relationship between peak flow and volume, and on how to define design flood hydrographs based on this relationship, which can be highly important for large-volume reservoirs. However, commonly used hydrologic procedures do not take the statistical dependence between these variables into account. A simple and sound method for characterizing this statistical dependence has been developed, based on representing the joint distribution function of maximum peak flow and volume through the marginal distribution function of peak flow and the conditional distribution function of volume for a given peak flow. The latter is determined by a regional fitting procedure for a log-normal distribution function. Practical application takes the form of a probabilistic estimation procedure based on the stochastic generation of a large number of hydrographs. The use of this procedure for dam hydrologic security requires a proper interpretation of the return period concept applied to bivariate hydrologic data. A criterion is proposed in which the return period is understood as the inverse of the probability of exceeding a given reservoir level. When return period and hydrological variables are related in this way, the single design flood hydrograph becomes a family of hydrographs that generate the same maximum reservoir level and are represented by a curve in the two-dimensional peak flow-volume space. This family of design flood hydrographs depends on dam characteristics such as reservoir volume or spillway length. Two case studies illustrate the application of the developed methodology. Finally, the work focused on the calculation of seasonal floods, which are essential when determining reservoir operation and can also be fundamental for analysing the hydrologic security of existing reservoirs. However, seasonal flood calculation is complex and is not yet fully resolved.
The calculation procedures commonly used may present certain problems. The statistical method of partial duration series, or peaks over threshold, can be an alternative approach that solves these problems in cases where the same type of event is responsible for floods in the different seasons. A study has been carried out to verify the hypothesis of statistical homogeneity of peak flows for different seasons in Spain. The most appropriate seasonal periods for the study have also been analysed, which is highly relevant for guaranteeing correct results. In addition, a simple procedure has been defined to determine the data selection threshold in a way that ensures the independence of the data, one of the main difficulties in the practical application of partial series. Moreover, the practical application of seasonal frequency laws requires a correct interpretation of the concept of seasonal return period. A criterion is proposed for determining seasonal return periods coherently with the annual return period and with an adequate distribution of probability among the seasons. Finally, a methodology is proposed for calculating seasonal peak flows, and a case study illustrates its application.
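The stochastic-generation idea behind the bivariate return period can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's implementation: the Gumbel marginal for peak flow, the linear log-log regression inside the conditional log-normal for volume, the parameter values, and the crude reservoir-level proxy are all invented for the example.

```python
import math
import random

# Illustrative parameters only (NOT values from the thesis):
# Gumbel marginal for peak flow Q (m3/s), and a conditional log-normal for
# volume V (hm3) given Q, with ln V ~ Normal(a + b * ln Q, sigma).
MU_Q, BETA_Q = 150.0, 40.0
A, B, SIGMA = 1.2, 0.9, 0.3

def sample_pair(rng):
    """Draw one (peak flow, volume) pair from the joint distribution."""
    q = MU_Q - BETA_Q * math.log(-math.log(rng.random()))  # Gumbel inverse CDF
    v = math.exp(rng.gauss(A + B * math.log(q), SIGMA))    # conditional log-normal
    return q, v

def return_period(pairs, level, threshold):
    """Return period of exceeding `threshold` for the reservoir-level proxy
    level(q, v), as the inverse of the empirical exceedance probability."""
    exceed = sum(1 for q, v in pairs if level(q, v) > threshold)
    return len(pairs) / exceed if exceed else float("inf")

rng = random.Random(42)
pairs = [sample_pair(rng) for _ in range(100_000)]

# Crude stand-in for actual reservoir routing: the maximum reservoir level is
# assumed to rise with both peak flow and volume.
level = lambda q, v: 0.01 * q + 0.05 * v
z100 = sorted(level(q, v) for q, v in pairs)[int(0.99 * len(pairs))]
# All (Q, V) pairs with level(q, v) == z100 form the 100-year family of
# design hydrographs: a curve in the peak flow-volume plane.
```

In practice the level proxy would be replaced by routing each generated hydrograph through the actual reservoir and spillway, which is what makes the resulting family of design hydrographs dam-specific.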
Abstract:
It was shown previously that the diversity indices of IT and of Shannon make it possible to characterize globally, by a single number, one of the fundamental aspects of text structure. However, a more precise knowledge of this structure requires specific abundance distributions and, to represent them, a suitable mathematical model. Among the numerous models that could be proposed, the only ones of real practical interest are the simplest. We limit ourselves here to studying three of them applied to the language L(MT): the log-linear, the log-normal and MacArthur's models, widely used for calculating the diversity of species in ecosystems and used, we believe for the first time, to calculate the diversity of a text written in a given language, in our case L(MT). We show the advantages and drawbacks of each of these model types, the methods for fitting them to text data and, briefly, the tests that allow one to decide whether the fit is acceptable.
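The two ingredients above, a single-number diversity index and a log-normal model of the abundance distribution, can be sketched on a toy word-frequency table. The text, the use of word types as "species", and the moment-based fit are illustrative assumptions, not the paper's corpus or fitting method.

```python
import math
from collections import Counter

# Toy "text" standing in for a corpus; its word types play the role of species.
TEXT = ("the quick brown fox jumps over the lazy dog "
        "the dog barks and the fox runs away the end")
abund = Counter(TEXT.split())        # abundance (frequency) of each word type
xs = list(abund.values())

# Shannon diversity H = -sum p_i ln p_i over relative abundances of word types
total = sum(xs)
H = -sum((x / total) * math.log(x / total) for x in xs)

# Log-normal model fitted to the abundances by moments of the log-abundances
logs = [math.log(x) for x in xs]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / len(logs))
# exp(mu) is the geometric-mean abundance; sigma measures how unevenly
# abundance is spread across word types.
```

The single number H summarizes the whole table, while the fitted pair (mu, sigma) retains the shape of the abundance distribution, which is exactly the extra information the abstract argues a model must supply.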
Abstract:
Methane hydrates are present in marine seep systems and occur within the gas hydrate stability zone. Very little is known about their crystallite sizes and size distributions, because these are notoriously difficult to measure. Crystal size distributions are usually considered one of the key petrophysical parameters, because they influence mechanical properties and the compositional changes that may occur with changing environmental conditions. Variations in grain size are also relevant for gas substitution in natural hydrates, in which CH4 is replaced with CO2 for the purpose of carbon dioxide sequestration. Here we show that crystallite sizes of gas hydrates from some locations in the Indian Ocean, the Gulf of Mexico and the Black Sea are in the range of 200-400 µm; larger values were obtained for more deeply buried samples from ODP Leg 204. The crystallite sizes generally show a log-normal distribution and appear to vary, sometimes rapidly, with location.
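A simple way to see what a log-normal size distribution implies is to look at the logarithms of the sizes: for log-normally distributed crystallites the log-sizes are nearly symmetric (skewness close to zero), while the raw sizes are right-skewed. A minimal sketch on synthetic sizes centred in the reported 200-400 µm range; the generating parameters are assumptions, not the measured data.

```python
import math
import random

rng = random.Random(0)
# Synthetic crystallite sizes (µm), log-normal with assumed median ~280 µm
sizes = [math.exp(rng.gauss(math.log(280.0), 0.18)) for _ in range(2000)]

def skewness(vals):
    """Sample skewness (third standardized moment)."""
    n = len(vals)
    m = sum(vals) / n
    s2 = sum((v - m) ** 2 for v in vals) / n
    s3 = sum((v - m) ** 3 for v in vals) / n
    return s3 / s2 ** 1.5

skew_raw = skewness(sizes)                         # right-skewed raw sizes
skew_log = skewness([math.log(v) for v in sizes])  # near zero if log-normal
```

Applied to measured sizes, a log-skewness far from zero would argue against the log-normal description.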
Abstract:
Deposition of insoluble prion protein (PrP) in the brain in the form of protein aggregates or deposits is characteristic of the 'transmissible spongiform encephalopathies' (TSEs). Understanding the growth and development of these PrP aggregates is important both in attempting to elucidate the pathogenesis of prion disease and in developing treatments designed to prevent or inhibit the spread of prion pathology within the brain. Aggregation and disaggregation of proteins and the diffusion of substances into the developing aggregates (surface diffusion) are important factors in the development of protein aggregates. Mathematical models suggest that if aggregation/disaggregation or surface diffusion is the predominant factor, the size frequency distribution of the resulting protein aggregates in the brain should be described by a power-law or a log-normal model, respectively. This study tested this hypothesis for two different types of PrP deposit, viz. the diffuse and florid-type PrP deposits in patients with variant Creutzfeldt-Jakob disease (vCJD). The size distributions of the florid and diffuse plaques were fitted by a power-law function in 100% and 42% of the brain areas studied, respectively. By contrast, the size distributions of both types of plaque deviated significantly from a log-normal model in all brain areas. Hence, protein aggregation and disaggregation may be the predominant factor in the development of the florid plaques, whereas a more complex combination of factors appears to be involved in the pathogenesis of the diffuse plaques. These results may be useful in the design of treatments to inhibit the development of protein aggregates in vCJD.
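The power-law versus log-normal comparison at the heart of this study can be illustrated with maximum-likelihood fits of both models to size data. The sketch below uses synthetic "plaque sizes" drawn from a known power-law, an assumed lower cut-off, and a likelihood comparison; the study's actual data and fitting procedure may differ.

```python
import math
import random

rng = random.Random(7)
XMIN = 1.0
# Synthetic sizes from a pure power-law p(x) ∝ x^(-2.5) for x >= XMIN,
# via inverse-CDF sampling (these are NOT the study's measurements).
ALPHA_TRUE = 2.5
sizes = [XMIN * (1.0 - rng.random()) ** (-1.0 / (ALPHA_TRUE - 1.0))
         for _ in range(5000)]

def fit_powerlaw(xs, xmin):
    """Hill MLE for the power-law exponent, plus the fitted log-likelihood."""
    n = len(xs)
    s = sum(math.log(x / xmin) for x in xs)
    alpha = 1.0 + n / s
    ll = n * math.log((alpha - 1.0) / xmin) - alpha * s
    return alpha, ll

def fit_lognormal(xs):
    """MLE log-normal fit, plus the fitted log-likelihood."""
    logs = [math.log(x) for x in xs]
    n = len(logs)
    mu = sum(logs) / n
    var = sum((v - mu) ** 2 for v in logs) / n
    sigma = math.sqrt(var)
    ll = sum(-math.log(x * sigma * math.sqrt(2 * math.pi))
             - (math.log(x) - mu) ** 2 / (2 * var) for x in xs)
    return mu, sigma, ll

alpha_hat, ll_pl = fit_powerlaw(sizes, XMIN)
mu_hat, sigma_hat, ll_ln = fit_lognormal(sizes)
# Since the data were generated from a power-law, its likelihood should win;
# on real plaque-size data the comparison decides between the two mechanisms.
```

A higher log-likelihood for the power-law points toward aggregation/disaggregation as the dominant process, and for the log-normal toward surface diffusion, mirroring the interpretation in the abstract.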