11 results for CASE SERIES

at Universidad Politécnica de Madrid


Relevance:

60.00%

Publisher:

Abstract:

Studies of patients with temporal lobe epilepsy provide few descriptions of seizures that arise in the temporopolar and anterior temporobasal brain regions. Based on connectivity, it might be assumed that the semiology of these seizures is similar to that of medial temporal lobe epilepsy. However, accumulating evidence suggests that the anterior temporobasal cortex may play an important role in the language system, which could account for particular features of seizures arising here. We studied the electroclinical features of seizures in patients with circumscribed temporopolar and temporobasal lesions in order to identify specific features that might differentiate them from seizures that originate in other temporal areas. Among 172 patients with temporal lobe seizures registered in our epilepsy unit in the last 15 years, 15 (8.7%) had seizures caused by temporopolar or anterior temporobasal lesions (11 left-sided lesions). The main finding of our study is that patients with left-sided lesions had aphasia as the most prominent feature of their seizures. In addition, while all patients showed normal to high intellectual functioning in standard neuropsychological testing, semantic impairment was found in a subset of 9 patients with left-sided lesions. This case series demonstrates that aphasic seizures without impairment of consciousness can result from small, circumscribed left anterior temporobasal and temporopolar lesions. Thus, the presence of speech manifestations during seizures should prompt detailed assessment of the structural integrity of the basal surface of the temporal lobe in addition to the evaluation of primary language areas.

Relevance:

30.00%

Publisher:

Abstract:

The classical Kramer sampling theorem provides a method for obtaining orthogonal sampling formulas. In particular, when the involved kernel is analytic in the sampling parameter, the theorem can be stated in an abstract setting of reproducing kernel Hilbert spaces of entire functions, which includes the classical Shannon sampling theory as a particular case. This abstract setting allows us to obtain a sort of converse result and to characterize when the sampling formula associated with an analytic Kramer kernel can be expressed as a Lagrange-type interpolation series. On the other hand, the de Branges spaces of entire functions satisfy orthogonal sampling formulas which can be written as Lagrange-type interpolation series. In this work, some links between all these ideas are established.
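For reference, the classical Shannon sampling theory mentioned above as a particular case corresponds to the cardinal series below, written for a signal band-limited to [-W, W]; this is standard background rather than a formula taken from the paper.

```latex
% Shannon cardinal series: the particular case of the Kramer/RKHS setting
% recovered for a function f band-limited to [-W, W].
f(t) \;=\; \sum_{n=-\infty}^{\infty} f\!\left(\frac{n}{2W}\right)
           \frac{\sin \pi (2W t - n)}{\pi (2W t - n)}
```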

Relevance:

30.00%

Publisher:

Abstract:

Crop irrigation is a major consumer of energy. Only a few countries are self-sufficient in conventional non-renewable energy sources. Fortunately, there are renewable ones, such as wind, which has experienced recent developments in the area of power generation. Wind pumps can play a vital role in irrigation projects on remote farms. A methodology based on a daily balance between water needs and water availability was used to evaluate the feasibility of the most economical windmill irrigation system. For this purpose, several factors were included: three-hourly wind velocity (W3h, m/s), the flow supplied by the wind pump as a function of the elevation height (H, m) and the daily greenhouse evapotranspiration as a function of the crop planting date. The monthly volumes of water required for irrigation (Dr, m3/ha) and stored in the water tank (Vd, m3), as well as the monthly irrigable area (Ar, ha), were estimated by cumulative deficit water budgeting taking these factors into account. An example is given illustrating the use of this methodology for a tomato crop (Lycopersicon esculentum Mill.) grown under greenhouse conditions at Ciego de Ávila, Cuba. In this case, two different W3h series (an average and a low wind year), three different H values and five tomato planting dates were considered. The results show that the optimum period for wind-pump-driven irrigation is obtained with crop planting in November; a 5 m3 tank is recommended for cultivated areas of around 0.2 ha when using wind pumps operating at an elevation height of 15 m.
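The cumulative deficit water budgeting described above can be sketched as a simple daily book-keeping loop. The code below only illustrates that balance; the variable names, the full-tank initial condition and the way the pump output is supplied are assumptions for illustration, not the authors' exact formulation (which derives the pump flow from the three-hourly wind series and the head H).

```python
def cumulative_deficit_budget(etc_mm_day, pump_m3_day, area_ha, tank_m3):
    """Daily water balance between crop demand and wind-pump supply.

    etc_mm_day: daily greenhouse crop evapotranspiration (mm/day).
    pump_m3_day: daily volume delivered by the wind pump (m3/day).
    Returns the running tank volume and the accumulated unmet demand (deficit).
    Illustrative sketch only; not the paper's formulation.
    """
    stored = tank_m3          # assume the tank starts full
    deficit = 0.0
    history = []
    for etc, supply in zip(etc_mm_day, pump_m3_day):
        demand = etc * 10.0 * area_ha            # 1 mm over 1 ha = 10 m3
        stored = min(stored + supply, tank_m3)   # pump refills the tank
        served = min(demand, stored)             # irrigate with what is available
        stored -= served
        deficit += demand - served               # cumulative unmet demand
        history.append((stored, deficit))
    return history
```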

Relevance:

30.00%

Publisher:

Abstract:

We propose a method to measure the irreversibility of real-valued time series which combines two different tools: the horizontal visibility algorithm and the Kullback-Leibler divergence. This method maps a time series to a directed network according to a geometric criterion. The degree of irreversibility of the series is then estimated by the Kullback-Leibler divergence (i.e. the distinguishability) between the in- and out-degree distributions of the associated graph. The method is computationally efficient and does not require any ad hoc symbolization process. We find that the method correctly distinguishes between reversible and irreversible stationary time series, including analytical and numerical studies of its performance for: (i) reversible stochastic processes (uncorrelated and Gaussian linearly correlated), (ii) irreversible stochastic processes (a discrete flashing ratchet in an asymmetric potential), (iii) reversible (conservative) and irreversible (dissipative) chaotic maps, and (iv) dissipative chaotic maps in the presence of noise. Two alternative graph functionals, the degree and the degree-degree distributions, can be used as the argument of the Kullback-Leibler divergence. The former is simpler and more intuitive and can be used as a benchmark, but in the case of an irreversible process with null net current, the degree-degree distribution has to be considered to identify the irreversible nature of the series.
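A minimal sketch of the degree-distribution variant of this idea is given below: the series is mapped to a directed horizontal visibility graph and the Kullback-Leibler divergence between the out- and in-degree distributions is evaluated. The O(n^2) graph construction and the handling of degrees that appear in only one of the two distributions are simplifications for illustration, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def directed_hvg_degrees(x):
    """Directed horizontal visibility graph of the series x.

    Nodes i < j are linked (edge i -> j, following the arrow of time) when every
    intermediate value lies strictly below min(x[i], x[j]). Returns per-node
    out- and in-degrees. Naive O(n^2) construction.
    """
    n = len(x)
    k_out = np.zeros(n, dtype=int)
    k_in = np.zeros(n, dtype=int)
    for i in range(n):
        top = -np.inf                   # running max of intermediate values
        for j in range(i + 1, n):
            if x[j] > top:              # x[j] is horizontally visible from x[i]
                k_out[i] += 1
                k_in[j] += 1
            top = max(top, x[j])
            if top >= x[i]:             # nothing beyond j can be visible from i
                break
    return k_out, k_in

def kld_degree(k_out, k_in):
    """KL divergence D(P_out || P_in) between the empirical degree distributions.

    Degrees present in P_out but absent in P_in are skipped here, a finite-size
    regularisation choice made only for this sketch.
    """
    n = len(k_out)
    p_out, p_in = Counter(k_out), Counter(k_in)
    d = 0.0
    for k, c in p_out.items():
        if p_in.get(k, 0) > 0:
            d += (c / n) * np.log((c / n) / (p_in[k] / n))
    return d

# Example: a reversible series (white noise) should give a value close to zero.
rng = np.random.default_rng(0)
series = rng.normal(size=5000)
print(kld_degree(*directed_hvg_degrees(series)))
```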

Relevance:

30.00%

Publisher:

Abstract:

Doñana, a National Park since 1969 and a UNESCO site since 1994, among other protected-area designations of national and international character, is a coastal dune and marshland ecosystem of outstanding importance for biodiversity and conservation at the mouth of the Guadalquivir River, southwest Spain. However, the Doñana natural area is seriously threatened by global change factors such as human-induced climate change, habitat loss, overexploitation of ecosystem services, and pollution. Not all stakeholders are convinced of the benefits of the national park, and the management of Doñana, its environs and its watershed is the subject of intense disagreement. This interplay between natural characteristics of great value and intense human pressure makes Doñana a fascinating workshop for the study of global human-environment interactions. Here, we discuss the role of stakeholders in the application of a cellular automata-based model to Doñana and its environs and present the results of a series of exercises undertaken with stakeholders to parametrize the model, something often done by researchers without stakeholder engagement. By engaging with stakeholders early in the project, feedback generated from workshops contributes to model development. Stakeholders are therefore contributors of empirical data for the model as well as independent evaluators providing local and specialist knowledge.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new methodology to build parametric models that estimate global solar irradiation adjusted to specific on-site characteristics, based on the evaluation of variable importance. Thus, the variables highly correlated with solar irradiation at a site are included in the model, and therefore different models might be proposed under different climates. This methodology is applied in a case study in the La Rioja region (northern Spain). A new model is proposed and evaluated for stability and accuracy against a review of twenty-two existing parametric models based on temperature and rainfall, using seventeen meteorological stations in La Rioja. The model evaluation methodology is based on bootstrapping, which provides a high level of confidence in model calibration and validation from short time series (in this case five years, from 2007 to 2011). The proposed model improves on the estimates of the other twenty-two models, with an average mean absolute error (MAE) of 2.195 MJ/m2 day and an average confidence interval width (95% C.I., n=100) of 0.261 MJ/m2 day. 41.65% of the daily residuals in the case of SIAR and 20.12% in that of SOS Rioja fall within the uncertainty tolerance of the pyranometers of the two networks (10% and 5%, respectively). Relative differences between measured and estimated irradiation on an annual cumulative basis are below 4.82%. Thus, the proposed model may be useful for estimating annual sums of global solar irradiation, with differences from the pyranometer measurements that are not significant.
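The bootstrapped evaluation described above can be sketched as follows. The Bristow-Campbell form is used here only as a generic stand-in for a temperature-based parametric model (the paper's site-specific model is not reproduced), and the function names and resampling details are illustrative assumptions.

```python
import numpy as np

def bristow_campbell(dT, H0, a, b, c):
    """Generic temperature-based irradiation model (Bristow-Campbell form).

    H0 is the daily extraterrestrial irradiation and dT the daily temperature
    range; a, b, c are fitted coefficients. Only a stand-in for the paper's model.
    """
    return H0 * a * (1.0 - np.exp(-b * dT ** c))

def bootstrap_mae(y, y_hat, n_boot=100, seed=0):
    """Bootstrapped MAE: mean value and width of the 95% confidence interval."""
    rng = np.random.default_rng(seed)
    maes = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(y), size=len(y))   # resample days with replacement
        maes.append(np.mean(np.abs(y[idx] - y_hat[idx])))
    lo, hi = np.percentile(maes, [2.5, 97.5])
    return np.mean(maes), hi - lo
```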

Relevance:

30.00%

Publisher:

Abstract:

Este trabajo aborda el problema de modelizar sistemas dinámicos reales a partir del estudio de sus series temporales, usando una formulación estándar que pretende ser una abstracción universal de los sistemas dinámicos, independientemente de su naturaleza determinista, estocástica o híbrida. Se parte de modelizaciones separadas de sistemas deterministas por un lado y estocásticos por otro, para converger finalmente en un modelo híbrido que permite estudiar sistemas genéricos mixtos, esto es, que presentan una combinación de comportamiento determinista y aleatorio. Este modelo consta de dos componentes, uno determinista consistente en una ecuación en diferencias, obtenida a partir de un estudio de autocorrelación, y otro estocástico que modeliza el error cometido por el primero. El componente estocástico es un generador universal de distribuciones de probabilidad, basado en un proceso compuesto de variables aleatorias, uniformemente distribuidas en un intervalo variable en el tiempo. Este generador universal es deducido en la tesis a partir de una nueva teoría sobre la oferta y la demanda de un recurso genérico. El modelo resultante puede formularse conceptualmente como una entidad con tres elementos fundamentales: un motor generador de dinámica determinista, una fuente interna de ruido generadora de incertidumbre y una exposición al entorno que representa las interacciones del sistema real con el mundo exterior. En las aplicaciones estos tres elementos se ajustan en base al histórico de las series temporales del sistema dinámico. Una vez ajustados sus componentes, el modelo se comporta de una forma adaptativa tomando como inputs los nuevos valores de las series temporales del sistema y calculando predicciones sobre su comportamiento futuro. Cada predicción se presenta como un intervalo dentro del cual cualquier valor es equiprobable, teniendo probabilidad nula cualquier valor externo al intervalo. De esta forma el modelo computa el comportamiento futuro y su nivel de incertidumbre en base al estado actual del sistema. Se ha aplicado el modelo en esta tesis a sistemas muy diferentes, mostrando ser muy flexible para afrontar el estudio de campos de naturaleza dispar. El intercambio de tráfico telefónico entre operadores de telefonía, la evolución de mercados financieros y el flujo de información entre servidores de Internet son estudiados en profundidad en la tesis. Todos estos sistemas son modelizados de forma exitosa con un mismo lenguaje, a pesar de tratarse de sistemas físicos totalmente distintos. El estudio de las redes de telefonía muestra que los patrones de tráfico telefónico presentan una fuerte pseudo-periodicidad semanal contaminada con una gran cantidad de ruido, sobre todo en el caso de llamadas internacionales. El estudio de los mercados financieros muestra por su parte que la naturaleza fundamental de éstos es aleatoria, con un rango de comportamiento relativamente acotado. Una parte de la tesis se dedica a explicar algunas de las manifestaciones empíricas más importantes en los mercados financieros, como son los “fat tails”, “power laws” y “volatility clustering”. Por último se demuestra que la comunicación entre servidores de Internet tiene, al igual que los mercados financieros, una componente subyacente totalmente estocástica pero de comportamiento bastante “dócil”, siendo esta docilidad más acusada a medida que aumenta la distancia entre servidores. Dos aspectos son destacables en el modelo: su adaptabilidad y su universalidad. El primero se debe a que, una vez ajustados los parámetros generales, el modelo se “alimenta” de los valores observables del sistema y es capaz de calcular con ellos comportamientos futuros. A pesar de tener unos parámetros fijos, la variabilidad en los observables que sirven de input al modelo lleva a una gran riqueza de outputs posibles. El segundo aspecto se debe a la formulación genérica del modelo híbrido y a que sus parámetros se ajustan en base a manifestaciones externas del sistema en estudio, y no en base a sus características físicas. Estos factores hacen que el modelo pueda utilizarse en gran variedad de campos. Por último, la tesis propone en su parte final otros campos donde se han obtenido éxitos preliminares muy prometedores, como son la modelización del riesgo financiero, los algoritmos de routing en redes de telecomunicación y el cambio climático.

Abstract

This work faces the problem of modeling real dynamical systems based on the study of their time series, using a standard language that aims to be a universal abstraction of dynamical systems, irrespective of their deterministic, stochastic or hybrid nature. Deterministic and stochastic models are developed separately and then merged into a hybrid model, which allows the study of generic mixed systems, that is to say, those showing both deterministic and random behavior. This model is a combination of two components: a deterministic one, consisting of a difference equation derived from an autocorrelation study, and a stochastic one, which models the errors made by the former. The stochastic component is a universal generator of probability distributions based on a process of random variables distributed uniformly within an interval that varies in time. This universal generator is derived in the thesis from a new theory of supply and demand for a generic resource. The resulting model can be visualized as an entity with three fundamental elements: an engine generating deterministic dynamics, an internal source of noise generating uncertainty, and an exposure to the environment which depicts the interactions between the real system and the external world. In the applications, these three elements are adjusted to the history of the time series of the dynamical system. Once its components have been adjusted, the model behaves in an adaptive way, using the new time series values from the system as inputs and calculating predictions about its future behavior. Every prediction is provided as an interval within which any inner value is equally probable while all outer values have null probability. In this way, the model computes the future behavior and its level of uncertainty based on the current state of the system. The model is applied to quite different systems in this thesis, showing itself to be very flexible when facing the study of fields of diverse nature. The exchange of traffic between telephony operators, the evolution of financial markets and the flow of information between servers on the Internet are studied in depth in the thesis. All these systems are successfully modeled using the same “language”, despite being physically very different systems. The study of telephony networks shows that traffic patterns have a strong weekly pseudo-periodicity mixed with a great amount of noise, especially in the case of international calls. It is shown that the underlying nature of financial markets is random, with a moderate range of variability. A part of this thesis is devoted to explaining some of the most important empirical observations in financial markets, such as “fat tails”, “power laws” and “volatility clustering”. Finally, it is shown that the communication between two servers on the Internet has, as in the case of financial markets, an underlying random dynamics but with a narrow range of variability, this lack of variability becoming more marked as the distance between servers increases. Two aspects of the model stand out as being the most important: its adaptability and its universality. The first is due to the fact that, once the general parameters have been adjusted, the model is “fed” with the observable manifestations of the system in order to calculate its future behavior. Despite the model having fixed parameters, the variability in the observable manifestations of the system, which are used as inputs of the model, leads to a great variety of possible outputs. The second aspect is due to the generic “language” used in the formulation of the hybrid model and to the fact that its parameters are adjusted based on external manifestations of the system under study instead of its physical characteristics. These factors make the model suitable for use in a great variety of fields. Lastly, this thesis proposes, in its final part, other fields in which preliminary and promising results have been obtained, such as the modeling of financial risk, the development of routing algorithms for telecommunication networks and the assessment of climate change.
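As a toy illustration of the structure described above (a deterministic difference equation plus a noise term that yields equiprobable prediction intervals), the sketch below fits a single-lag difference equation and widens the forecast by the recent residual range. The lag choice, the residual-based half-width and all names are assumptions made for illustration; this is not the thesis's actual generator.

```python
import numpy as np

def fit_difference_equation(x, lag):
    """Least-squares fit of x[t] ~ a0 + a1 * x[t - lag] (a minimal deterministic core).

    The thesis derives its difference equation from an autocorrelation study;
    a single dominant lag is assumed here purely for illustration.
    """
    X = np.column_stack([np.ones(len(x) - lag), x[:-lag]])
    a, *_ = np.linalg.lstsq(X, x[lag:], rcond=None)
    return a

def interval_forecast(x, lag=1, window=50):
    """One-step-ahead forecast returned as an equiprobable interval.

    The deterministic part gives the centre; the half-width is taken from recent
    absolute residuals, a stand-in for the time-varying uniform noise generator.
    """
    a0, a1 = fit_difference_equation(x, lag)
    resid = x[lag:] - (a0 + a1 * x[:-lag])
    half_width = np.max(np.abs(resid[-window:]))
    centre = a0 + a1 * x[-lag]
    return centre - half_width, centre + half_width
```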

Relevance:

30.00%

Publisher:

Abstract:

In the last few years there has been a heightened interest in data treatment and analysis with the aim of discovering hidden knowledge and eliciting relationships and patterns within the data. Data mining techniques (also known as Knowledge Discovery in Databases) have been applied over a wide range of fields such as marketing, investment, fraud detection, manufacturing, telecommunications and health. In this study, well-known data mining techniques such as artificial neural networks (ANN), genetic programming (GP), forward selection linear regression (LR) and k-means clustering are proposed to the health and sports community in order to aid with resistance training prescription. Appropriate resistance training prescription is effective for developing fitness and health and for enhancing general quality of life. Resistance exercise intensity is commonly prescribed as a percentage of the one repetition maximum. The 1RM (dynamic muscular strength, one repetition maximum or one execution maximum) is operationally defined as the heaviest load that can be moved over a specific range of motion, one time and with correct performance. The safety of the 1RM assessment has been questioned, as such an enormous effort may lead to muscular injury. Prediction equations could help to tackle the problem of predicting the 1RM from submaximal loads, in order to avoid or at least reduce the associated risks. We built different models from data on 30 men who performed up to 5 sets to exhaustion at different percentages of the 1RM in the bench press action, until reaching their actual 1RM. A comparison of different existing prediction equations is also carried out. The LR model seems to outperform the ANN and GP models for 1RM prediction in the range between 1 and 10 repetitions. At 75% of the 1RM some subjects (n = 5) could perform 13 repetitions with proper technique in the bench press action, whilst other subjects (n = 20) performed significantly (p < 0.05) more repetitions at 70% than at 75% of their actual 1RM in the bench press action. Rate of perceived exertion (RPE) seems not to be a good predictor of the 1RM when all the sets are performed until exhaustion, as no significant differences (p < 0.05) were found in the RPE at 75%, 80% and 90% of the 1RM. Also, years of experience and weekly hours of strength training are better correlated with the 1RM (p < 0.05) than body weight. The O'Connor et al. 1RM prediction equation seems to arise from the data gathered and appears to be the most accurate 1RM prediction equation among those proposed in the literature and used in this study. Epley's 1RM prediction equation is reproduced by means of data simulation from 1RM literature equations. Finally, future lines of research are proposed related to the problem of 1RM prediction by means of genetic algorithms, neural networks and clustering techniques.

RESUMEN

En los últimos años ha habido un creciente interés en el tratamiento y análisis de datos con el propósito de descubrir relaciones, patrones y conocimiento oculto en los mismos. Las técnicas de data mining (también llamadas de “Descubrimiento de conocimiento en bases de datos”) se han aplicado consistentemente a lo largo de un gran espectro de áreas como el marketing, inversiones, detección de fraude, producción industrial, telecomunicaciones y salud. En este estudio, técnicas bien conocidas de data mining como las redes neuronales artificiales (ANN), programación genética (GP), regresión lineal con selección hacia adelante (LR) y la técnica de clustering k-means se proponen a la comunidad del deporte y la salud con el objetivo de ayudar con la prescripción del entrenamiento de fuerza. Una apropiada prescripción de entrenamiento de fuerza es efectiva no solo para mejorar el estado de forma general, sino para mejorar la salud e incrementar la calidad de vida. La intensidad en un ejercicio de fuerza se prescribe generalmente como un porcentaje de la repetición máxima. La 1RM, fuerza muscular dinámica, una repetición máxima o una ejecución máxima, se define operacionalmente como la carga máxima que puede ser movida en un rango de movimiento específico, una vez y con una técnica correcta. La seguridad de las pruebas de 1RM ha sido cuestionada debido a que el gran esfuerzo requerido para llevarlas a cabo puede derivar en serias lesiones musculares. Las ecuaciones predictivas pueden ayudar a atajar el problema de la predicción de la 1RM con cargas submáximas y son empleadas con el propósito de eliminar o, al menos, reducir los riesgos asociados. En este estudio se construyeron distintos modelos a partir de los datos recogidos de 30 hombres que realizaron hasta 5 series al fallo en el ejercicio press de banca a distintos porcentajes de la 1RM, hasta llegar a su 1RM real. También se muestra una comparación de algunas de las distintas ecuaciones de predicción propuestas con anterioridad. El modelo LR parece superar a los modelos ANN y GP para la predicción de la 1RM entre 1 y 10 repeticiones. Al 75% de la 1RM algunos sujetos (n = 5) pudieron realizar 13 repeticiones con una técnica apropiada en el ejercicio press de banca, mientras que otros (n = 20) realizaron significativamente (p < 0.05) más repeticiones al 70% que al 75% de su 1RM en el press de banca. El índice de esfuerzo percibido (RPE) parece no ser un buen predictor de la 1RM cuando todas las series se realizan al fallo, puesto que no existen diferencias significativas (p < 0.05) en el RPE al 75%, 80% y el 90% de la 1RM. Además, los años de experiencia y las horas semanales dedicadas al entrenamiento de fuerza están más correlacionadas con la 1RM (p < 0.05) que el peso corporal. La ecuación de O'Connor et al. parece surgir de los datos recogidos y parece ser la ecuación de predicción de la 1RM más precisa de aquellas propuestas en la literatura y empleadas en este estudio. La ecuación de predicción de la 1RM de Epley es reproducida mediante simulación de datos a partir de algunas ecuaciones de predicción de la 1RM propuestas con anterioridad. Finalmente, se proponen futuras líneas de investigación relacionadas con el problema de la predicción de la 1RM mediante algoritmos genéticos, redes neuronales y técnicas de clustering.
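For context, the two prediction equations named in the abstract are commonly cited in the forms below (load in kg, reps = repetitions performed to failure at that load). The exact coefficients used in the study are not given here, so treat these as the textbook versions rather than the study's fitted models.

```python
def epley_1rm(load_kg, reps):
    """Epley equation, as commonly cited: 1RM = load * (1 + reps / 30)."""
    return load_kg * (1.0 + reps / 30.0)

def oconnor_1rm(load_kg, reps):
    """O'Connor et al. equation, as commonly cited: 1RM = load * (1 + 0.025 * reps)."""
    return load_kg * (1.0 + 0.025 * reps)

# Example: 8 repetitions performed with 80 kg in the bench press.
print(epley_1rm(80, 8), oconnor_1rm(80, 8))   # ~101.3 kg and 96.0 kg
```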

Relevance:

30.00%

Publisher:

Abstract:

Este Trabajo de Fin de Grado recoge el diseño e implementación de un compilador y una librería de entorno de ejecución para el lenguaje específico del dominio TESL, un lenguaje de alto nivel para el análisis de series temporales diseñado por un grupo de investigación de la Universidad Politécnica de Madrid. Este compilador es el primer compilador completo disponible para TESL y sirve como base para la continuación del desarrollo del lenguaje, estando ideado para permitir su adaptación a cambios en el mismo. El compilador ha sido implementado en Java siguiendo la arquitectura clásica para este tipo de aplicaciones, incluyendo un Analizador Léxico, Sintáctico y Semántico, así como un Generador de Código. Se han documentado su arquitectura y las decisiones de diseño que han conducido a la misma. Además, se ha demostrado su funcionamiento con un caso práctico de análisis de eventos en métricas de servidores. Por último, se ha documentado el lenguaje TESL, en cuyo desarrollo se ha colaborado.

---ABSTRACT---

This Bachelor’s Thesis describes the design and implementation of a compiler and a runtime library for the domain-specific language TESL, a high-level language for analyzing time series events developed by a research group at the Technical University of Madrid. This is the first fully implemented TESL compiler and serves as the basis for the continued development of the language. The compiler has been implemented in Java following the classical architecture for this kind of system, with a four-phase compilation pipeline comprising a Lexer, a Parser, a Semantic Analyzer and a Code Generator. Its architecture and the design decisions that led to it have been documented. Its use has been demonstrated in a use case in the domain of server metrics. Finally, the TESL language itself has been extended and documented.
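The classical four-phase architecture mentioned above can be pictured with the minimal skeleton below. The real compiler is written in Java and TESL's grammar is not reproduced here, so every name and behaviour in this Python sketch is a placeholder that only mirrors the phase ordering.

```python
# Hypothetical skeleton of the classical four-phase compiler pipeline.

def lex(source: str) -> list:
    """Lexical analysis: turn the source text into a token stream (toy version)."""
    return source.split()

def parse(tokens: list) -> list:
    """Syntactic analysis: build a syntax tree (a flat list in this toy)."""
    return list(tokens)

def analyze(ast: list) -> list:
    """Semantic analysis: validate the tree; here it only rejects empty programs."""
    if not ast:
        raise ValueError("empty program")
    return ast

def generate(ast: list) -> str:
    """Code generation: emit target code (a placeholder string)."""
    return "; ".join(ast)

def compile_source(source: str) -> str:
    return generate(analyze(parse(lex(source))))
```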

Relevance:

30.00%

Publisher:

Abstract:

Entre los años 2004 y 2007 se hundieron por problemas de estabilidad cinco pesqueros españoles de pequeña eslora, de características parecidas, de relativamente poca edad, que habían sido construidos en un intervalo de pocos años. La mayoría de los tripulantes de esos pesqueros fallecieron o desaparecieron en esos accidentes. Este conjunto de accidentes tuvo bastante repercusión social y mediática. Entre ingenieros navales y marinos del sector de la pesca se relacionó estos accidentes con los condicionantes a los diseños de los pesqueros impuestos por la normativa de control de esfuerzo pesquero. Los accidentes fueron investigados y publicados sus correspondientes informes; en ellos no se exploró esta supuesta relación. Esta tesis pretende investigar la relación entre esos accidentes y los cambios de la normativa de esfuerzo pesquero. En la introducción se expone la normativa de control de esfuerzo pesquero analizada, se presentan datos sobre la estructura de la flota pesquera en España y su accidentalidad, y se detallan los criterios de estabilidad manejados durante el trabajo, explicando su relación con la seguridad de los pesqueros. Seguidamente se realiza un análisis estadístico de la siniestralidad en el sector de la pesca para establecer si el conjunto de accidentes estudiados supone una anomalía, o si por el contrario el conjunto de estos accidentes no es relevante desde el punto de vista estadístico. Se analiza la siniestralidad a partir de diversas bases de datos de buques pesqueros en España y se concluye que el conjunto de accidentes estudiados supone una anomalía estadística, ya que la probabilidad de ocurrencia de los cinco sucesos es muy baja considerando la frecuencia estimada de pérdidas de buques por estabilidad en el subsector de la flota pesquera en el que se encuadran los cinco buques perdidos. A continuación el trabajo se centra en la comparación de los buques accidentados con los buques pesqueros dados de baja para construir aquellos, según exige la normativa de control de esfuerzo pesquero; a estos últimos buques nos referiremos como “predecesores” de los buques accidentados. Se comparan las dimensiones principales de cada buque y de su predecesor, resultando que los buques accidentados comparten características de diseño comunes que son sensiblemente diferentes en los buques predecesores, y enlazando dichas características de diseño con los requisitos de la nueva normativa de control del esfuerzo pesquero bajo la que se construyeron estos barcos. Ello permite establecer una relación entre los accidentes y el mencionado cambio normativo. A continuación se compara el margen con que se cumplían los criterios reglamentarios de estabilidad entre los buques accidentados y los predecesores, encontrándose que en cuatro de los cinco casos los predecesores cumplían los criterios de estabilidad con mayor holgura que los buques accidentados. Los resultados obtenidos en este punto permiten establecer una relación entre el cambio de normativa de esfuerzo pesquero y la estabilidad de los buques. Los cinco buques accidentados cumplían con los criterios reglamentarios de estabilidad en vigor, lo que cuestiona la relación entre esos criterios y la seguridad. 
Por ello se extiende la comparativa entre pesqueros a dos nuevos campos relacionados con la estabilidad y la seguridad de los buques:

• Movimientos a bordo (operatividad del buque), y
• Criterios de estabilidad en condiciones meteorológicas adversas.

El estudio de la operatividad muestra que los buques accidentados tenían, en general, una mayor operatividad que sus predecesores, contrariamente a lo que sucedía con el cumplimiento de los criterios reglamentarios de estabilidad. Por último, se comprueba el desempeño de los diez buques en dos criterios específicos de estabilidad en caso de mal tiempo: el criterio IMO de viento y balance intenso, y un criterio de estabilidad de nueva generación, incluyendo la contribución original del autor de considerar agua en cubierta. Las tendencias observadas en estas dos comparativas son opuestas, lo que permite cuestionar la validez del último criterio sin un control exhaustivo de los parámetros de su formulación, poniendo de manifiesto la necesidad de más investigaciones sobre ese criterio antes de su adopción para uso regulatorio. El conjunto de estos resultados permite obtener una serie de conclusiones en la comparativa entre ambos conjuntos de buques pesqueros. Si bien los resultados de este trabajo no muestran que la aprobación de la nueva normativa de esfuerzo pesquero haya significado una merma general de seguridad en sectores enteros de la flota pesquera, sí se concluye que permitió que algunos diseños de buques pesqueros, posiblemente en busca de la mayor eficiencia compatible con dicha normativa, quedaran con una estabilidad precaria, poniendo de manifiesto que la relación entre seguridad y criterios de estabilidad no es unívoca, así como la necesidad de que éstos evolucionen y se adapten a los nuevos diseños de buques pesqueros para continuar garantizando su seguridad. También se concluye que la estabilidad es un aspecto transversal del diseño de los buques, por lo que cualquier reforma normativa que afecte al diseño de los pesqueros o a su forma de operar debería estar sujeta a evaluación por parte de las autoridades responsables de la seguridad marítima con carácter previo a su aprobación.

ABSTRACT

Between 2004 and 2007 five small Spanish fishing vessels sank in stability-related accidents. These vessels had similar characteristics, were relatively young, and had been built within a period of a few years. Most crew members of these five vessels died or disappeared in those accidents. This set of accidents had a significant social and media impact. Among naval architects and seamen of the fishing sector these accidents were linked to the design constraints imposed by the fishing effort control regulations. The accidents were investigated and the official reports issued; this alleged relationship was not explored in them. This thesis aims to investigate the relationship between those accidents and the changes in fishing effort control regulations. In the introduction, the fishing effort control regulation is presented, data on the structure of the Spanish fishing fleet and its accident rates are given, and the stability criteria dealt with in this work are explained, detailing their relationship with fishing vessel safety. A statistical analysis of accident rates in the fishing sector in Spain is then performed. The objective is to determine whether the set of accidents studied constitutes an anomaly or, on the contrary, is not statistically relevant. Fishing vessel accident rates are analyzed from several fishing vessel databases in Spain.
It is concluded that the set of studied accidents is statistically anomalous, as the probability of occurrence of the five events is extremely low given the estimated rate of stability-related losses in the subsector of the Spanish fishing fleet to which the five lost vessels belong. From this point the thesis focuses on comparing the lost vessels with the vessels that were decommissioned in order to build them, as required by the fishing effort control regulation; these vessels are referred to as the “predecessors” of the sunk vessels. The main dimensions of each lost vessel and her predecessor are compared, leading to the conclusion that the lost vessels share design characteristics that are noticeably different from those of their predecessors, and linking these design characteristics with the requirements imposed by the new fishing effort control regulations. This allows a relationship to be established between the accidents and this regulatory change. The margin with which the regulatory stability criteria were met is then compared, showing that in four of the five cases the predecessors met the stability criteria with a greater margin than the sunk vessels. The results obtained at this point establish a relationship between the change in the fishing effort control regulation and the stability of the vessels. The five lost vessels complied with the stability criteria in force, which calls into question the relation between these criteria and safety. Consequently, the comparison among vessels is extended to two other fields related to safety and stability:

• Motions on board (operability), and
• Specific stability criteria in rough weather.

The operability study shows that the lost vessels had, in general, greater operability than their predecessors, the opposite of what was found when comparing the regulatory stability criteria. Finally, performance under specific rough-weather stability criteria is checked. The criteria studied are the IMO Weather Criterion and one of the second-generation stability criteria under development at IMO, in this last case considering the presence of water on deck, which is an original contribution by the author. The trends observed in these two cases are opposite, which calls into question the validity of the latter criterion unless the parameters of its formulation are exhaustively controlled, indicating that further research may be necessary before it is used for regulatory purposes. The analysis of this set of results leads to some conclusions from the comparison of both groups of fishing vessels. While the results obtained do not show that the entry into force of the new fishing effort control regulation in 1998 caused a generalized safety reduction in whole sectors of the Spanish fishing fleet, it can be concluded that it opened the door for some vessel designs resulting in precarious stability. This evidences that the relation between safety and stability criteria is not univocal, so stability criteria need to evolve and adapt to new fishing vessel designs so that their safety is still guaranteed. It is also concluded that stability is a transversal aspect of ship design and operation, implying that any regulatory reform affecting ship design or operating modes should be subject to assessment by the authorities responsible for maritime safety before being adopted.
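Since the thesis repeatedly compares vessels against the regulatory intact stability criteria, a minimal sketch of such a check is given below. The thresholds are the general criteria of the IMO 2008 IS Code (Part A, 2.2) as commonly quoted; the thesis also uses fishing-vessel-specific and weather criteria that are not reproduced here, and the function is purely illustrative.

```python
import numpy as np

def area_m_rad(phi_deg, gz_m, a, b):
    """Area under the GZ curve between heel angles a and b (degrees), in m*rad."""
    grid = np.linspace(a, b, 200)
    return np.trapz(np.interp(grid, phi_deg, gz_m), np.radians(grid))

def imo_general_criteria(phi_deg, gz_m, gm0_m):
    """Check a righting-arm curve against the IMO 2008 IS Code general criteria.

    phi_deg, gz_m: sampled GZ curve (degrees, metres), assumed to reach beyond 40 deg.
    gm0_m: initial metacentric height in metres.
    """
    phi_deg = np.asarray(phi_deg, dtype=float)
    gz_m = np.asarray(gz_m, dtype=float)
    gz_beyond_30 = np.interp(np.linspace(30, phi_deg[-1], 100), phi_deg, gz_m)
    return {
        "area 0-30 deg >= 0.055 m*rad": area_m_rad(phi_deg, gz_m, 0, 30) >= 0.055,
        "area 0-40 deg >= 0.090 m*rad": area_m_rad(phi_deg, gz_m, 0, 40) >= 0.090,
        "area 30-40 deg >= 0.030 m*rad": area_m_rad(phi_deg, gz_m, 30, 40) >= 0.030,
        "GZ >= 0.20 m at an angle >= 30 deg": gz_beyond_30.max() >= 0.20,
        "angle of maximum GZ >= 25 deg": phi_deg[np.argmax(gz_m)] >= 25.0,
        "initial GM >= 0.15 m": gm0_m >= 0.15,
    }
```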

Relevance:

30.00%

Publisher:

Abstract:

In order to implement accurate models for wind power ramp forecasting, ramps need to be characterised beforehand. This issue has typically been addressed by performing binary ramp/non-ramp classifications based on ad hoc thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, obtained by considering large power output gradients evaluated over different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks of the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profiles of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain.
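A crude rendering of that idea is sketched below: for every time step the largest signed power change over a set of time scales is retained, giving a continuous ramp-intensity index instead of a binary flag. Moving differences are used as a stand-in for the wavelet filter of the paper, and the scale set is arbitrary; this is not the authors' exact ramp function.

```python
import numpy as np

def ramp_index(power, scales=(1, 2, 4, 8)):
    """Continuous multi-scale ramp-intensity index for a wind power series.

    For each time step, the power change with the largest magnitude over the
    given time scales (in samples) is kept. Positive values flag ramp-ups,
    negative values ramp-downs. Illustrative moving-difference version only.
    """
    p = np.asarray(power, dtype=float)
    n = len(p)
    r = np.zeros(n)
    for s in scales:
        d = np.zeros(n)
        d[s:] = p[s:] - p[:-s]          # power change over a window of s steps
        pick = np.abs(d) > np.abs(r)    # keep the largest-magnitude change so far
        r[pick] = d[pick]
    return r
```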