926 results for ASSESSMENT MODELS
Abstract:
Path analysis has been applied to components of the iron metabolic system with the intent of suggesting an integrated procedure for better evaluating iron nutritional status at the community level. The primary variables of interest in this study were (1) iron stores, (2) total iron-binding capacity, (3) serum ferritin, (4) serum iron, (5) transferrin saturation, and (6) hemoglobin concentration. Correlation coefficients for relationships among these variables were obtained from published literature and postulated in a series of models using measures of those variables that are feasible to include in a community nutritional survey. Models were built upon known information about the metabolism of iron and were limited by what had been reported in the literature in terms of correlation coefficients or quantitative relationships. Data were pooled from various studies and correlations of the same bivariate relationships were averaged after z-transformations. Correlation matrices were then constructed by transforming the average values back into correlation coefficients. The results of path analysis in this study indicate that hemoglobin is not a good indicator of early iron deficiency. It does not account for variance in iron stores. On the other hand, 91% of the variance in iron stores is explained by serum ferritin and total iron-binding capacity. In addition, the magnitude of the path coefficient (.78) of the serum ferritin-iron stores relationship signifies that serum ferritin is the most important predictor of iron stores in the proposed model. Finally, drawing upon known relations among variables and the amount of variance explained in path models, it is suggested that the following blood measures should be made in assessing community iron deficiency: (1) serum ferritin, (2) total iron-binding capacity, (3) serum iron, (4) transferrin saturation, and (5) hemoglobin concentration.
These measures (with acceptable ranges and cut-off points) could make possible the complete evaluation of all three stages of iron deficiency in those persons surveyed at the community level.
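The pooling step described above, averaging literature correlations after z-transformation and transforming the average back, can be sketched as follows (the function name and the optional sample-size weighting are illustrative, not taken from the study):

```python
import math

def average_correlations(rs, ns=None):
    """Average correlation coefficients via Fisher z-transformation.

    rs: correlation coefficients pooled from independent studies
    ns: optional sample sizes; if given, z values are weighted by n - 3
    """
    zs = [math.atanh(r) for r in rs]                  # r -> z
    if ns is None:
        z_mean = sum(zs) / len(zs)
    else:
        w = [n - 3 for n in ns]                       # standard z-variance weights
        z_mean = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    return math.tanh(z_mean)                          # z -> r
```

Averaging in z-space rather than r-space avoids the bias introduced by the bounded, skewed sampling distribution of r itself.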
Abstract:
The potential for significant human populations to experience long-term inhalation of formaldehyde, and reports of symptomatology due to this exposure, have led to considerable interest in the toxicologic assessment of risk from subchronic formaldehyde exposures using animal models. Since formaldehyde inhalation depresses certain respiratory parameters in addition to its other forms of toxicity, there is a potential for alteration of the actual dose received by the exposed individual (and the resulting toxicity) due to this respiratory effect. The respiratory responses to formaldehyde inhalation and the subsequent pattern of deposition were therefore investigated in animals that had received subchronic exposure to the compound, and the potential for changes in the formaldehyde dose received due to long-term inhalation was evaluated. Male Sprague-Dawley rats were exposed to 0, 0.5, 3, or 15 ppm formaldehyde for 6 hours/day, 5 days/week, for up to 6 months. The patterns of respiratory response and deposition, and the compensation mechanisms involved, were then determined in a series of formaldehyde test challenges to both the upper and the lower respiratory tracts in separate groups of subchronically exposed animals and age-specific controls (four concentration groups, two time points). In both the control and pre-exposed animals, respiratory parameters initially depressed by formaldehyde inhalation characteristically recovered to levels at or approaching pre-exposure values within 10 minutes of the initiation of exposure. Formaldehyde deposition was also found to remain very high in the upper and lower tracts after long-term exposure. Therefore, there was probably little subsequent effect on the dose received by the exposed individual that was attributable to the repeated exposures.
There was a diminished initial minute volume response in test challenges of both the upper and lower tracts of animals that had received at least 16 weeks of exposure to 15 ppm, with compensatory increases in tidal volume in the upper tract and respiratory rate in the lower tract. However, this dose-related effect was probably not relevant to human risk estimation because this formaldehyde dose is in excess of that experienced by human populations.
Abstract:
Much of the current healthcare financial literature addresses the concern of government officials, the public, and healthcare providers regarding the need for control of health care costs. The literature suggests that the attitudes of hospital department managers toward their role in financial management affect their ability to effect favorable financial results. There were several objectives of the dissertation: (1) to identify whether or not there exists a relationship between the attitude/role perception of hospital managers and the financial performance of their departments; (2) to compile a descriptive survey database of key factors identified in the financial literature from individual hospitals; (3) to compile a brief descriptive survey of hospital managers' financial management background and training (both formal and informal); (4) to conduct an attitude assessment/role perception survey regarding the importance or relevance of a suggested financial management role set (i.e., issues discussed in the current literature) as viewed by the selected hospital managers and their matched administrators; and (5) to propose plausible theoretical models and statistical tests of seven proposed hypotheses. The statistical results of a variety of methods generally suggested, for the sample population, that the null hypothesis should not be rejected concerning the relationships between a department manager's financial attitudes and role perceptions and the resultant financial performance. The fact that this study did not find a significant relationship between role perception and financial performance does not necessarily indicate that the theories supporting such a relationship in the literature are false, nor that such a relationship does not exist.
Several alternative theories were postulated to explain the apparent lack of statistical relationship, and suggestions for refinement and/or improvement of further research were discussed.
Abstract:
Patients living with a spinal cord injury (SCI) often develop chronic neuropathic pain (CNP). Unfortunately, the clinically approved, current standard of treatment, gabapentin, only provides temporary pain relief. This treatment can cause numerous adverse side effects that negatively affect the daily lives of SCI patients. There is a great need for alternative, effective treatments for SCI-dependent CNP. Minocycline, an FDA-approved antibiotic, has been widely prescribed for the treatment of acne for several decades. However, recent studies demonstrate that minocycline has neuroprotective properties in several pre-clinical rodent models of CNS trauma and disease. Pre-clinical studies also show that short-term minocycline treatment can prevent the onset of CNP when delivered during the acute stage of SCI and can also transiently attenuate established CNP when delivered briefly during the chronic stage of SCI. However, the potential to abolish or attenuate CNP via long-term administration of minocycline after SCI is unknown. The purpose of this study was to investigate the potential efficacy and safety of long-term administration of minocycline to abolish or attenuate CNP following SCI. A severe spinal contusion injury was administered to adult, male, Sprague-Dawley rats. At day 29 post-injury, I initiated a three-week treatment regimen of daily administration with minocycline (50 mg/kg), gabapentin (50 mg/kg) or saline. The minocycline treatment group demonstrated a significant reduction in below-level mechanical allodynia and above-level hyperalgesia while on their treatment regimen. After a ten-day washout period of minocycline, the animals continued to demonstrate a significant reduction in below-level mechanical allodynia and above-level hyperalgesia.
However, minocycline-treated animals exhibited abnormal weight gain and hepatotoxicity compared to gabapentin-treated or vehicle-treated subjects. The results support previous findings that minocycline can attenuate CNP after SCI and suggest that minocycline can also attenuate CNP via long-term delivery after SCI (36). The data also suggest that minocycline has a lasting effect in reducing pain symptoms. However, the adverse side effects of long-term use of minocycline should not be ignored in the rodent model. Gabapentin treatment caused a significant decrease in below-level mechanical allodynia and below-level hyperalgesia during the treatment regimen. Because gabapentin treatment has an analgesic effect at the concentration I administered, these results were expected. However, I also found that gabapentin-treated animals demonstrated a sustained reduction in pain ten days after treatment withdrawal. This result was unexpected because gabapentin has a short half-life of 1.7 hours in rodents, and previous studies have demonstrated that pre-drug pain levels return shortly after withdrawal of treatment. Additionally, the gabapentin-treated animals demonstrated a significant and sustained increase in rearing events compared with all other treatment groups, which suggests that gabapentin treatment is not only capable of reducing pain long-term but may also significantly improve trunk stability or motor function recovery.
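The surprise of the sustained gabapentin effect can be made quantitative with first-order elimination kinetics: at a 1.7-hour half-life, essentially no drug remains after a ten-day washout, so any residual analgesia cannot be a direct drug effect. A minimal sketch (function name hypothetical):

```python
def fraction_remaining(hours, half_life_h=1.7):
    """Fraction of a drug remaining after first-order elimination."""
    return 0.5 ** (hours / half_life_h)

# After the ten-day (240 h) washout the remaining fraction is negligibly small,
# many orders of magnitude below any pharmacologically active level.
```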
Abstract:
Dynamic contrast agent-enhanced magnetic resonance imaging (DCE MRI) data, when analyzed with the appropriate pharmacokinetic models, have been shown to provide quantitative estimates of microvascular parameters important in characterizing the angiogenic activity of malignant tissue. These parameters consist of the whole blood volume per unit volume of tissue, vb, the transport constant from the plasma to the extravascular, extracellular space (EES), k1, and the transport constant from the EES to the plasma, k2. Parameters vb and k1 are expected to correlate with microvascular density (MVD) and vascular permeability, respectively, which have been suggested to serve as surrogate markers for angiogenesis. In addition to being a marker for angiogenesis, vascular permeability is also useful in estimating the tumor penetration potential of chemotherapeutic agents. Histological measurements of the intratumoral microvascular environment are limited by their invasiveness and susceptibility to sampling errors. Also, MVD and vascular permeability, while useful for characterizing tumors at a single time point, have shown less utility in longitudinal studies, particularly when used to monitor the efficacy of antiangiogenic and traditional chemotherapeutic agents. These limitations led to a search for a non-invasive means of characterizing the microvascular environment of an entire tumor. The overall goal of this project was to determine the utility of DCE MRI for monitoring the effect of antiangiogenic agents. Further applications of a validated DCE MRI technique include in vivo measurements of tumor microvascular characteristics to aid in determining prognosis at presentation and in estimating drug penetration. DCE MRI data were generated using single- and dual-tracer pharmacokinetic models with different molecular-weight contrast agents. The resulting pharmacokinetic parameters were compared to immunohistochemical measurements.
The model and contrast agent combination yielding the best correlation between the pharmacokinetic parameters and histological measures was further examined in a longitudinal study to evaluate the efficacy of DCE MRI in monitoring the intratumoral microvascular environment following antiangiogenic treatment.
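A compartmental model with the three parameters named above is commonly written as C_t(t) = vb·C_b(t) + k1·∫ C_p(τ)·exp(−k2·(t−τ)) dτ. A minimal numerical sketch of this form (the discretisation and function name are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def tissue_concentration(t, c_blood, c_plasma, vb, k1, k2):
    """C_t(t) = vb * C_b(t) + k1 * integral_0^t C_p(tau) exp(-k2 (t - tau)) dtau.

    t must be uniformly spaced; the integral is a simple Riemann-sum convolution.
    """
    dt = t[1] - t[0]
    kernel = np.exp(-k2 * t)                          # EES impulse response
    ees = k1 * np.convolve(c_plasma, kernel)[:len(t)] * dt
    return vb * c_blood + ees
```

Fitting vb, k1 and k2 to measured enhancement curves is what yields the microvascular estimates discussed in the abstract.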
Abstract:
Past changes in North Pacific sea surface temperatures and sea-ice conditions are proposed to have played a crucial role in deglacial climate development and ocean circulation, but are less well known than those of the North Atlantic. Here, we present new alkenone-based sea surface temperature records from the subarctic northwest Pacific and its marginal seas (Bering Sea and Sea of Okhotsk) for the time interval of the last 15 kyr, indicating millennial-scale sea surface temperature fluctuations similar to the short-term deglacial climate oscillations known from Greenland ice-core records. Past changes in sea-ice distribution are derived from the relative percentages of specific diatom groups and from qualitative assessment of the IP25 biomarker related to sea-ice diatoms. The deglacial variability in sea-ice extent matches the sea surface temperature fluctuations. These fluctuations suggest a linkage to deglacial variations in the Atlantic meridional overturning circulation and a close atmospheric coupling between the North Pacific and North Atlantic. During the Holocene, the subarctic North Pacific is marked by complex sea surface temperature trends, which do not support the hypothesis of a Holocene seesaw in temperature development between the North Atlantic and the North Pacific.
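Alkenone palaeothermometry of the kind used for these records commonly rests on the UK'37 unsaturation index of C37 alkenones; a widely used linear calibration (shown as background, not necessarily the calibration applied in this study) is:

```latex
U^{K'}_{37} = \frac{[C_{37:2}]}{[C_{37:2}] + [C_{37:3}]},
\qquad
\mathrm{SST} \approx \frac{U^{K'}_{37} - 0.044}{0.033}\ ({}^{\circ}\mathrm{C})
```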
Abstract:
Maritime accidents involving ships carrying passengers may pose a high risk with respect to human casualties. For effective risk mitigation, insight into the process of risk escalation is needed. This requires a proactive approach to risk modelling for maritime transportation systems. Most of the existing models are based on historical data on maritime accidents, and thus they can be considered reactive rather than proactive. This paper introduces a systematic, transferable and proactive framework for estimating the risk of maritime transportation systems, meeting the requirements stemming from the adopted formal definition of risk. The framework focuses on ship-ship collisions in the open sea, with a RoRo/Passenger ship (RoPax) considered as the struck ship. First, it covers identification of the events that follow a collision between two ships in the open sea; second, it evaluates the probabilities of these events, concluding by determining the severity of a collision. The risk framework is developed with the use of Bayesian Belief Networks and utilizes a set of analytical methods for the estimation of the risk model parameters. The model can be run with the GeNIe software package. Finally, a case study is presented in which the risk framework developed here is applied to a maritime transportation system operating in the Gulf of Finland (GoF). The results obtained are compared to historical data and available models in which a RoPax was involved in a collision, and good agreement with the available records is found.
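A Bayesian Belief Network of the kind described chains conditional probability tables from the collision event through escalation stages to severity. A toy sketch of that inference (all variable names and CPT numbers are made up for illustration and are not the paper's model):

```python
from itertools import product

# Illustrative chain: collision -> hull breach -> flooding -> ship lost
P_breach = {True: 0.30, False: 0.70}                 # P(breach | collision)
P_flood = {True: {True: 0.60, False: 0.40},          # P(flood | breach=True)
           False: {True: 0.02, False: 0.98}}         # P(flood | breach=False)
P_lost = {True: 0.50, False: 0.01}                   # P(ship lost | flood)

def p_ship_lost():
    """Marginal P(ship lost | collision), summing over the hidden states."""
    return sum(P_breach[b] * P_flood[b][f] * P_lost[f]
               for b, f in product([True, False], repeat=2))
```

Tools such as GeNIe automate exactly this marginalisation over much larger networks, with CPTs estimated by the analytical methods the paper describes.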
Abstract:
The ability to view and interact with 3D models has been available for a long time. However, vision-based 3D modeling has seen only limited success in applications, as it faces many technical challenges. Hand-held mobile devices have changed the way we interact with virtual reality environments. Their high mobility and technical features, such as inertial sensors, cameras and fast processors, are especially attractive for advancing the state of the art in virtual reality systems. Their ubiquity and fast Internet connections also open a path to distributed and collaborative development; however, this path has not been fully explored in many domains. VR systems for real-world engineering contexts are still difficult to use, especially when geographically dispersed engineering teams need to collaboratively visualize and review 3D CAD models. Another challenge is the ability to render these environments at the required interactive rates and with high fidelity. This document presents a mobile virtual reality system for visualizing, navigating and reviewing large-scale 3D CAD models, developed under the CEDAR (Collaborative Engineering Design and Review) project. It focuses on interaction using different navigation modes. The system uses the mobile device's inertial sensors and camera to allow users to navigate through large-scale models. IT professionals, architects, civil engineers and oil-industry experts were involved in a qualitative assessment of the CEDAR system, in the form of direct user interaction with the prototypes and audio-recorded interviews about them. The lessons learned are valuable and are presented in this document. Subsequently, a quantitative study of the different navigation modes was carried out to determine the best mode to use in a given situation.
Abstract:
The need to refine models for best-estimate calculations, based on good-quality experimental data, has been expressed in many recent meetings in the field of nuclear applications. The modeling needs arising in this respect should not be limited to the currently available macroscopic methods but should be extended to next-generation analysis techniques that focus on more microscopic processes. One of the most valuable databases identified for thermal-hydraulics modeling was developed by the Nuclear Power Engineering Corporation (NUPEC), Japan. From 1987 to 1995, NUPEC performed steady-state and transient critical power and departure from nucleate boiling (DNB) test series based on equivalent full-size mock-ups. Considering the reliability not only of the measured data but also of other relevant parameters, such as the system pressure, inlet sub-cooling and rod surface temperature, these test series supplied the first substantial database for the development of truly mechanistic and consistent models for boiling transition and critical heat flux. Over the last few years, the Pennsylvania State University (PSU), under the sponsorship of the U.S. Nuclear Regulatory Commission (NRC), has prepared, organized, conducted and summarized the OECD/NRC Full-size Fine-mesh Bundle Tests (BFBT) Benchmark. The international benchmark activities have been conducted in cooperation with the Nuclear Energy Agency of the Organisation for Economic Co-operation and Development (NEA/OECD) and the Japan Nuclear Energy Safety (JNES) organization, Japan. Consequently, JNES has made the Boiling Water Reactor (BWR) NUPEC database available for the purposes of the benchmark. Based on the success of the OECD/NRC BFBT benchmark, JNES has decided also to release the data from the NUPEC Pressurized Water Reactor (PWR) subchannel and bundle tests for a follow-up international benchmark entitled the OECD/NRC PWR Subchannel and Bundle Tests (PSBT) benchmark.
This paper presents an application of the joint Penn State University/Technical University of Madrid (UPM) version of the well-known subchannel code COBRA-TF, namely CTF, to the critical power and departure from nucleate boiling (DNB) exercises of the OECD/NRC BFBT and PSBT benchmarks.
Abstract:
The environmental impact of systems managing large (kg-scale) amounts of tritium represents a public-scrutiny issue for upcoming fusion facilities such as ITER and DEMO. Furthermore, potentially new dose limits imposed by international regulations (ICRP) will affect the designs of upcoming devices and the overall cost of deploying fusion technology. Refined schemes for assessing the environmental tritium dose impact are therefore essential. Detailed assessments can be obtained from knowledge of the real boundary conditions of the primary phase of tritium discharge into the atmosphere (low levels) and into soils. Lagrangian dispersion models using real-time meteorological and topographic data provide a strong refinement, and advanced simulation tools are being developed in this direction. The tool integrates numerical model output records from the European Centre for Medium-Range Weather Forecasts (ECMWF) with a Lagrangian atmospheric dispersion model (FLEXPART). The results of the composite ECMWF/FLEXPART model can be coupled with assessment tools for the secondary-phase tritium dose pathways. Nominal operational tritium discharge reference values and source terms for selected incidental tritium forms in ITER-like plant systems have been assumed. The real-time daily data and mesh-refined records, together with the Lagrangian dispersion model approach, provide accurate results for doses to the population by inhalation or ingestion in the secondary phase.
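Once the dispersion model delivers time-integrated air concentrations, the inhalation-pathway dose follows a simple multiplicative chain. A sketch with illustrative parameter values (the breathing rate and the HTO dose coefficient used as defaults are typical reference-person values assumed here, not taken from the paper):

```python
def inhalation_dose_sv(conc_bq_m3, exposure_h,
                       breathing_rate_m3_h=1.2, dcf_sv_per_bq=1.8e-11):
    """Committed effective dose from inhaling tritiated water vapour (HTO).

    dose = air concentration * breathing rate * exposure time * dose coefficient
    Default breathing rate and dose coefficient are illustrative assumptions.
    """
    intake_bq = conc_bq_m3 * breathing_rate_m3_h * exposure_h
    return intake_bq * dcf_sv_per_bq
```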
Abstract:
A high-definition video quality metric built from full-reference ratios. Visual Quality Assessment (VQA) is one of the major unsolved challenges in the multimedia environment. Video quality has an enormous impact on the end user's (consumer's) perception of services based on the delivery of multimedia content, and it is therefore a key factor in the new paradigm known as Quality of Experience (QoE). Video quality measurement models can be grouped into several branches according to the technical basis underlying the measurement system; the most important are those that employ psychovisual models aimed at reproducing the characteristics of the Human Visual System (HVS), and those that instead take an engineering approach in which the quality computation is based on extracting and comparing intrinsic features of the image. Despite the advances achieved in this field in recent years, research on video quality metrics, whether with the reference available (so-called full-reference models), with part of it (reduced-reference models), or even without it (no-reference models), still has a long way to go and many goals to reach. Among these, the measurement of high-definition signals, especially the very-high-quality signals used in the early stages of the value chain, is of special interest because of its influence on the final quality of the service, and no reliable measurement models currently exist.
This doctoral thesis presents a full-reference quality measurement model, which we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), based on the weighting of four quality ratios computed from intrinsic features of the image: the Fidelity Ratio, computed by means of the morphological (Beucher) gradient; the Visual Similarity Ratio, computed from the visually significant points of the image through local contrast filtering; the Sharpness Ratio, derived from the extraction of the Haralick contrast texture statistic; and the Complexity Ratio, obtained from the homogeneity definition of the Haralick texture statistics set. The novelty of PARMENIA lies in its use of mathematical morphology and Haralick statistics as the basis of a quality metric, since those techniques have traditionally been tied to remote sensing and object segmentation. Moreover, the formulation of the metric as a weighted set of ratios is equally novel, since it draws both on structural similarity models and on more classical ones based on the perceptibility of the error produced by the compression-related degradation of the signal. PARMENIA shows a very high correlation with the MOS scores obtained from the subjective user tests carried out for its validation. The working corpus was selected from internationally validated sets of sequences, so that the reported results are of the highest possible quality and rigour.
The methodology followed consisted of generating a set of test sequences of different qualities by encoding with different quantisation steps; obtaining subjective ratings for them through subjective quality tests (based on International Telecommunication Union Recommendation BT.500); and validating the metric by computing the correlation of PARMENIA with these subjective scores, quantified through the Pearson correlation coefficient. Once the ratios had been validated, their influence on the final measure optimised, and their high correlation with perception confirmed, a second evaluation was carried out on sequences from the HDTV Test Dataset 1 of the Video Quality Experts Group (VQEG), with the results obtained showing its clear advantages. Abstract: Visual Quality Assessment has so far been one of the most intriguing challenges in the media environment. The progressive evolution towards higher resolutions together with increasing quality demands (e.g. high definition and better image quality) calls for redefined quality measurement models. Given the growing interest in multimedia service delivery, perceptual quality measurement has become a very active area of research. First, this work introduces a classification of objective video quality metrics based on their underlying methodologies and approaches to measuring video quality, summing up the state of the art. This doctoral thesis then describes an enhanced solution for full-reference objective quality measurement, based on mathematical morphology, texture features and visual similarity information, that provides a normalized metric that we have called PARMENIA (PArallel Ratios MEtric from iNtrInsic features Analysis), with a highly correlated MOS score.
The PARMENIA metric is based on the pooling of different quality ratios obtained from three different approaches: Beucher's gradient, local contrast filtering, and the contrast and homogeneity Haralick texture features. The metric's performance is excellent and improves on the current state of the art by providing a wide dynamic range that makes it easier to discriminate between coded sequences of very similar quality, especially at very high bit rates whose quality is currently transparent to quality metrics. PARMENIA introduces a degree of novelty with respect to other working metrics: on the one hand, it exploits structural information variation to build the metric's kernel, but complements the measure with texture information and a ratio of visually meaningful points that is closer to typical error-sensitivity-based approaches. We would like to point out that PARMENIA is the only metric built upon full-reference ratios and using mathematical morphology and texture features (typically used in segmentation) for quality assessment. On the other hand, it yields results with a wide dynamic range that allows measuring the quality of high-definition sequences from bit rates of hundreds of megabits per second (Mbps) down to typical distribution rates (5-6 Mbps), and even streaming rates (1-2 Mbps). Thus, a direct correlation between PARMENIA and MOS scores is easily constructed. PARMENIA may further enhance the number of available choices in objective quality measurement, especially for very-high-quality HD materials. All these results come from a validation carried out on internationally validated datasets, on which subjective tests based on the ITU-R BT.500 methodology were performed. The Pearson correlation coefficient has been calculated to verify the accuracy and reliability of PARMENIA.
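The building blocks named above, Haralick features from a gray-level co-occurrence matrix and a weighted pooling of ratios, can be sketched as follows (a simplified illustration, not the thesis's implementation; the pooling weights are placeholders):

```python
import numpy as np

def glcm(image, levels=8):
    """Gray-level co-occurrence matrix of horizontally adjacent pixels.

    image: 2D array with values in [0, 1], quantised to `levels` gray levels.
    """
    q = (np.asarray(image, float) * (levels - 1)).round().astype(int)
    m = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        m[a, b] += 1
    return m / m.sum()

def haralick_contrast(p):
    """Haralick contrast: large when co-occurring levels differ strongly."""
    i, j = np.indices(p.shape)
    return float(((i - j) ** 2 * p).sum())

def haralick_homogeneity(p):
    """Haralick homogeneity: 1.0 for a perfectly uniform image."""
    i, j = np.indices(p.shape)
    return float((p / (1.0 + (i - j) ** 2)).sum())

def pooled_score(ratios, weights=(0.25, 0.25, 0.25, 0.25)):
    """Weighted pooling of the four quality ratios (weights are placeholders)."""
    return sum(w * r for w, r in zip(weights, ratios))
```

In a PARMENIA-style metric, each ratio would compare such features between the reference and the degraded sequence before pooling.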
Abstract:
Analysis of river flow using hydraulic modelling, and its implications for derived environmental applications, are inextricably connected with the way in which the river boundary shape is represented. This relationship is scale-dependent upon the modelling resolution, which in turn determines the importance of the subscale performance of the model and the way subscale (surface and flow) processes are parameterised. Commonly, the subscale behaviour of the model relies upon a roughness parameterisation whose meaning depends on the dimensionality of the hydraulic model and the resolution of the topographic representation scale. The latter is, in turn, dependent on the resolution of the computational mesh as well as on the detail of the measured topographic data. Flow results are affected by these interactions between scale and subscale parameterisation according to the dimensionality approach. The aim of this dissertation is the evaluation of these interactions upon hydraulic modelling results. The current availability of high-resolution topographic sources motivates this research, which is tackled using a suitable roughness approach for each dimensionality with the purpose of assessing the interactions. A 1D HEC-RAS model, a 2D raster-based diffusion-wave model with a scale-dependent distributed roughness parameterisation, and a 3D finite volume scheme with a porosity algorithm approach to incorporate complex topography have been used. Different topographic sources are assessed using the 1D scheme. LiDAR data are used to isolate the effects of mesh resolution from those of the topographic content of the DEM upon 2D and 3D flow results. A distributed roughness parameterisation, using a roughness height approach dependent upon both mesh resolution and topographic content, is developed and evaluated for the 2D scheme. Grain-size data and fractal methods are used for the reconstruction of topography with microscale information, required for some applications but not easily available.
Sensitivity of hydraulic parameters to this topographic parameterisation is evaluated in a 3D scheme at different mesh resolutions. Finally, the structural variability of the simulated flow is analysed and related to scale interactions. Model simulations demonstrate (i) the importance of the topographic source in 1D models; (ii) that the mesh resolution approach is dominant in 2D and 3D simulations, whereas in a 1D model the topographic source and even the roughness parameterisation have a more critical impact; (iii) increased sensitivity to roughness parameterisation in 1D and 2D schemes with detailed topographic sources and finer mesh resolutions; and (iv) that topographic content and microtopography affect the whole vertical profile of computed 3D velocity in a depth-dependent way, whereas 2D results are not affected by topographic content variations. Finally, the spatial analysis shows that mesh resolution controls high-resolution model-scale results; that roughness parameterisation controls 2D simulation results for a constant mesh resolution; and that topographic content and microtopography variations affect the organisation of flow results depth-dependently in a 3D scheme. Abstract: Topography plays a fundamental role in the distribution of water and energy in natural landscapes (Beven and Kirkby 1979; Wood et al. 1997). Hydraulic simulation combined with remote-sensing terrain measurement methods constitutes a powerful research tool for understanding the behaviour of water flows owing to the variability of the surface over which they flow. The representation of topography and its incorporation into the hydraulic scheme are of crucial importance for the results and determine the development of their applications in the environmental field. Any simulation is a simplification of a real-world process, and therefore the degree of simplification will determine the meaning of the simulated results.
This reasoning is particularly difficult to transfer to hydraulic simulation, where scale aspects as different as the scale of the flow processes and the scale of the boundary representation are considered jointly, even in parameterisation stages (e.g. roughness parameterisation). On the one hand, this is because scale decisions condition one another (e.g. the dimensionality of the model conditions the scale of the boundary representation) and their results therefore interact closely. On the other hand, it is due to the high numerical and computational requirements of an explicit high-resolution representation of the flow processes and of the mesh discretisation. Moreover, prior to hydraulic modelling, the terrain surface over which the water flows must itself be modelled, and it therefore has its own representation scale, which in turn depends on the scale of the measured topographic data from which the model is built. Ultimately, it is this topography that determines the spatial behaviour of the flow. Therefore, the scale of the topography in its measurement and modelling stages (data resolution and topographic representation), prior to its incorporation into the hydraulic model, will in turn produce an impact that adds to the overall impact resulting from the computational scale of the hydraulic model and its dimensionality. Understanding the interactions between complex boundary geometries and the flow structure using hydraulic modelling depends on the scales considered in the simplification of the hydraulic and terrain processes (model dimensionality, computational scale size and topographic data scale). The nature of the application of the hydraulic model (e.g.
habitat físico, análisis de riesgo de inundaciones, transporte de sedimentos) determina en primer lugar la escala del estudio y por tanto el detalle de los procesos a simular en el modelo (i.e. la dimensionalidad) y, en consecuencia, la escala computacional a la que se realizarán los cálculos (i.e. resolución computacional). Esta última a su vez determina, el detalle geográfico con que deberá representarse el contorno acorde con la resolución de la malla computacional. La parametrización persigue incorporar en el modelo hidráulico la cuantificación de los procesos y condiciones físicas del sistema natural y por tanto debe incluir no solo aquellos procesos que tienen lugar a la escala de modelización, sino también aquellos que tienen lugar a un nivel subescalar y que deben ser definidos mediante relaciones de escalado con las variables modeladas explícitamente. Dicha parametrización se implementa en la práctica mediante la provisión de datos al modelo, por tanto la escala de los datos geográficos utilizados para parametrizar el modelo no sólo influirá en los resultados, sino también determinará la importancia del comportamiento subescalar del modelo y el modo en que estos procesos deban ser parametrizados (e.g. la variabilidad natural del terreno dentro de la celda de discretización o el flujo en las direcciones laterales y verticales en un modelo unidimensional). En esta tesis, se han utilizado el modelo unidimensional HEC-RAS, (HEC 1998b), un modelo ráster bidimensional de propagación de onda, (Yu 2005) y un esquema tridimensional de volúmenes finitos con un algoritmo de porosidad para incorporar la topografía, (Lane et al. 2004; Hardy et al. 2005). La geometría del contorno viene definida por la escala de representación topográfica (resolución de malla y contenido topográfico), la cual a su vez depende de la escala de la fuente cartográfica. Todos estos factores de escala interaccionan en la respuesta del modelo hidráulico a la topografía. 
En los últimos años, métodos como el análisis fractal y las técnicas geoestadísticas utilizadas para representar y analizar elementos geográficos (e.g. en la caracterización de superficies (Herzfeld and Overbeck 1999; Butler et al. 2001)), están promoviendo nuevos enfoques en la cuantificación de los efectos de escala (Lam et al. 2004; Atkinson and Tate 2000; Lam et al. 2006) por medio del análisis de la estructura espacial de la variable (e.g. Bishop et al. 2006; Ju et al. 2005; Myint et al. 2004; Weng 2002; Bian and Xie 2004; Southworth et al. 2006; Pozd-nyakova et al. 2005; Kyriakidis and Goodchild 2006). Estos métodos cuantifican tanto el rango de valores de la variable presentes a diferentes escalas como la homogeneidad o heterogeneidad de la variable espacialmente distribuida (Lam et al. 2004). En esta tesis, estas técnicas se han utilizado para analizar el impacto de la topografía sobre la estructura de los resultados hidráulicos simulados. Los datos de teledetección de alta resolución y técnicas GIS también están siendo utilizados para la mejor compresión de los efectos de escala en modelos medioambientales (Marceau 1999; Skidmore 2002; Goodchild 2003) y se utilizan en esta tesis. Esta tesis como corpus de investigación aborda las interacciones de esas escalas en la modelización hidráulica desde un punto de vista global e interrelacionado. Sin embargo, la estructura y el foco principal de los experimentos están relacionados con las nociones espaciales de la escala de representación en relación con una visión global de las interacciones entre escalas. En teoría, la representación topográfica debe caracterizar la superficie sobre la que corre el agua a una adecuada (conforme a la finalidad y dimensión del modelo) escala de discretización, de modo que refleje los procesos de interés. 
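A minimal sketch of the empirical semivariogram, the basic geostatistical tool referred to above, applied to a 1-D transect of simulated values (e.g. water depths along a cross-section). Variable names and the sample data are illustrative assumptions, not results from the thesis:

```python
# Empirical semivariogram for a regularly spaced 1-D transect:
# gamma(h) = 0.5 * mean[(z(i+h) - z(i))^2] over all pairs at lag h.
# Larger gamma at short lags indicates more small-scale heterogeneity
# in the spatially distributed variable.

def semivariogram(values, max_lag):
    """Return [gamma(1), ..., gamma(max_lag)] for a 1-D series."""
    gamma = []
    for h in range(1, max_lag + 1):
        diffs = [(values[i + h] - values[i]) ** 2
                 for i in range(len(values) - h)]
        gamma.append(0.5 * sum(diffs) / len(diffs))
    return gamma

# Hypothetical simulated depths (m) along a transect:
depths = [0.0, 0.1, 0.3, 0.2, 0.5, 0.4, 0.6]
print(semivariogram(depths, 3))
```

Comparing such curves for model runs at different mesh resolutions or roughness parameterisations is one simple way to quantify how the spatial structure of the simulated flow responds to scale choices.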
Roughness parameterisation must reflect the effects of surface variability at scales finer than those explicitly represented in the topographic mesh (i.e. the discretisation scale). Clearly, both concepts are physically related by a
Resumo:
The Direct Boundary Element Method (DBEM) is presented to solve the elastodynamic field equations in 2D, and a complete implementation is given. The DBEM is a useful approach for obtaining reliable numerical estimates of site effects on seismic ground motion due to irregular geological configurations, both of layering and topography. The method is based on the discretization of the classical Somigliana elastodynamic representation equation, which stems from the reciprocity theorem. This equation is given in terms of the Green's function, which is the full-space harmonic steady-state fundamental solution. The formulation permits the treatment of viscoelastic media, so site models with intrinsic attenuation can be examined. By means of this approach, the 2D scattering of seismic waves due to the incidence of P and SV waves on irregular topographical profiles is calculated. Sites such as canyons, mountains and valleys in irregular multilayered media are computed to test the technique. The obtained transfer functions show excellent agreement with previously published results.
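The representation equation mentioned above has, in its standard frequency-domain form, the following structure (the notation here follows the usual boundary-element convention and is not necessarily the authors' own):

```latex
c_{ij}(\boldsymbol{\xi})\, u_j(\boldsymbol{\xi})
  = \int_{\Gamma} U_{ij}(\mathbf{x},\boldsymbol{\xi};\omega)\, t_j(\mathbf{x})\, d\Gamma(\mathbf{x})
  - \int_{\Gamma} T_{ij}(\mathbf{x},\boldsymbol{\xi};\omega)\, u_j(\mathbf{x})\, d\Gamma(\mathbf{x})
```

Here \(u_j\) and \(t_j\) are the boundary displacements and tractions, \(U_{ij}\) and \(T_{ij}\) are the displacement and traction full-space harmonic fundamental solutions (Green's functions) at frequency \(\omega\), and \(c_{ij}\) is the free term arising from the boundary limit (\(\tfrac{1}{2}\delta_{ij}\) at a smooth boundary point). Discretising \(\Gamma\) into elements turns this integral identity into a linear system relating nodal displacements and tractions; intrinsic attenuation enters by using complex-valued wave velocities in the fundamental solution.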
Resumo:
The influence of atmospheric gases and tropospheric phenomena becomes more relevant at frequencies within the THz band (100 GHz to 10 THz), severely affecting the propagation conditions. The use of radiosoundings in propagation studies is a well-established measurement technique for collecting information about the vertical structure of the atmosphere, from which gaseous and cloud attenuation can be estimated with the use of propagation models. However, some of these prediction models are not suitable for use under rainy conditions. In the present study, a method to identify the presence of rainy conditions during radiosoundings is introduced, with the aim of filtering out these events from yearly statistics of predicted atmospheric attenuation. The detection procedure is based on the analysis of a set of parameters, some of them extracted from synoptical observations of weather (SYNOP reports) and others derived from radiosonde observations (RAOBs). The performance of the method has been evaluated under different climatic conditions, corresponding to three locations in Spain where co-located rain gauge data were available. Rain events detected by the method have been compared with those precipitations identified by the rain gauge. The pertinence of the method is discussed on the basis of an analysis of cumulative distributions of total attenuation at 100 and 300 GHz. This study demonstrates that the proposed method can be useful to identify events probably associated with rainy conditions. Hence, it can be considered a suitable algorithm for filtering out this kind of event from annual attenuation statistics.
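A hedged sketch of the kind of filtering the abstract describes. The SYNOP present-weather codes 50–99 do cover precipitation phenomena, but the humidity proxy, its threshold, and the field names are illustrative assumptions, not the paper's actual parameter set:

```python
# Flag radiosoundings as "rainy" from co-located SYNOP present-weather
# codes (ww = 50..99 denote precipitation) plus an assumed RAOB-derived
# proxy (near-saturation in the lower troposphere), then drop flagged
# profiles before computing attenuation statistics.

def is_rainy(ww_code, mean_rh_below_3km):
    """True if the sounding is likely affected by precipitation."""
    synop_precip = 50 <= ww_code <= 99          # SYNOP precipitation codes
    near_saturated = mean_rh_below_3km >= 95.0  # illustrative RAOB threshold (%)
    return synop_precip or near_saturated

def filter_soundings(soundings):
    """Keep only soundings not flagged as rainy."""
    return [s for s in soundings if not is_rainy(s["ww"], s["rh_low"])]

soundings = [
    {"ww": 61, "rh_low": 97.0},  # light rain reported -> filtered out
    {"ww": 2,  "rh_low": 60.0},  # fair weather        -> kept
]
print(len(filter_soundings(soundings)))  # prints 1
```

Yearly attenuation statistics would then be computed only over the retained profiles, which is the filtering role the abstract assigns to the method.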
Resumo:
One of the key scrutiny issues of the coming energy era will be the environmental impact of fusion facilities managing one kg of tritium. The potential change of committed dose regulatory limits, together with the implementation of nuclear design principles (As Low As Reasonably Achievable - ALARA -, Defense in Depth - D-i-D -) for fusion facilities, could strongly impact the cost of deployment of coming fusion technology. Accurate modeling of environmental tritium transport forms (HT, HTO) for the assessment of the dosimetric impact of a fusion facility in the accidental case is therefore of major interest. This paper considers different short-term releases of tritium forms (HT and HTO) to the atmosphere from a potential fusion reactor located in the Mediterranean Basin. This work models in detail the dispersion of tritium forms and the dosimetric impact of selected environmental patterns, both inland and at sea, using real topography and forecast meteorological data fields (ECMWF/FLEXPART). We explore specific values of the HTO/HT ratio at different levels and examine the influence of meteorological conditions on the HTO behavior over 24 hours. For this purpose we have used a coupled Lagrangian ECMWF/FLEXPART model that follows real-time releases of tritium at 10, 30 and 60 meters, together with hourly observations of wind (and in some cases precipitation), to provide a short-range approximation of tritium cloud behavior. We have assessed inhalation doses, as well as HTO/HT ratios, in a representative set of cases during winter 2010 and spring 2011 for the three air levels.
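The inhalation-dose step reduces to multiplying a time-integrated air concentration by a breathing rate and a dose coefficient. The sketch below shows this arithmetic with ICRP-style order-of-magnitude coefficients for an adult member of the public; all numerical values are assumptions for illustration, not the paper's figures (and HTO skin absorption, which adds to the inhalation pathway, is omitted):

```python
# Committed inhalation dose from a time-integrated tritium-in-air
# concentration, as produced by a Lagrangian dispersion run:
#   dose (Sv) = C (Bq.h/m^3) * breathing rate (m^3/h) * DCF (Sv/Bq)

BREATHING_RATE = 1.2       # m^3/h, adult light activity (assumed)
DCF = {
    "HTO": 1.8e-11,        # Sv/Bq inhalation, ICRP-type value (assumed)
    "HT":  1.8e-15,        # Sv/Bq, HT is far less radiotoxic than HTO
}

def inhalation_dose(c_integrated, species):
    """Committed dose (Sv) for a time-integrated concentration (Bq.h/m^3)."""
    return c_integrated * BREATHING_RATE * DCF[species]

c = 1.0e6  # Bq.h/m^3, hypothetical plume exposure at a receptor
print(inhalation_dose(c, "HTO"))                               # Sv
print(inhalation_dose(c, "HTO") / inhalation_dose(c, "HT"))    # ~1e4
```

With these coefficients the same integrated concentration gives a dose about four orders of magnitude higher for HTO than for HT, which is why the HTO/HT ratio of the dispersed cloud matters for the accident assessment.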