892 results for "Towards Seamless Integration of Geoscience Models and Data"
Abstract:
Ria de Aveiro is a very complex shallow-water coastal lagoon located in the northwest of Portugal. Important questions would be left unanswered without a good understanding of the hydrodynamic and transport processes occurring in the lagoon. The calibration and validation of hydrodynamic, salt and heat transport models for the Ria de Aveiro lagoon are presented. The hydrodynamic model was calibrated by adjusting the bottom friction coefficient, through the comparison of measured and predicted time series of sea surface elevation (SSE) at 22 stations. Harmonic analysis was performed to evaluate the model's accuracy. To validate the hydrodynamic model, measured and predicted SSE values were compared at 11 stations, as well as main flow direction velocities at 10 stations. The salt and heat transport models were calibrated by comparing measured and predicted time series of salinity and water temperature at 7 stations, and the RMS of the difference between the series was determined. These models were validated by comparing the model results with an independent field data set. The hydrodynamic and the salt and heat transport models for the Ria de Aveiro were successfully calibrated and validated. They accurately reproduce the barotropic flows and can therefore adequately represent the salt and heat transport and the heat transfer processes occurring in the Ria de Aveiro.
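The calibration workflow above (harmonic fit of SSE series plus the RMS of measured–predicted differences) can be sketched as follows. This is a minimal illustration only: the constituent periods, noise level and record length are assumed values, not data from the study.

```python
import numpy as np

def fit_tidal_constituents(t, sse, periods_h):
    """Least-squares fit of sinusoidal tidal constituents to a
    sea-surface-elevation (SSE) series. Times t are in hours."""
    cols = [np.ones_like(t)]
    for T in periods_h:
        w = 2 * np.pi / T
        cols.append(np.cos(w * t))
        cols.append(np.sin(w * t))
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, sse, rcond=None)
    return coef, A @ coef           # constituent coefficients, fitted series

def rms_difference(measured, predicted):
    """RMS of the difference between measured and predicted series."""
    return float(np.sqrt(np.mean((measured - predicted) ** 2)))

# Synthetic example with M2 (12.42 h) and S2 (12.00 h) constituents.
t = np.arange(0, 30 * 24, 0.5)       # 30 days at half-hour resolution
signal = 1.0 * np.cos(2 * np.pi * t / 12.42) + 0.3 * np.sin(2 * np.pi * t / 12.0)
measured = signal + 0.05 * np.random.default_rng(0).normal(size=t.size)

coef, fitted = fit_tidal_constituents(t, measured, [12.42, 12.0])
print(rms_difference(measured, fitted))   # residual near the 0.05 noise level
```

A 30-day record is long enough to separate M2 from S2 (the Rayleigh criterion requires roughly 15 days for this pair).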
Abstract:
This thesis describes a framework, grounded in the multi-layer paradigm, for analysing, modelling, designing and optimizing communication systems. It explores a new perspective on the physical layer that emerges from the relationships between information theory, estimation, probabilistic methods, communication theory and coding. This framework leads to design methods for the next generation of high-rate communication systems. In addition, the thesis explores several access-layer techniques, based on the delay-throughput relationship, for the design of delay-tolerant wireless networks. Fundamental results on the interplay between information theory and estimation theory lead to the proposal of an alternative paradigm for the analysis, design and optimization of communication systems. Building on studies of the relationship between mutual information and MMSE, the approach described in the thesis overcomes, in a novel way, the difficulties inherent in optimizing reliable information transmission rates in communication systems, and enables the exploration of optimal power allocation and optimal precoding structures for different channel models: wired, wireless and optical. The thesis also addresses the problem of delay, in an attempt to answer questions raised by the enormous demand for high rates in communication systems. This is done by proposing new models for systems with network coding at layers above the physical layer. In particular, it addresses the use of network coding over time-varying, delay-sensitive channels. This was demonstrated through a new model and adaptive scheme whose algorithms were applied to wireless systems with complex fading, of which satellite communication systems are an example.
The thesis further addresses the use of network coding in demanding handover scenarios. This is done by proposing new IEEE 802.11 WiFi MAC transmission models, which are compared with network coding and shown to enable seamless handover. It can thus be said that this thesis, through analytical work and proposals supported by simulations, argues that the design of communication systems should consider transmission and coding strategies that are not only close to channel capacity but also delay tolerant, and that such strategies must be designed with the channel characteristics and the physical layer in view.
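The mutual-information/MMSE relationship the thesis builds on can be checked numerically for the scalar Gaussian channel, where (in nats) I(snr) = ½ln(1+snr) and the MMSE of estimating a standard Gaussian input is 1/(1+snr), so the Guo–Shamai–Verdú identity dI/dsnr = MMSE/2 holds exactly. A minimal numerical sketch:

```python
import numpy as np

# I-MMSE relation on the scalar Gaussian channel y = sqrt(snr)*x + n,
# with x, n ~ N(0, 1): I(snr) = 0.5*ln(1 + snr) nats, mmse(snr) = 1/(1 + snr).
snr = np.linspace(0.1, 10, 200)
I = 0.5 * np.log1p(snr)
mmse = 1.0 / (1.0 + snr)

# The numerical derivative of I should match mmse/2 (Guo-Shamai-Verdu).
dI = np.gradient(I, snr)
print(np.max(np.abs(dI[1:-1] - mmse[1:-1] / 2)))  # small interior error
```

The agreement is limited only by the finite-difference step; the identity itself is exact.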
Abstract:
The development of mining activities over thousands of years in the region of Aljustrel is nowadays visible as a vast area of ore tailings, slag and host rocks of sulphide mineralization. The generation of acidic waters by the alteration of pyritic minerals (Acid Mine Drainage, AMD) causes a significant impact on the river system both south of the village (Ribª. Água Forte) and north of it (Ribª. Água Azeda and Barranco do Farrobo), which is reflected in extremely low pH values (< 3) and high concentrations of As, Cd, Cu, Fe, Mn, Pb, Zn and sulphates. This study aimed to assess the extent of the environmental impacts, integrating geochemical (surface waters and stream sediments) and biological (diatoms) parameters. Three groups of sites were defined based on sediment and water analyses, whose integration with the diatom data yielded the same groupings: Group 1 - impacted, with acidic pH (1.9-5.1), high metal contents (0.4-1975 mg L-1) and Fe-Mg-sulphate waters, with metals more bioavailable in the water in cationic form (Me2+); mineralogically, the sediments were characterized by phyllosilicate and sulphate/oxy-hydroxysulphate phases, easily solubilized and retaining a high amount of metals when precipitated; the dominant taxon was Pinnularia aljustrelica (a new species). Group 2 - slightly impacted, weakly acid to neutral pH (5.0-6.8), lower metal contents (0.2-25 mg L-1) and Fe-Mg-sulphate to Mg-chloride waters; the dominant taxa were Brachysira neglectissima and Achnanthidium minutissimum. Group 3 - unimpacted, alkaline pH (7.0-8.4), low metal contents (0-7 mg L-1) and Mg-chloride waters; in this group, metals were associated with primary phases (e.g. sulphides) and not so easily available, and the high chloride contents explained the presence of taxa typical of brackish/marine waters (e.g. Entomoneis paludosa). Taxonomical aspects of the diatoms were studied (discovery of a new species, Pinnularia aljustrelica Luis, Almeida et Ector sp. nov.), as well as morphometric aspects (a size decrease of diatom valves and the appearance of deformed valves of Eunotia exigua in Group 1 and of A. minutissimum in Group 2) and physiological aspects (PAM fluorometry proved effective for assessing the effects of metals/acidity on photosynthetic efficiency). A study was carried out in an artificial river system (microcosm) that aimed to mimic Aljustrel's extreme conditions under controlled laboratory conditions. The chronic effects of Fe, SO42- and acidity on field biofilms inoculated into the artificial rivers were evaluated, as well as their contribution to the communities' tolerance to metal toxicity, through acute tests with two metals (Cu and Zn). In general, the effects caused by low pH values and high concentrations of Fe and SO42- were reflected at the community level by a decrease in diversity, the predominance of acidophilic species, a decrease in photosynthetic efficiency and an increase in enzymatic (e.g. catalase, superoxide dismutase) and non-enzymatic activities (e.g. total glutathione and total phytochelatins). However, acidity was found to have a protective effect on the communities upon Cu and Zn addition. A comparative study between the Aljustrel mining area and the New Brunswick mining area was carried out; the two areas have similar mining and geological conditions, reflected in similar diatom communities at both mines, but lie in very different geographic and climatic regions.
Abstract:
Building secure systems is difficult for many reasons. This paper deals with two of the main challenges: (i) the lack of security expertise in development teams, and (ii) the inadequacy of existing methodologies to support developers who are not security experts. The security standard ISO/IEC 15408 (Common Criteria) together with secure design techniques such as UMLsec can provide the security expertise, knowledge, and guidelines that are needed. However, security expertise and guidelines are not stated explicitly in the Common Criteria. They are rather phrased in security domain terminology and are difficult for developers to understand. This means that some general security and secure design expertise is required to take full advantage of the Common Criteria and UMLsec. In addition, there is the problem of tracing security requirements and objectives into the solution design, which is needed as proof of requirements fulfilment. This paper describes a security requirements engineering methodology called SecReq. SecReq combines three techniques: the Common Criteria, the heuristic requirements editor HeRA, and UMLsec. SecReq makes systematic use of the security engineering knowledge contained in the Common Criteria and UMLsec, as well as security-related heuristics in the HeRA tool. The integrated SecReq method supports early detection of security-related issues (HeRA), their systematic refinement guided by the Common Criteria, and the ability to trace security requirements into UML design models. A feedback loop helps reuse experience within SecReq and turns the approach into an iterative process for the secure system life-cycle, also in the presence of system evolution.
Abstract:
A real-time parameter estimator for the discrete-time dynamic climate models of a greenhouse located in the north of Portugal is presented. The experiments showed that the second-order models identified for air temperature and humidity achieve close agreement between simulated and experimental data.
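A minimal sketch of how a second-order discrete-time model can be identified by least squares. The ARX structure, parameter values and synthetic input below are assumptions for illustration, not the paper's actual greenhouse model.

```python
import numpy as np

def fit_second_order(y, u):
    """Least-squares fit of a hypothetical second-order discrete-time model
    y[k] = a1*y[k-1] + a2*y[k-2] + b*u[k-1]."""
    Y = y[2:]
    Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])  # regressor matrix
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return theta                                       # [a1, a2, b]

# Generate synthetic data from known parameters, then recover them.
rng = np.random.default_rng(1)
u = rng.normal(size=500)                # excitation input
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 1.2 * y[k-1] - 0.35 * y[k-2] + 0.5 * u[k-1] + 0.01 * rng.normal()

theta = fit_second_order(y, u)
print(theta)   # close to [1.2, -0.35, 0.5]
```

A real-time version would use recursive least squares rather than the batch solve shown here.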
Abstract:
Thesis (Ph.D.)--University of Washington, 2013
Abstract:
Transdermal biotechnologies are an ever-increasing field of interest, due to the medical and pharmaceutical applications that they underlie. Several mathematical models are in use that permit a more comprehensive view of raw experimental data and even allow practical extrapolation for new dermal diffusion methodologies. However, they rest on a complex variety of theories and assumptions that restrict their use to specific situations. Models based on Fick's first law find better use in contexts where scaled particle theory models would be too extensive in time span, but the reverse is also true as the context of transdermal diffusion of particular active compounds changes. This article extensively reviews the various theoretical methodologies for studying diffusion across the rate-limiting dermal barrier, the stratum corneum, and systematizes their characteristics, proper contexts of application, advantages and limitations, as well as future perspectives.
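For the Fick's-first-law family of models mentioned above, the steady-state flux across the stratum corneum is commonly written J = Kp·ΔC with permeability coefficient Kp = D·K/h. A small sketch with illustrative values (not measured data for any specific compound):

```python
# Steady-state Fick's-first-law sketch for permeation across the
# stratum corneum. All numbers are illustrative, not measured values.
def steady_state_flux(D_cm2_s, K_partition, h_cm, dC_mg_cm3):
    """Flux J = (D * K / h) * dC, where Kp = D*K/h is the permeability
    coefficient (cm/s): D diffusivity, K partition coefficient,
    h membrane thickness, dC concentration difference."""
    Kp = D_cm2_s * K_partition / h_cm   # cm/s
    return Kp * dC_mg_cm3               # mg cm^-2 s^-1

# Example: D = 1e-9 cm^2/s, K = 10, h = 15 um (1.5e-3 cm), dC = 1 mg/cm^3.
J = steady_state_flux(1e-9, 10.0, 15e-4, 1.0)
print(J)   # about 6.7e-6 mg cm^-2 s^-1
```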
Abstract:
The increasing and intensive integration of distributed energy resources into distribution systems requires adequate methodologies to ensure secure operation according to the smart grid paradigm. In this context, SCADA (Supervisory Control and Data Acquisition) systems are an essential infrastructure. This paper presents the conceptual design of a communication and resource management scheme based on an intelligent SCADA with a decentralized, flexible, and intelligent approach, adaptive to the context (context awareness). The methodology is used to support energy resource management, considering all the involved costs, power flows, and electricity prices, and leading to network reconfiguration. The methodology also addresses the definition of each player's information access permissions for each resource. The paper includes a case study on a 33-bus network that considers intensive use of distributed energy resources in five distinct implemented operation contexts.
Abstract:
The interest in using information to improve the quality of living in large urban areas and the efficiency of their governance has been around for decades. Nevertheless, improvements in Information and Communications Technology have sparked a new dynamic in academic research, usually under the umbrella term of Smart Cities. This concept of Smart City can probably be translated, in a simplified version, into cities that are lived, managed and developed in an information-saturated environment. While it makes perfect sense and we can easily foresee the benefits of such a concept, there are still several significant challenges that need to be tackled before we can materialize this vision. In this work we aim to provide a small contribution in this direction, one that maximizes the relevance of the available information resources. One of the most detailed and geographically relevant information resources available for the study of cities is the census, more specifically the data available at block level (Subsecção Estatística). In this work, we use Self-Organizing Maps (SOM) and the Geo-SOM variant to explore the block-level data from the Portuguese census of Lisbon city for the years 2001 and 2011. We focus on gauging change, proposing ways to compare the two time periods, which have two different underlying geographical bases. We proceed with the analysis of the data using different SOM variants, aiming to produce a two-fold portrait: one of the evolution of Lisbon during the first decade of the 21st century, and another of how the census dataset and SOMs can be used to produce an informational framework for the study of cities.
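A from-scratch sketch of the basic SOM training rule used in such analyses (Geo-SOM adds a geographic constraint on the best-matching-unit search, which is omitted here). The grid size, learning-rate schedule and toy two-cluster data are assumptions standing in for the census block variables.

```python
import numpy as np

def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal Self-Organizing Map: stochastic training with a
    Gaussian neighbourhood that shrinks over time."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    coords = np.dstack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"))
    for t in range(iters):
        x = data[rng.integers(len(data))]
        frac = t / iters
        lr = lr0 * (1 - frac)                      # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5          # shrinking neighbourhood
        d = np.linalg.norm(weights - x, axis=2)
        bmu = np.unravel_index(np.argmin(d), d.shape)   # best-matching unit
        g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2)
                   / (2 * sigma ** 2))
        weights += lr * g[..., None] * (x - weights)
    return weights

# Toy "census" table: two clusters of blocks with 4 variables each.
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (100, 4)), rng.normal(3, 0.3, (100, 4))])
W = train_som(data)
```

After training, nearby units on the grid represent similar blocks, which is what makes the map useful for comparing the 2001 and 2011 snapshots.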
Abstract:
Summary:
1. Measuring health literacy in Switzerland: a review of six surveys
   1.1 Comparison of questionnaires
   1.2 Measures of health literacy in Switzerland
   1.3 Discussion of Swiss data on HL
   1.4 Description of the six surveys:
       1.4.1 Current health trends and health literacy in the Swiss population (gfs-UNIVOX)
       1.4.2 Nutrition, physical exercise and body weight: opinions and perceptions of the Swiss population (USI)
       1.4.3 Health Literacy in Switzerland (ISPMZ)
       1.4.4 Swiss Health Survey (SHS)
       1.4.5 Survey of Health, Ageing and Retirement in Europe (SHARE)
       1.4.6 Adult literacy and life skills survey (ALL)
2. Economic costs of low health literacy in Switzerland: a rough calculation
Appendix: Screenshots of the cost model
Abstract:
The adult hippocampus generates functional dentate granule cells (GCs) that release glutamate onto target cells in the hilus and cornu ammonis (CA)3 region, and receive glutamatergic and γ-aminobutyric acid (GABA)ergic inputs that tightly control their spiking activity. The slow and sequential development of their excitatory and inhibitory inputs makes them particularly relevant for information processing. Although they are still immature, new neurons are recruited by afferent activity and display increased excitability, enhanced activity-dependent plasticity of their input and output connections, and a high rate of synaptogenesis. Once fully mature, new GCs show all the hallmarks of neurons generated during development. In this review, we focus on how developing neurons remodel the adult dentate gyrus and discuss key aspects that illustrate the potential of neurogenesis as a mechanism for circuit plasticity and function.
Abstract:
The research presented is a qualitative case study of educators' experiences in integrating living skills in the context of health and physical education (HPE). Using semi-structured interviews, the study investigated HPE educators' experiences and revealed their insights relative to three major themes: professional practice, challenges, and support systems. Professional practice experiences detailed the use of progressive lesson planning, reflective and engaging activities, explicit student-centered pedagogy, and holistic teaching philosophies. Furthermore, limited knowledge and awareness of living skills, conflicting teaching philosophies, competitive environments between subject areas, and lack of time and accessibility were the four major challenges that emerged from the data. Major supportive roles for HPE educators in the integration process included other educators, consultants, school administration, public health, parents, community programs and professional organizations. The study provides valuable discussion and suggestions for improving pedagogical practices in teaching living skills in the HPE setting.
Abstract:
My thesis consists of three chapters related to the estimation of state-space and stochastic volatility models. In the first chapter, we develop a computationally efficient state-smoothing procedure for linear Gaussian state-space models. We show how to exploit the particular structure of state-space models to draw the latent states efficiently. We analyse the computational efficiency of methods based on the Kalman filter, the Cholesky factor algorithm, and our new method, using operation counts and computational experiments. We show that for many important cases our method is more efficient. The gains are particularly large when the dimension of the observed variables is large or when repeated draws of the states are needed for the same parameter values. As an application, we consider a multivariate Poisson model with time-varying intensities, used to analyse count data on financial market transactions. In the second chapter, we propose a new technique for analysing multivariate stochastic volatility models. The proposed method is based on efficiently drawing the volatility from its conditional density given the parameters and the data. Our methodology applies to models with several types of cross-sectional dependence. We can model time-varying conditional correlation matrices by incorporating factors into the returns equation, where the factors are independent stochastic volatility processes. We can incorporate copulas to allow conditional dependence of the returns given the volatility, admitting different Student-t marginals with asset-specific degrees of freedom to capture the heterogeneity of returns.
We draw the volatility as a block in the time dimension and one series at a time in the cross-sectional dimension. We apply the method introduced by McCausland (2012) to obtain a good approximation of the conditional posterior distribution of one return's volatility given the volatilities of the other returns, the parameters, and the dynamic correlations. The model is evaluated using real data for ten exchange rates. We report results for univariate stochastic volatility models and for two multivariate models. In the third chapter, we assess the information contributed by realized volatility to the estimation and forecasting of volatility when prices are measured with and without error. We use stochastic volatility models. We take the viewpoint of an investor for whom volatility is an unknown latent variable and realized volatility is a sample quantity that contains information about it. We use Bayesian Markov chain Monte Carlo methods to estimate the models, which allow the formulation not only of posterior densities of the volatility but also of predictive densities of future volatility. We compare volatility forecasts, and the hit rates of forecasts, that do and do not use the information contained in realized volatility. This approach differs from those in the empirical literature, which are mostly limited to documenting the ability of realized volatility to forecast itself. We present empirical applications using daily returns on indices and exchange rates. The competing models are applied to the second half of 2008, a notable period in the recent financial crisis.
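The third chapter's premise — realized volatility as a sample quantity that carries information about the latent volatility — can be illustrated by simulating a basic univariate stochastic volatility model. All parameter values below are illustrative, not estimates from the thesis.

```python
import numpy as np

# Sketch of a canonical stochastic-volatility model: latent log-variance
# h_t follows a stationary AR(1); "realized variance" computed from
# intraday returns is a noisy but informative measure of exp(h_t).
rng = np.random.default_rng(0)
T, M = 500, 78                        # days, intraday intervals per day
mu, phi, s_eta = -9.0, 0.97, 0.15     # illustrative AR(1) parameters
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t-1] - mu) + s_eta * rng.normal()

# Intraday returns with per-day variance exp(h_t), split over M intervals.
intraday = rng.normal(size=(T, M)) * np.sqrt(np.exp(h)[:, None] / M)
rv = (intraday ** 2).sum(axis=1)      # realized variance per day

corr = np.corrcoef(np.exp(h), rv)[0, 1]
print(corr)   # high: realized variance tracks the latent variance
```

The correlation is high but below one, which is exactly why treating realized volatility as an informative measurement (rather than the truth) matters for forecasting.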
Abstract:
Few studies have assessed the characteristics of parks that may encourage physical activity specifically among youth. This study aims to estimate the reliability of a youth-oriented park observation tool, to identify the conceptual domains of parks captured by this tool using an operationalization of the conceptual model of parks and physical activity, and to identify different types of parks. A total of 576 parks were assessed using a park evaluation tool. The intra-rater and inter-rater reliability of the tool were estimated. An exploratory principal component analysis (PCA) was carried out using an orthogonal varimax rotation, and variables were retained if they loaded at ≥0.3 on a component. A cluster analysis (CA) using Ward's method was then performed on the principal components together with a measure of park area. The tool was generally reliable, and the PCA identified ten principal components that explained 60% of the total variance. The CA yielded nine clusters that explained 40% of the total variance. PCA and CA are therefore feasible with park data. The results were interpreted using the operationalization of the conceptual model.
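The PCA-then-Ward-clustering pipeline described above can be sketched as follows. The varimax rotation is omitted for brevity (unrotated components are used), and the toy audit table stands in for the 576-park dataset.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def pca_scores(X, n_comp):
    """PCA via SVD on standardized data; returns component scores."""
    Z = (X - X.mean(0)) / X.std(0)
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_comp].T

# Toy park-audit table: 3 latent park types, 12 observed audit items.
rng = np.random.default_rng(0)
centers = rng.normal(size=(3, 12)) * 3
X = np.vstack([c + rng.normal(size=(50, 12)) for c in centers])

scores = pca_scores(X, n_comp=4)
link = linkage(scores, method="ward")              # Ward's hierarchical CA
labels = fcluster(link, t=3, criterion="maxclust")  # cut into 3 park types
print(np.unique(labels, return_counts=True))
```

In the study itself, a park-area measure was appended to the component scores before clustering; here the scores alone suffice to recover the toy types.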
Abstract:
Decision making is a fundamental computational process in many aspects of animal behaviour. The model most often encountered in studies of decision making is the diffusion model. It has long explained a wide variety of behavioural and neurophysiological data in this field. However, another model, the urgency model, explains the same data just as well, and does so parsimoniously and with firmer theoretical grounding. In this work, we first review the origins and development of the diffusion model and see how it became established as the framework for interpreting most experimental data related to decision making. In doing so, we note its strengths in order to then compare it objectively and rigorously to alternative models. We re-examine a number of implicit and explicit assumptions made by this model and highlight some of its shortcomings. This analysis frames our introduction and discussion of the urgency model. Finally, we present an experiment whose methodology dissociates the two models, and whose results illustrate the empirical and theoretical limits of the diffusion model while clearly demonstrating the validity of the urgency model. We conclude by discussing the potential contribution of the urgency model to the study of certain brain pathologies, emphasizing new research perspectives.
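A Monte-Carlo sketch of the standard diffusion model discussed above: evidence accumulates with a constant drift plus noise until it reaches a positive or negative bound. The drift, bound and trial count are illustrative choices, not parameters from the experiment.

```python
import numpy as np

def simulate_ddm(drift, bound, n_trials=500, dt=0.001, noise=1.0,
                 max_t=5.0, seed=0):
    """Simulate the drift-diffusion decision model: on each trial,
    accumulate drift*dt + noise*sqrt(dt)*N(0,1) until |x| >= bound."""
    rng = np.random.default_rng(seed)
    rts, choices = [], []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < bound and t < max_t:
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        rts.append(t)
        choices.append(1 if x >= bound else 0)  # 1 = correct for drift > 0
    return np.array(rts), np.array(choices)

rts, choices = simulate_ddm(drift=0.8, bound=1.0)
print(choices.mean())   # accuracy above chance for positive drift
print(rts.mean())       # mean decision time in seconds
```

An urgency model would instead multiply momentary evidence by a growing urgency signal, producing a collapsing effective bound; the simulation above is the baseline against which that alternative is compared.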