994 results for Modeling Methodology
Abstract:
Serious games are attaining a more prominent role as tools for learning in various contexts, particularly in areas such as education and training. Due to their characteristics, such as rules, behavior simulation and feedback to the player's actions, serious games provide a favorable learning environment where errors can occur without real-life penalty and students get instant feedback on challenges. These challenges are in accordance with the intended objectives and self-adapt and repeat according to the student's difficulty level. Through motivating and engaging environments, which serve as a basis for problem solving and for simulating different situations and contexts, serious games have great potential to help players develop professional skills. But how do we certify the acquired knowledge and skills? With this work we propose a methodology to establish a relationship between the game mechanics of serious games and an array of competences for certification, evaluating the applicability of various aspects of game design and development, such as user interfaces and gameplay, to obtain learning outcomes within the game itself. Through the definition of game mechanics combined with the necessary pedagogical elements, the game will ensure certification. This paper presents a matrix of generic skills, based on the European Qualifications Framework, and the definition of the game mechanics necessary for certification in the context of tour guide training. The certification matrix has as reference axes skills, knowledge and competencies, which describe what students should learn, understand and be able to do after completing the learning process. Guide-interpreters welcome and accompany tourists on trips and visits to places of touristic interest and cultural heritage, such as museums, palaces and national monuments, where they provide a variety of information. Tour guide certification requirements include specific skills and knowledge of foreign languages and of the History, Ethnology, Politics, Religion, Geography and Art of the territory in which they operate. These skills comprise communication, interpersonal relationships, motivation, organization and management. The certification process aims to validate the ability to plan and conduct guided tours of the territory, to demonstrate knowledge appropriate to the context and, finally, to be a good group leader. After defining which competences are to be certified, the next step is to delineate the expected learning outcomes and to identify the game mechanics associated with them. Game mechanics, as methods invoked by agents to interact with the game world, combined with game elements/objects, allow multiple paths through which to explore the game environment and its educational process. Mechanics such as achievements, appointments, progression, reward schedules or status describe how games can be designed to affect players in unprecedented ways. In order for the game to be able to certify tour guides, the design of the training game will incorporate a set of theoretical and practical tasks for acquiring skills and knowledge on various transversal themes. To this end, patterns of skills and abilities in acquiring different knowledge will be identified.
Abstract:
The application of information technologies (especially the Internet, Web 2.0 and social tools) makes informal learning more visible. This kind of learning is not linked to an institution or a period of time, but it is important enough to be taken into account. On the one hand, learners should be able to communicate to the institutions they are related to which skills they possess, whether these were achieved formally or informally. On the other hand, companies and educational institutions need deeper knowledge of the competencies of their staff. The TRAILER project provides a methodology, supported by a technological framework, to facilitate communication about informal learning between businesses, employees and learners. The paper presents the project and some of the work carried out: an exploratory analysis of how informal learning is considered, and the proposed technological framework. Whilst challenges remain in establishing the meaningfulness of technological engagement for employees and businesses, the continuing transformation of the social, technological and educational environment is likely to place greater emphasis on the effective exploitation of informal learning.
Abstract:
An analytical method using microwave-assisted extraction (MAE) and liquid chromatography (LC) with fluorescence detection (FD) for the determination of ochratoxin A (OTA) in bread samples is described. A 2^4 orthogonal composite design coupled with response surface methodology was used to study the influence of the MAE parameters (extraction time, temperature, solvent volume, and stirring speed) in order to maximize OTA recovery. The optimized MAE conditions were the following: 25 mL of acetonitrile, 10 min of extraction at 80 °C, and maximum stirring speed. Validation of the overall methodology was performed by spiking assays at five levels (0.1–3.00 ng/g). The quantification limit was 0.005 ng/g. The established method was then applied to 64 bread samples (wheat, maize, and wheat/maize bread) collected in the Oporto region (Northern Portugal). OTA was detected in 84 % of the samples, with a maximum value of 2.87 ng/g, below the European maximum limit of 3 ng/g established for OTA in cereal products.
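To make the optimization step concrete, below is a minimal sketch of fitting a second-order response surface to recovery data from a two-factor central composite design and locating its stationary point. The factor levels and recovery values are invented for illustration (the study optimized four MAE factors); only the technique, not the data, reflects the paper.

```python
# Minimal sketch: fitting a second-order response surface to illustrative
# OTA-recovery data from a two-factor central composite design.
# Factor levels and recoveries are invented; the paper used four factors.
import numpy as np

# Coded factor levels (x1 = extraction time, x2 = temperature) and recoveries (%)
x1 = np.array([-1, -1, 1, 1, -1.41, 1.41, 0, 0, 0, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, -1.41, 1.41, 0, 0, 0])
y  = np.array([62, 70, 75, 88, 60, 83, 66, 85, 90, 91, 89])

# Design matrix for the full quadratic model:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point of the fitted surface: solve grad(y) = 0
b = coef[1:3]
B = np.array([[2 * coef[3], coef[5]], [coef[5], 2 * coef[4]]])
stationary = np.linalg.solve(B, -b)
print("coefficients:", np.round(coef, 2))
print("stationary point (coded units):", np.round(stationary, 2))
```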
Abstract:
This study modeled the impact on freshwater ecosystems of pharmaceuticals detected in biosolids following application to agricultural soils. The detected sulfonamides and hydrochlorothiazide displayed comparatively moderate retention in solid matrices and, therefore, higher transfer fractions from biosolids to the freshwater compartment. However, the residence times of these pharmaceuticals in freshwater were estimated to be short due to abiotic degradation processes. The non-steroidal anti-inflammatory drug mefenamic acid had the highest environmental impact on aquatic ecosystems and warrants further investigation. The estimate of the solid-water partition coefficient was generally the most influential parameter of the probabilistic comparative impact assessment. These results, and the modeling approach used in this study, serve to prioritize pharmaceuticals in the research effort to assess the risks and environmental impacts of these emerging pollutants on aquatic biota.
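As an illustration of the kind of probabilistic step the abstract describes, the sketch below propagates uncertainty in a hypothetical solid-water partition coefficient (Kd) through a simple equilibrium split between soil solids and pore water. The one-box model and all parameter values are assumptions for demonstration, not the study's actual fate model.

```python
# Sketch of a probabilistic step like the one the abstract describes:
# propagate uncertainty in the solid-water partition coefficient (Kd)
# to the dissolved fraction of a pharmaceutical in biosolid-amended soil.
# The equilibrium split and all parameter values are illustrative
# assumptions, not the study's actual model.
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Lognormal Kd (L/kg): hypothetical geometric mean 50, geometric SD ~ 3
kd = rng.lognormal(mean=np.log(50), sigma=np.log(3), size=n)

theta = 0.3   # volumetric water content (-), assumed
rho_b = 1.3   # soil bulk density (kg/L), assumed

# Fraction of the chemical in soil pore water (equilibrium partitioning)
f_water = theta / (theta + rho_b * kd)

print(f"median dissolved fraction: {np.median(f_water):.2e}")
print(f"95th percentile:           {np.percentile(f_water, 95):.2e}")
```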
Abstract:
Over the last few years, there has been growing concern about the presence of pharmaceuticals in the environment. The main objective of this study was to develop and validate an SPE method, using response surface methodology, for the determination of ibuprofen in different types of water samples. The influence of sample pH and sample volume on ibuprofen recovery was studied, and each independent variable had a pronounced effect on the dependent variable (ibuprofen recovery). Good selectivity, extraction efficiency, and precision were achieved using 600 mL of sample with the pH adjusted to 2.2. LC with fluorescence detection was employed. The optimized method was applied to 20 water samples from the North and South of Portugal.
Abstract:
Dissertation submitted to obtain the degree of Master in Electrical Engineering
Abstract:
Geostatistics has been successfully used to analyze and characterize the spatial variability of environmental properties. Besides giving estimated values at unsampled locations, it provides a measure of the accuracy of the estimate, a significant advantage over traditional methods used to assess pollution. In this work, universal block kriging is applied, for the first time, to model and map the spatial distribution of salinity measurements gathered by an Autonomous Underwater Vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters, characterizing its spatial variability in the vicinity of the discharge, and estimating dilution. The results demonstrate that the geostatistical methodology can provide good estimates of the dispersion of effluents, which are very valuable in assessing environmental impact and managing sea outfalls. Moreover, since accurate measurements of a plume's dilution are rare, such studies may prove very helpful in the future for validating dispersion models.
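For readers unfamiliar with the estimation step, here is a minimal ordinary point kriging sketch on hypothetical scattered salinity samples. The paper applies universal block kriging; this simplified version with an assumed exponential covariance model only illustrates how kriging yields both an estimate and its variance.

```python
# Minimal ordinary-kriging sketch for scattered salinity measurements.
# The paper uses universal *block* kriging; this simplified point version
# with an assumed exponential covariance only illustrates the estimator.
import numpy as np

def exp_cov(h, sill=1.0, rng_=200.0):
    """Exponential covariance model; sill and range are assumptions."""
    return sill * np.exp(-h / rng_)

# Hypothetical AUV sample locations (m) and salinity values (psu)
pts = np.array([[0, 0], [100, 50], [200, 10], [150, 200], [50, 150]], float)
val = np.array([35.2, 34.8, 35.0, 33.9, 34.1])

def krige(target):
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    n = len(pts)
    # Ordinary-kriging system: covariances plus the unbiasedness constraint
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_cov(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_cov(np.linalg.norm(pts - target, axis=1))
    w = np.linalg.solve(A, b)        # weights plus Lagrange multiplier
    est = w[:n] @ val
    var = exp_cov(0.0) - w @ b       # kriging variance at the target
    return est, var

est, var = krige(np.array([120.0, 80.0]))
print(f"estimate: {est:.2f} psu, kriging variance: {var:.3f}")
```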
Abstract:
This article addresses the problem of obtaining reduced-complexity models of multi-reach water delivery canals that are suitable for robust and linear parameter varying (LPV) control design. In the first stage, by applying a method known from the literature, a finite-dimensional rational transfer function of a priori defined order is obtained for each canal reach by linearizing the Saint-Venant equations. Then, using block-diagram algebra, these models are combined with linearized gate models to obtain the overall canal model. With respect to the control design objectives, this approach has the advantages of providing a model of prescribed order and of quantifying the high-frequency uncertainty due to model approximation. A case study with a 3-reach canal is presented, and the resulting model is compared with experimental data. © 2014 IEEE.
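A small sketch of the block-diagram composition stage is given below, using the python-control package to place placeholder reach transfer functions (first-order dynamics with Padé-approximated delays) in series with static linearized gate gains. All orders, coefficients, and delays are illustrative, not values identified from the Saint-Venant linearization.

```python
# Sketch of the block-diagram composition step: combining per-reach
# rational transfer functions with linearized gate gains in series.
# All orders, gains, time constants, and delays are placeholders.
import control

# Reach models: first-order dynamics times a first-order Pade delay
# (rational approximation keeps the overall model of prescribed order)
reach1 = control.tf([0.8], [30.0, 1.0]) * control.tf(*control.pade(5.0, 1))
reach2 = control.tf([0.6], [45.0, 1.0]) * control.tf(*control.pade(8.0, 1))

# Linearized gates modeled as static gains (placeholder values)
gate1 = control.tf([1.2], [1.0])
gate2 = control.tf([0.9], [1.0])

# Series interconnection: gate -> reach -> gate -> reach
canal = control.series(gate1, reach1, gate2, reach2)
print(canal)
```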
Abstract:
Desertification is a critical issue for Mediterranean drylands. Climate change is expected to aggravate its extent and severity by reinforcing the biophysical driving forces behind desertification processes: hydrology, vegetation cover and soil erosion. The main objective of this thesis is to assess the vulnerability of Mediterranean watersheds to climate change by estimating impacts on desertification drivers and the watersheds' resilience to them. To achieve this objective, a modeling framework capable of analyzing the processes linking climate and the main drivers is developed. The framework couples different models adapted to different spatial and temporal scales. A new model for the event scale, the MEFIDIS model, is developed with a focus on the particular processes governing Mediterranean watersheds. Model results are compared with desertification thresholds to estimate resilience. This methodology is applied to two contrasting study areas: the Guadiana and the Tejo, which currently have semi-arid and humid climates, respectively. The main conclusions of this work can be summarized as follows:
• hydrological processes show a high sensitivity to climate change, leading to a significant decrease in runoff and an increase in temporal variability;
• vegetation processes appear to be less sensitive, with negative impacts for agricultural species and forests, and positive impacts for Mediterranean species;
• changes to soil erosion processes appear to depend on the balance between changes to surface runoff and vegetation cover, itself governed by the relationship between changes to temperature and rainfall;
• as the magnitude of climate change increases, desertification thresholds are surpassed sequentially, starting with the watersheds' ability to sustain current water demands, followed by the vegetation support capacity;
• the most important thresholds appear to be a temperature increase of +3.5 to +4.5 °C and a rainfall decrease of -10 to -20 %;
• rainfall changes beyond this threshold could lead to severe water stress even if current water uses are moderated, with droughts occurring in 1 out of 4 years;
• temperature changes beyond this threshold could lead to a decrease in agricultural yield accompanied by an increase in soil erosion for croplands;
• combined changes of temperature and rainfall beyond the thresholds could shift both systems towards a more arid state, leading to severe water stress and significant changes to the support capacity for current agriculture and natural vegetation in both study areas.
Abstract:
This paper focuses on a novel formalization for assessing the five-parameter model of a photovoltaic cell. An optimization procedure is used as a feasibility problem to find the parameters tuned at the open-circuit, maximum-power, and short-circuit points, in order to obtain the data needed to plot the I-V curve. A comparison with experimental results is presented for two monocrystalline PV modules.
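A hedged sketch of this kind of parameter tuning is shown below: the five single-diode parameters (Iph, I0, Rs, Rsh, m) are fitted to the three characteristic points by solving a nonlinear system with scipy. The datasheet numbers and the fifth closure equation (a shunt-dominated slope at short circuit) are assumptions for illustration, not necessarily the paper's formalization.

```python
# Sketch of tuning the five single-diode parameters (Iph, I0, Rs, Rsh, m)
# to the short-circuit, open-circuit and maximum-power points.
# Datasheet values and the fifth closure equation are assumptions.
import numpy as np
from scipy.optimize import least_squares

# Illustrative datasheet values for a monocrystalline module
Isc, Voc, Vmp, Imp, Ns, Vt = 8.21, 32.9, 26.3, 7.61, 54, 0.02569

def residuals(x):
    Iph, log_I0, Rs, Rsh, m = x
    I0, a = 10.0**log_I0, m * Ns * Vt   # a: modified ideality factor

    def f(V, I):    # implicit single-diode equation; zero when satisfied
        return Iph - I0 * (np.exp((V + Rs * I) / a) - 1) - (V + Rs * I) / Rsh - I

    def dIdV(V, I):  # slope of the I-V curve (implicit differentiation)
        e = I0 / a * np.exp((V + Rs * I) / a)
        return -(e + 1 / Rsh) / (1 + Rs * e + Rs / Rsh)

    return [f(0.0, Isc),                    # short-circuit point
            f(Voc, 0.0),                    # open-circuit point
            f(Vmp, Imp),                    # maximum-power point
            Imp + Vmp * dIdV(Vmp, Imp),     # dP/dV = 0 at the MPP
            dIdV(0.0, Isc) + 1 / Rsh]       # assumed SC-slope closure

x0 = [Isc, -8.0, 0.3, 300.0, 1.3]           # initial guess matters here
sol = least_squares(residuals, x0,
                    bounds=([7, -12, 1e-3, 50, 1], [9, -5, 2, 5000, 2]))
Iph, log_I0, Rs, Rsh, m = sol.x
print(f"Iph={Iph:.3f} A, I0={10**log_I0:.2e} A, Rs={Rs:.3f} ohm, "
      f"Rsh={Rsh:.1f} ohm, m={m:.2f}")
```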
Abstract:
We prove existence, uniqueness, and stability of solutions of the prescribed curvature problem $\left(u'/\sqrt{1+u'^{2}}\right)' = au - b/\sqrt{1+u'^{2}}$ in $[0, 1]$, with $u'(0) = u(1) = 0$, for any given $a > 0$ and $b > 0$. We also develop a linear monotone iterative scheme for approximating the solution. This equation has been proposed as a model of the corneal shape in the recent paper (Okrasinski and Plociniczak in Nonlinear Anal., Real World Appl. 13:1498-1505, 2012), where a simplified version obtained by partial linearization has been investigated.
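To illustrate how such a problem can be approximated numerically, the sketch below discretizes the boundary value problem with finite differences and solves it by a lagged-nonlinearity fixed-point loop. This is a simple illustrative iteration, not the paper's linear monotone scheme, and the values of a and b are arbitrary.

```python
# Numerical sketch of the corneal-shape BVP
#   (u'/sqrt(1+u'^2))' = a*u - b/sqrt(1+u'^2),  u'(0) = u(1) = 0,
# via finite differences with a lagged-nonlinearity fixed-point loop.
# Illustrative only; not the paper's monotone scheme. a, b are arbitrary.
import numpy as np

a, b, N = 1.0, 1.0, 200
h = 1.0 / N
u = np.zeros(N)            # unknowns u_0..u_{N-1}; u_N = 0 (Dirichlet)

for _ in range(100):
    # Lag the nonlinear terms using the previous iterate
    u_full = np.append(u, 0.0)
    du_mid = np.diff(u_full) / h                 # u' at midpoints i+1/2
    c = 1.0 / np.sqrt(1.0 + du_mid**2)           # curvature weight
    du_node = np.gradient(u_full, h)
    s = 1.0 / np.sqrt(1.0 + du_node[:N]**2)      # lagged RHS factor

    # Assemble  (c u')'/h^2 - a u = -b s,  with u'(0) = 0 by reflection
    A = np.zeros((N, N))
    rhs = -b * s
    A[0, 0] = -2 * c[0] / h**2 - a
    A[0, 1] = 2 * c[0] / h**2
    for i in range(1, N):
        A[i, i] = -(c[i] + c[i - 1]) / h**2 - a
        A[i, i - 1] = c[i - 1] / h**2
        if i + 1 < N:
            A[i, i + 1] = c[i] / h**2
    u_new = np.linalg.solve(A, rhs)
    if np.max(np.abs(u_new - u)) < 1e-10:
        u = u_new
        break
    u = u_new

print(f"u(0) = {u[0]:.6f}")   # apex height of the computed profile
```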
Abstract:
There is no single definition of a long-memory process. Such a process is generally defined as a series whose correlogram decays slowly or whose spectrum is infinite at frequency zero. It is also said that a series with this property is characterized by long-range dependence and by long non-periodic cycles, or that this characteristic describes the correlation structure of a series at long lags, or that it is conventionally expressed in terms of a power-law decay of the autocovariance function. The growing interest of international research in this topic is justified by the search for a better understanding of the dynamic nature of the time series of financial asset prices. First, the lack of consistency among results calls for new studies and the use of several complementary methodologies. Second, the confirmation of long-memory processes has relevant implications for (1) theoretical and econometric modeling (i.e., martingale price models and technical trading rules), (2) statistical tests of equilibrium and valuation models, (3) optimal consumption/saving and portfolio decisions, and (4) the measurement of efficiency and rationality. Third, empirical scientific questions remain about identifying the general theoretical market model most suitable for modeling the diffusion of the series. Fourth, regulators and risk managers need to know whether persistent, and therefore inefficient, markets exist that can thus produce abnormal returns. The aim of this dissertation's research is twofold. On the one hand, it intends to provide additional knowledge for the long-memory debate by examining the behavior of the daily return series of the main EURONEXT stock indices. On the other hand, it intends to contribute to the improvement of the capital asset pricing model (CAPM) by considering an alternative risk measure capable of overcoming the constraints of the efficient market hypothesis (EMH) in the presence of financial series whose processes do not have independent and identically distributed (i.i.d.) increments. The empirical study indicates the possibility of using long-maturity treasury bonds (OTs) as an alternative in computing market returns, given that their behavior in sovereign debt markets reflects investors' confidence in the financial condition of states and measures how investors evaluate the respective economies based on the overall performance of their assets. Although the price diffusion model defined by geometric Brownian motion (gBm) is claimed to provide a good fit to financial time series, its assumptions of normality, stationarity and independence of the residual innovations are contradicted by the empirical data analyzed. Therefore, in the search for evidence of the long-memory property in markets, rescaled-range analysis (R/S) and detrended fluctuation analysis (DFA) are used, under the fractional Brownian motion (fBm) approach, to estimate the Hurst exponent H for the complete data series and to compute the "local" Hurst exponent H_t in moving windows. In addition, statistical hypothesis tests are performed using the rescaled-range test (R/S), the modified rescaled-range test (M-R/S) and the fractional differencing test (GPH).
In terms of a single conclusion from all methods about the nature of dependence for the stock market in general, the empirical results are inconclusive. This means that the degree of long memory, and thus any classification, depends on each particular market. Nevertheless, the mostly positive overall results support the presence of long memory, in the form of persistence, in the stock returns of Belgium, the Netherlands and Portugal. This suggests that these markets are more subject to greater predictability (the "Joseph effect"), but also to trends that can be unexpectedly interrupted by discontinuities (the "Noah effect"), and therefore tend to be riskier to trade. Although the evidence of fractal dynamics has weak statistical support, in line with most international studies, it refutes the random walk hypothesis with i.i.d. increments, which is the basis of the EMH in its weak form. In view of this, contributions to the improvement of the CAPM are proposed, through a new fractal capital market line (FCML) and a new fractal security market line (FSML). The new proposal suggests that the risk element (for the market and for an asset) be given by the Hurst exponent H for long lags of stock returns. The exponent H measures the degree of long memory in stock indices, both when the return series follow an uncorrelated i.i.d. process described by gBm (where H = 0.5, confirming the EMH and making the CAPM adequate) and when they follow a process with statistical dependence described by fBm (where H differs from 0.5, rejecting the EMH and making the CAPM inadequate). The advantage of the FCML and the FSML is that the long-memory measure, defined by H, is the appropriate reference for translating risk in models applicable to data series that follow i.i.d. processes as well as processes with nonlinear dependence. These formulations thus include the EMH as a possible particular case.
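As an illustration of the R/S step of this methodology, the sketch below estimates the Hurst exponent H from the slope of log(R/S) against the log of the window length. The input series is simulated i.i.d. Gaussian noise (so H should come out near 0.5); the window sizes are arbitrary choices, and the dissertation's actual estimators (including the modified R/S and GPH tests) are more elaborate.

```python
# Minimal rescaled-range (R/S) sketch for estimating the Hurst exponent H.
# Input data are simulated i.i.d. Gaussian returns, so H should be ~0.5;
# window sizes are illustrative choices.
import numpy as np

def rs(series):
    """Rescaled range R/S of one window."""
    z = series - series.mean()
    y = np.cumsum(z)                      # cumulative deviations
    r = y.max() - y.min()                 # range of the partial sums
    s = series.std(ddof=1)                # sample standard deviation
    return r / s

rng = np.random.default_rng(0)
x = rng.normal(size=2**12)                # stand-in for daily returns

sizes = [2**k for k in range(4, 11)]      # window lengths 16 .. 1024
avg_rs = []
for n in sizes:
    chunks = x[: len(x) // n * n].reshape(-1, n)
    avg_rs.append(np.mean([rs(c) for c in chunks]))

# H is the slope of log(R/S) against log(n)
H = np.polyfit(np.log(sizes), np.log(avg_rs), 1)[0]
print(f"estimated Hurst exponent: H = {H:.3f}")   # ~0.5 for i.i.d. data
```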
Abstract:
Modeling the fundamental performance limits of Wireless Sensor Networks (WSNs) is of paramount importance to understanding their behavior under worst-case conditions and to making the appropriate design choices. This is particularly relevant for time-sensitive WSN applications, where the timing behavior of the network protocols (message transmissions must respect deadlines) impacts the correct operation of these applications. In that direction, this paper contributes a methodology based on Network Calculus that enables quick and efficient worst-case dimensioning of static or even dynamically changing cluster-tree WSNs where the data sink can either be static or mobile. We propose closed-form recurrent expressions for computing the worst-case end-to-end delays, buffering, and bandwidth requirements across any source-destination path in a cluster-tree WSN. We show how to apply our methodology to the case of IEEE 802.15.4/ZigBee cluster-tree WSNs. Finally, we demonstrate the validity and analyze the accuracy of our methodology through a comprehensive experimental study using commercially available technology, namely TelosB motes running TinyOS.
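For context, the sketch below computes the elementary single-hop network-calculus bounds that analyses of this kind build on: a flow with a leaky-bucket arrival curve α(t) = b + rt crossing a rate-latency server β(t) = R(t − T)+ has delay at most T + b/R and backlog at most b + rT. The paper's contribution is closed-form recurrences along a cluster-tree; this single-hop case and the numbers used are only illustrative.

```python
# Elementary single-hop network-calculus bounds:
#   arrival curve  alpha(t) = b + r*t   (leaky bucket)
#   service curve  beta(t)  = R*(t-T)+  (rate-latency)
#   =>  delay <= T + b/R,   backlog <= b + r*T
# The paper derives closed-form *recurrences* along a cluster-tree;
# this single-hop bound and the numbers below are only illustrative.

def single_hop_bounds(b, r, R, T):
    """Worst-case delay (s) and backlog (bits) for one hop."""
    assert r <= R, "stability requires the service rate to cover the arrival rate"
    return T + b / R, b + r * T

# Hypothetical ZigBee-like numbers: 4 kbit burst, 2 kbit/s sustained rate,
# 50 kbit/s guaranteed service with 0.1 s latency
delay, backlog = single_hop_bounds(b=4000, r=2000, R=50000, T=0.1)
print(f"worst-case delay:   {delay * 1000:.1f} ms")
print(f"worst-case backlog: {backlog:.0f} bits")
```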
Abstract:
Modeling the fundamental performance limits of Wireless Sensor Networks (WSNs) is of paramount importance to understanding their behavior under worst-case conditions and to making the appropriate design choices. In that direction, this paper contributes an analytical methodology for modeling cluster-tree WSNs where the data sink can either be static or mobile. We assess the validity and pessimism of the analytical model by comparing the worst-case results with values measured through an experimental test-bed based on Commercial-Off-The-Shelf (COTS) technologies, namely TelosB motes running TinyOS.
Abstract:
The IEEE 802.15.4 protocol can support time-sensitive Wireless Sensor Network (WSN) applications through its Guaranteed Time Slot (GTS) Medium Access Control mechanism. Recently, several analytical and simulation models of the IEEE 802.15.4 protocol have been proposed. Nevertheless, currently available simulation models for this protocol are both inaccurate and incomplete; in particular, they do not support the GTS mechanism. In this paper, we propose an accurate OPNET simulation model, with a focus on the implementation of the GTS mechanism. The motivation for this work is to validate the previously proposed Network Calculus-based analytical model of the GTS mechanism and to compare the performance evaluation of the protocol as given by the two alternative approaches. We therefore contribute an accurate OPNET model of the IEEE 802.15.4 protocol. Additionally, and perhaps more importantly, based on the simulation model we propose a novel methodology to tune the protocol parameters so that better protocol performance can be guaranteed, both in maximizing the throughput of the allocated GTS and in minimizing frame delay.