676 results for real-world


Relevance: 60.00%

Abstract:

In recent years, elliptic curve cryptography has become increasingly important, to the point that it now forms part of several industry standards. Although elliptic-curve variants of classical cryptosystems such as RSA have been designed, their main interest lies in their application to cryptosystems based on the Discrete Logarithm Problem, such as those of ElGamal type. In that setting, elliptic cryptosystems guarantee the same security as those built over the multiplicative group of a prime finite field, but with much shorter key lengths. We present the good properties of these cryptosystems, as well as the basic requirements for a curve to be cryptographically useful, which are closely related to its cardinality. We review some methods for discarding cryptographically unsuitable curves, as well as others for obtaining good curves from a given one. Finally, we describe some applications, such as their use in smart cards and RFID systems, and conclude with some recent advances in this field.
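
The mechanics this abstract relies on (point addition, double-and-add scalar multiplication, ElGamal-style encryption on a curve) can be illustrated with a minimal Python sketch. The parameters below (p = 97, the curve y^2 = x^3 + 2x + 3 and its base point) are toy assumptions for illustration only, far too small for any real security:

    # Toy ElGamal over the elliptic curve y^2 = x^3 + a*x + b (mod p).
    # Illustrative parameters; real systems use ~256-bit standardized curves.
    p, a, b = 97, 2, 3
    G = (3, 6)                               # base point: 6^2 = 3^3 + 2*3 + 3 (mod 97)

    def add(P, Q):                           # group law; None is the point at infinity
        if P is None: return Q
        if Q is None: return P
        (x1, y1), (x2, y2) = P, Q
        if x1 == x2 and (y1 + y2) % p == 0:
            return None
        if P == Q:
            lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
        else:
            lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
        x3 = (lam * lam - x1 - x2) % p
        return (x3, (lam * (x1 - x3) - y1) % p)

    def mul(k, P):                           # double-and-add scalar multiplication
        R = None
        while k:
            if k & 1:
                R = add(R, P)
            P = add(P, P)
            k >>= 1
        return R

    d = 20                                   # private key
    Q = mul(d, G)                            # public key
    k, M = 15, mul(5, G)                     # ephemeral key; message encoded as a point
    C1, C2 = mul(k, G), add(M, mul(k, Q))    # ciphertext pair
    x, y = mul(d, C1)                        # decryption: M = C2 - d*C1
    assert add(C2, (x, (-y) % p)) == M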

Relevance: 60.00%

Abstract:

In recent years, Semantic Web (SW) research has produced significant outcomes. Various industries have adopted SW technologies, while the ‘deep web’ is still approaching the critical transformation point at which the majority of data found on the deep web will be exploited through SW value layers. In this article we analyse SW applications from a ‘market’ perspective. We set out the key requirements for real-world, SW-enabled information systems and discuss the major difficulties that have delayed SW uptake. The article contributes to the SW and knowledge management literature by providing a context for discourse towards best practices for SW-based information systems.

Relevance: 60.00%

Abstract:

Sudoku problems are among the best-known and most enjoyed pastimes, and their popularity shows no sign of diminishing; over the last few years, however, they have gone from an entertainment to an interesting research area, and a doubly interesting one at that. On the one hand, Sudoku problems, being a variant of Gerechte Designs and Latin Squares, are actively used in experimental design, as in [8, 44, 39, 9]. On the other hand, Sudoku problems, simple as they seem, are really hard structured combinatorial search problems, and thanks to their characteristics and behaviour they can be used as benchmarks for refining and testing solving algorithms and approaches. Moreover, thanks to their rich inner structure, their study can contribute more than the study of random problems to our goal of solving real-world problems and applications and of understanding the problem characteristics that make them hard to solve. In this work we use two techniques for modelling and solving Sudoku problems, namely Constraint Satisfaction Problem (CSP) and Satisfiability Problem (SAT) approaches. To this effect we define the Generalized Sudoku Problem (GSP), where regions can be of rectangular shape, problems can be of any order, and solution existence is not guaranteed. With respect to worst-case complexity, we prove that GSP with block regions of m rows and n columns with m ≠ n is NP-complete. To study the empirical hardness of GSP, we define a series of instance generators that differ in the level of balance they guarantee among the constraints of the problem, by finely controlling how the holes are distributed among the cells of the GSP. Experimentally, we show that the more balanced the constraints, the higher the complexity of solving the GSP instances, and that GSP is harder than the Quasigroup Completion Problem (QCP), a problem generalized by GSP. Finally, we provide a study of the correlation between backbone variables (variables that take the same value in all solutions of an instance) and the hardness of GSP.
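
As a concrete illustration of the CSP formulation, the following minimal Python backtracking sketch solves a GSP instance with block regions of m rows and n columns (the helper names are ours and this is a sketch of the model, not the solvers evaluated in the work):

    # Generalized Sudoku as a CSP: an N x N grid, N = m * n, where every row,
    # column and m-row by n-column block must be a permutation of 1..N.
    def solve_gsp(grid, m, n):               # empty cells are coded as 0
        N = m * n
        def peers(r, c):                     # cells constrained together with (r, c)
            br, bc = r - r % m, c - c % n
            return ({(r, j) for j in range(N)} |
                    {(i, c) for i in range(N)} |
                    {(br + i, bc + j) for i in range(m) for j in range(n)}) - {(r, c)}
        def consistent(r, c, v):
            return all(grid[i][j] != v for i, j in peers(r, c))
        holes = [(r, c) for r in range(N) for c in range(N) if grid[r][c] == 0]
        def backtrack(k):
            if k == len(holes):
                return True
            r, c = holes[k]
            for v in range(1, N + 1):
                if consistent(r, c, v):
                    grid[r][c] = v
                    if backtrack(k + 1):
                        return True
                    grid[r][c] = 0
            return False
        return backtrack(0)

With m = 2 and n = 3, for instance, the grid is 6 x 6 with 2-row by 3-column regions; an unsatisfiable instance simply makes backtrack(0) return False, matching the definition above in which solution existence is not guaranteed.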

Relevance: 60.00%

Abstract:

Random problem distributions have played a key role in the study and design of algorithms for constraint satisfaction and Boolean satisfiability, as well as in our understanding of problem hardness beyond standard worst-case complexity. We consider random problem distributions from a highly structured problem domain that generalizes the Quasigroup Completion Problem (QCP) and Quasigroup with Holes (QWH), a widely used domain that captures the structure underlying a range of real-world applications. Our problem domain is also a generalization of the well-known Sudoku puzzle: we consider Sudoku instances of arbitrary order, with the additional generalization that the block regions can have rectangular shape, in addition to the standard square shape. We evaluate the computational hardness of Generalized Sudoku instances for different parameter settings. Our experimental hardness results show that we can generate instances that are considerably harder than QCP/QWH instances of the same size. More interestingly, we show the impact of different balancing strategies on problem hardness. We also provide insights into backbone variables in Generalized Sudoku instances and how they correlate with problem hardness.
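
The balancing idea can be sketched under assumed simplifications: start from a full Generalized Sudoku solution (here a classic closed-form pattern for m-row by n-column blocks) and punch holes either uniformly at random or in equal numbers per row and per column; the cyclic-shift balancing scheme below is an illustration, not the paper's generators:

    import random

    def full_solution(m, n):
        # a valid Generalized Sudoku grid from a closed-form pattern
        N = m * n
        return [[((r % m) * n + r // m + c) % N + 1 for c in range(N)]
                for r in range(N)]

    def punch_uniform(grid, holes):
        # unbalanced: holes anywhere, so row/column hole counts fluctuate
        N = len(grid)
        for r, c in random.sample([(r, c) for r in range(N) for c in range(N)], holes):
            grid[r][c] = 0

    def punch_balanced(grid, k):
        # balanced: exactly k holes in every row and every column, using
        # k distinct cyclic shifts of one random column permutation
        N = len(grid)
        sigma = random.sample(range(N), N)
        for s in random.sample(range(N), k):
            for i in range(N):
                grid[i][sigma[(i + s) % N]] = 0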

Relevance: 60.00%

Abstract:

This paper reviews experimental methods for studying people's responses to violence in digital media, and in particular considers the issues of internal validity and of ecological validity, or the generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, so their ecological validity is questionable. On the other hand, studies based on field data, while having ecological validity, cannot control the multiple confounding variables that may have an impact on observed results, so their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call "plausibility", which includes the fidelity of the depicted situation to prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study under development that looks at bystander responses to violent incidents.

Relevance: 60.00%

Abstract:

The aim of this work was to find an accurate method for calculating the risk of crankshaft failure by comparing different calculation methods. Finally, a multibody system simulation was performed using elastic models of real structures. The simulation software used was Excite, developed by AVL.

Relevance: 60.00%

Abstract:

Engineered nanomaterials (ENMs) exhibit special physicochemical properties and are thus finding their way into an increasing number of industries, enabling products with improved properties. Their increased use brings a greater likelihood of exposure to the nanoparticles (NPs) that could be released during the life cycle of nano-enabled products. The field of nanotoxicology emerged as a consequence of the development of these novel materials, and it has gained ever more attention due to the urgent need to gather information on exposure to them and to understand the potential hazards they engender. However, current studies on nanotoxicity tend to focus on pristine ENMs, and they use these toxicity results to generalize risk assessments of human exposure to NPs. ENMs released into the environment can interact with their surroundings, change characteristics and exhibit toxicity effects distinct from those of pristine ENMs. Furthermore, NPs' large surface areas provide extra-large potential interfaces, promoting more significant interactions between NPs and other co-existing species. In such processes, other species can attach to a NP's surface and modify its surface functionality, in addition to the toxicity it normally exhibits. One particular occupational health scenario involves NPs and low-volatility organic compounds (LVOC), a common type of pollutant present around many potential sources of NPs. LVOC can coat a NP's surface and then dominate its toxicity. One important mechanism in nanotoxicology is the creation of reactive oxygen species (ROS) on a NP's surface; LVOC can modify the production of these ROS. In summary, nanotoxicity research should not be limited to the toxicity of pristine NPs, nor use their toxicity to evaluate the health effects of exposure to environmental NPs. Instead, the interactions which NPs have with other environmental species should also be considered and researched. The potential health effects of exposure to NPs should be derived from these real-world NPs, with their environmentally modified characteristics and distinct toxicity. Failure to suitably address toxicity results could lead to an inappropriate treatment of nano-release, affect the environment and public health, and put a blemish on the development of sustainable nanotechnologies as a whole. The main objective of this thesis is to demonstrate a process for coating NP surfaces with LVOC using a well-controlled laboratory design and, with regard to these NPs' capacity to generate ROS, to explore the consequences of the change in particle toxicity. The dynamic coating system developed yielded stable and replicable coating performance, simulating an important realistic scenario. Clear changes in the size distribution of airborne NPs were observed using a scanning mobility particle sizer, were confirmed using both liquid nanotracking analyses and transmission electron microscopy (TEM) imaging, and were verified as arising from the LVOC coating. Coating thicknesses corresponded to the amount of coating material used and were controlled via the parameters of the LVOC generator. The capacity of pristine silver NPs (Ag NPs) to generate ROS was reduced when they were given a passive coating of inert paraffin, which blocked the reactive zones on the particle surfaces. In contrast, a coating of active reduced anthraquinone contributed to redox reactions and generated ROS itself, even though ROS generation due to oxidation by the Ag NPs themselves was quenched.
Further objectives of this thesis included the development of ROS methodology and the analysis of ROS case studies. Since the capacity of NPs to create ROS is an important effect in nanotoxicity, we attempted to refine and standardize the use of 2′,7′-dichlorodihydrofluorescein (DCFH) as a chemical tailored to characterizing NPs' capacity for ROS generation. Previous studies had reported a wide variety of results, due to a number of insufficiently well-controlled factors. We therefore cross-compared chemicals and concentrations, explored ways of dispersing NP samples in liquid solutions, identified sources of contradictions in the literature and investigated ways of reducing artificial results. The most robust results were obtained by sonicating an optimal sample of NPs in a DCFH-HRP solution made of 5 µM DCFH and 0.5 unit/ml horseradish peroxidase (HRP). Our findings explained how the major reasons for previously conflicting results were the different experimental approaches used and the potential artifacts appearing when using high sample concentrations. Applying our advanced DCFH protocol together with other physicochemical characterizations and biological analyses, we conducted several case studies characterizing aerosols and NP samples. Exposure to aged brake-wear dust engenders a risk of potentially deleterious health effects in occupational scenarios. We performed microscopy and elemental analyses, as well as ROS measurements, with acellular and cellular DCFH assays. TEM images revealed the samples to be heterogeneous mixtures with few particles at the nano-scale. Metallic and non-metallic elements were identified, primarily iron, carbon and oxygen. Moderate amounts of ROS were detected in the cell-free fluorescent tests; however, exposed cells were not dramatically activated. In addition to their highly aged state due to oxidation, the reason aged brake-wear samples caused less oxidative stress than fresh brake-wear samples may be their larger size and thus smaller relative reactive surface area. Other case studies, involving welding fumes and differently charged NPs, confirmed the performance of our DCFH assay and found ROS generation to be linked to varying characteristics, especially the surface functionality of the samples.

Relevance: 60.00%

Abstract:

Markets in the real world are not efficient zero-sum games in which the hypotheses of the CAPM are fulfilled. It is then easy to conclude that the market portfolio is not located on Markowitz's efficient frontier, and that passive investments (and indexing) are not optimal but biased. In this paper we define and analyse the biases suffered by passive investors: the sample, construction, efficiency and active biases, and tracking error are presented. We propose Minimum Risk Indices (MRI) as an alternative for dealing with market index biases and for providing investors with portfolios closer to the efficient frontier, that is, more optimal investment possibilities. MRI (using a parametric Value-at-Risk minimization approach) are calculated for three stock markets, achieving interesting results. Our indices are less risky and more profitable than the current market indices in the Argentinean and Spanish markets, thus challenging the Efficient Market Hypothesis. Two innovations must be highlighted: an error dimension has been included in the backtesting, and Sharpe's ratio has been used to select the "best" MRI.
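
The parametric Value-at-Risk minimization behind an MRI can be sketched as a small optimization problem; the Gaussian 95% quantile, the long-only constraint and the toy return sample are illustrative assumptions, not the paper's data or exact setup:

    import numpy as np
    from scipy.optimize import minimize

    # Minimum Risk Index weights: minimize parametric (Gaussian) VaR,
    # VaR_alpha(w) = -w.mu + z_alpha * sqrt(w' Sigma w), with w >= 0, sum w = 1.
    def mri_weights(mu, Sigma, z_alpha=1.645):
        n = len(mu)
        var = lambda w: -w @ mu + z_alpha * np.sqrt(w @ Sigma @ w)
        res = minimize(var, np.full(n, 1 / n), method="SLSQP",
                       bounds=[(0, 1)] * n,
                       constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
        return res.x

    rets = np.random.default_rng(0).normal(0.0005, 0.01, (500, 5))   # toy returns
    w = mri_weights(rets.mean(axis=0), np.cov(rets, rowvar=False))   # index weights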

Relevance: 60.00%

Abstract:

For a person with a spinal cord injury, returning to everyday life after a period of in-patient rehabilitation is a difficult process, not free of obstacles and new personal challenges. In this study we set out to identify the most relevant factors contributing to improving their quality of life, from the perspective of the affected people themselves. We conducted two discussion groups: one made up of 12 people with paraplegia and another of 6 people with tetraplegia. The content analysis indicates that, for the participants, two dimensions relate to their perception of quality of life once they leave the rehabilitation centre: a) the need for attention to their immediate environment, and b) preparation for the real world. We conclude by noting the importance of comprehensive rehabilitation programmes that include physical rehabilitation, the learning of skills enabling maximum independence and personal autonomy, and support work with the family.

Relevance: 60.00%

Abstract:

BACKGROUND: Recently, it has been suggested that the type of stent used in primary percutaneous coronary intervention (pPCI) might impact upon the outcomes of patients with acute myocardial infarction (AMI). Indeed, drug-eluting stents (DES) reduce neointimal hyperplasia compared to bare-metal stents (BMS). Moreover, later-generation DES, thanks to their biocompatible polymer coatings and stent design, allow for greater deliverability and improved endothelial healing, and therefore less restenosis and thrombus generation. However, data on the safety and performance of DES in large cohorts of AMI patients are still limited. AIM: To compare the early outcomes of DES vs. BMS in AMI patients. METHODS: This was a prospective, multicentre analysis of patients from 64 hospitals in Switzerland with AMI undergoing pPCI between 2005 and 2013. The primary endpoint was in-hospital all-cause death; the secondary endpoint was a composite measure of major adverse cardiac and cerebrovascular events (MACCE): death, reinfarction and cerebrovascular event. RESULTS: Of 20,464 patients with a primary diagnosis of AMI enrolled in the AMIS Plus registry, 15,026 were referred for pPCI and 13,442 received stent implantation; 10,094 patients were implanted with DES and 2,260 with BMS. Overall in-hospital mortality was significantly lower in patients with DES compared to those with BMS implantation (2.6% vs. 7.1%, p < 0.001). Overall in-hospital MACCE after DES was similarly lower compared to BMS (3.5% vs. 7.6%, p < 0.001). After adjusting for all confounding covariables, DES remained an independent predictor of lower in-hospital mortality (OR 0.51, 95% CI 0.40-0.67, p < 0.001). Since the groups differed with regard to baseline characteristics and pharmacological treatment, we performed propensity score matching (PSM) to limit potential biases. Even after PSM, DES implantation remained independently associated with a reduced risk of in-hospital mortality (adjusted OR 0.54, 95% CI 0.39-0.76, p < 0.001). CONCLUSIONS: In unselected patients from a nationwide, real-world cohort, we found that DES, compared to BMS, was associated with lower in-hospital mortality and MACCE. The identification of optimal treatment strategies for patients with AMI needs further randomised evaluation; however, our findings suggest a potential benefit with DES.
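
The propensity score matching step can be sketched generically as a 1:1 nearest-neighbour match on logistic-regression scores; the covariate matrix X and the treatment indicator t (1 = DES, 0 = BMS) are assumed inputs, not the registry's actual variables or matching specification:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def psm_pairs(X, t):
        # propensity = P(DES | baseline covariates), then match each DES
        # patient to the BMS patient with the closest score
        ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
        treated, control = np.where(t == 1)[0], np.where(t == 0)[0]
        nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
        _, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
        return list(zip(treated, control[idx.ravel()]))   # matched (DES, BMS) pairs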

Relevance: 60.00%

Abstract:

Tasks that require the manipulation and transformation of geometric shapes and forms, such as mental rotation tasks, give rise to performance differences between men and women that remain intriguing. Several hypotheses have been proposed to explain these differences. The most recurrent addresses differences in global versus local strategies for processing information. While this conjecture is interesting, it remains difficult to operationalize because it encompasses all the cognitive mechanisms (acquisition, retention and retrieval of information). This work returns to the sources, building on earlier research showing that men perceive verticality and horizontality significantly better than women. It tests the hypothesis that men, compared to women, exhibit greater independence from the visual perceptual field and are therefore more likely to use verticality and horizontality to solve a mental rotation task. A first set of experiments examined spatial perception in order to assess its impact on solving a task involving mental rotation. The results showed that only men referred to verticality and horizontality to solve the task. A second series of experiments investigated the effect of the presence, or absence, of directional axes directly tied to a mental rotation task. They were also conducted in a real-world environment to evaluate how active or passive displacement, corresponding to a change of perspective instead of a mental rotation, modulates the performance of men and women. The results did not show sex differences. Our hypothesis is thus verified: it is only when the task presents no obvious, but only implicit, orthogonal axes that men, being more independent of the visual perceptual field than women, use the perception of verticality and horizontality to improve their competence in mental rotation.

Relevance: 60.00%

Abstract:

Fixed-Mobile Convergence is the recent buzz in the field of telecommunications technology. Unlicensed Mobile Access (UMA) technology is a realistic implementation of Fixed-Mobile Convergence. UMA involves communication between different types of networks, and handover is a very important issue in it. This study analyses the theoretical handover mechanism and practical test results, includes a new proposal for handover performance testing in UMA, and provides an overview of basic handover operation in different UMA scenarios. The practical test involves an experiment on handover performance using network analyzers. The new proposal takes a different approach to an experimental setting for handover performance testing without network analyzers; the approach could not be implemented because of a technical problem in a piece of UMA network equipment. The analysis of the test results reveals that the handover time between UMA and the Global System for Mobile Communications (GSM) network is similar to that of inter-base-station-controller (inter-BSC) handovers within GSM networks. The new approach is simple and provides measurement at the endpoint communicating entities. The study gives a general understanding of handover operation, an analysis of handover performance in UMA, and a new approach useful for further study of handover in different real-world environments and scenarios.
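
The endpoint-only measurement idea can be sketched as follows: send timestamped probes between the two communicating endpoints during a UMA-to-GSM handover and report the longest gap in answered probes as the approximate outage time. The peer address, echo behaviour and timings are hypothetical; this illustrates the approach, not the thesis test bed:

    import socket, time

    def probe_gap(peer=("192.0.2.1", 9999), duration=30.0, interval=0.05):
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        s.settimeout(interval)
        last_ok, worst = time.time(), 0.0
        end = time.time() + duration
        while time.time() < end:
            try:
                s.sendto(b"ping", peer)
                s.recvfrom(64)                     # peer echoes each probe back
                now = time.time()
                worst = max(worst, now - last_ok)  # gap since last answered probe
                last_ok = now
            except socket.timeout:
                pass                               # probe lost during the handover
            time.sleep(interval)
        return worst                               # approximate handover outage time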

Relevance: 60.00%

Abstract:

The purpose of this paper is to present the eTwinning programme and the use of CLIL methodology in eTwinning projects, to give a glimpse of a successful project carried out by secondary students, and to look at the future ahead regarding Higher Education. eTwinning offers a suitable environment for using the English language in a “real” context; it can be integrated into any subject thanks to its cross-curricular nature. In short, it prepares students for the real world: international research, getting to know other cultures, communicating and learning content. I will start by giving a general view of what eTwinning is about. The second part deals with eTwinning and CLIL, and with how CLIL methodology fits naturally into the carrying out of eTwinning projects. In the third part, drawing on personal experience, a project is presented, “Addressing the Energy Crunch; Every Little Action Helps”, as a good example of how to integrate content learning into a collaborative project between different schools in Europe. The last part deals with the future of eTwinning and Higher Education within the new programme just approved by the European Parliament: Erasmus+ (2014-20).

Relevance: 60.00%

Abstract:

We propose a task for eliciting attitudes toward risk that is close to real-world risky decisions, which typically involve gains and losses. The task consists of accepting or rejecting gambles that provide a gain with probability p and a loss with probability 1−p. We employ finite mixture models to uncover heterogeneity in risk preferences and find that (i) behaviour is heterogeneous, with one half of the subjects behaving as expected utility maximizers; (ii) for the others, reference-dependent models perform better than those where subjects derive utility from final outcomes; (iii) models with sign-dependent decision weights perform better than those without; and (iv) there is no evidence for loss aversion. The procedure is sufficiently simple that it can easily be used in field or lab experiments where risk elicitation is not the main experiment.
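
The two model families in the mixture can be made concrete with a small sketch; the functional forms, parameter values and initial wealth below are illustrative assumptions, not the estimated specification:

    # Accept a gamble (gain g > 0 with probability p, loss l < 0 otherwise)
    # if its value is positive under each decision model.
    def weight(p, gamma=0.7):                 # inverse-S probability weighting
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    def accepts_eu(g, l, p, wealth=100.0, alpha=0.9):
        u = lambda x: x**alpha                # utility over final wealth (needs wealth + l > 0)
        return p * u(wealth + g) + (1 - p) * u(wealth + l) > u(wealth)

    def accepts_rd(g, l, p, alpha=0.9, lam=1.0):
        # reference-dependent with sign-dependent weights; lam = 1 encodes
        # the finding of no loss aversion
        return weight(p) * g**alpha - lam * weight(1 - p) * (-l)**alpha > 0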

Relevance: 60.00%

Abstract:

This thesis develops a comprehensive and flexible statistical framework for the analysis and detection of space, time and space-time clusters of environmental point data. The clustering methods developed were applied to both simulated datasets and real-world environmental phenomena; however, only the cases of forest fires in the Canton of Ticino (Switzerland) and in Portugal are expounded in this document. Environmental phenomena can usually be modelled as stochastic point processes in which each event, e.g. a forest fire ignition point, is characterised by its spatial location and its occurrence in time. Additional information, such as burned area, ignition causes, land use, and topographic, climatic and meteorological features, can also be used to characterise the studied phenomenon. The space-time pattern characterisation thereby represents a powerful tool for understanding the distribution and behaviour of the events and their correlation with underlying processes, for instance socio-economic, environmental and meteorological factors. Consequently, we propose a methodology based on the adaptation and application of statistical and fractal point process measures, both global (e.g. the Morisita index, the box-counting fractal method, the multifractal formalism and Ripley's K-function) and local (e.g. scan statistics). Many measures describing the space-time distribution of environmental phenomena have been proposed in a wide variety of disciplines; nevertheless, most of these measures are global in character and do not consider complex spatial constraints or the high variability and multivariate nature of the events. We therefore propose a statistical framework that takes into account the complexities of the geographical space in which phenomena take place, by introducing the Validity Domain concept and carrying out clustering analyses on data with differently constrained geographical spaces, thus assessing the relative degree of clustering of the real distribution. Moreover, specifically for the forest fire case, this research proposes two new methodologies: one for defining and mapping the Wildland-Urban Interface (WUI), described as the interaction zone between burnable vegetation and anthropogenic infrastructure, and one for predicting fire ignition susceptibility. In this regard, the main objective of this thesis was to carry out basic statistical/geospatial research with a strong applied component, to analyse and describe complex phenomena, and to overcome unsolved methodological problems in the characterisation of space-time patterns, in particular of forest fire occurrences. The thesis thus responds to the increasing demand for environmental monitoring and management tools for the assessment of natural and anthropogenic hazards and risks, sustainable development, retrospective success analysis, etc. The major contributions of this work were presented at national and international conferences and published in five scientific journals. National and international collaborations were also established and successfully accomplished.
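
Of the global measures listed, Ripley's K-function is the easiest to sketch for a planar point pattern; this naive Python estimator ignores edge correction and the Validity Domain constraints introduced in the thesis, and runs on toy data:

    import numpy as np

    def ripley_k(points, r, area):
        # K(r) = area / (n * (n - 1)) * number of ordered pairs closer than r;
        # approximately pi * r^2 under complete spatial randomness
        n = len(points)
        d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
        pairs = (d < r).sum() - n             # drop the n zero self-distances
        return area * pairs / (n * (n - 1))

    pts = np.random.default_rng(1).uniform(0, 10, (200, 2))   # toy uniform pattern
    k = ripley_k(pts, r=1.0, area=100.0)      # compare with pi * 1.0**2 ~ 3.14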