974 results for Schermi, adattativi, pervasive, kinect, framework, ingegnerizzazione, OpenNI
Abstract:
The data are analyzed with software designed by François Courtemanche and Féthi Guerdelli. The game experiments took place at the Laboratoire de recherche en communication multimédia of the Université de Montréal.
Abstract:
Christine Riefa is a lecturer in Consumer Law and Intellectual Property Law at Brunel University in the UK. She is an elected board member of the International Association of Consumer Law and an academic correspondent to the Euro-American Chair for the legal protection of consumers (University of Cantabria, Spain). In 2009-2010, Dr Riefa is a Fulbright EU Scholar-in-Residence at Cleveland-Marshall College of Law, Ohio USA. A first version of this article was presented at the Summer School in Consumer Law, organised by the GREDICC (Groupe de recherche en droit international et comparé de la consommation), UQAM, Montréal, 29th June – 4th July 2009.
Abstract:
This master's thesis takes an in-depth look at the cases of young offenders referred to the mediation process at Trajet, an alternative justice organization in Montreal. More precisely, its objectives are to describe the characteristics of the referred cases, to explore their relationship with participation in the mediation process and with its outcome, and to compare these elements across the two periods covered by the project: the one governed by the Young Offenders Act (LJC) and the one in which the Youth Criminal Justice Act (LSJPA), together with the Entente cadre (framework agreement), was in force. Quantitative research methods were used to analyze the cases referred to Trajet over a 10-year period (1999-2009). Descriptive analyses established the characteristics that the cases referred to Trajet share with, or that distinguish them from, cases referred to other mediation programs. Bivariate analyses revealed significant relationships between participation in the mediation process and the offenders' age and sex, the number of crimes they committed, the number of victims involved, the type of victim, the victims' age and sex, and the delay between the commission of the crime and the transfer of the file to Trajet. A logistic regression revealed that three characteristics significantly predict participation in mediation: the offenders' age, the number of victims involved, and the delay between the commission of the crime and the transfer of the file to Trajet. The low proportion of failures of the mediation process made bivariate and multivariate analyses of the outcome of the process unnecessary. Significant differences were found between cases referred to mediation under the LJC and those referred under the LSJPA and the Entente cadre with respect to the type of crime, the number of offences committed, the existence of a previous referral to Trajet, the reasons why mediation did not take place, restitution in all its forms and, especially, financial restitution. Participation in mediation appeared more likely under the LSJPA than under the LJC. Partial correlations showed that different characteristics were associated with participation in mediation in the two periods. Only one characteristic, the victims' sex, proved significantly related to participation in mediation under both the LJC and the LSJPA. The results of this project provide a deeper knowledge of the cases referred to Trajet for mediation and an exploration of the impact of the LSJPA and the Entente cadre on this process. However, since the sample is limited to cases handled at Trajet, these results cannot be generalized to all cases referred to Quebec's alternative justice organizations for mediation.
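The logistic regression described above can be sketched with standard statistical tooling. A minimal illustration in Python using statsmodels, where the data and the column names (offender age, number of victims, referral delay, participation flag) are invented stand-ins for the thesis's variables:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in data; the variable names mirror the three significant
# predictors reported in the thesis, but all values are invented.
rng = np.random.default_rng(0)
n = 200
cases = pd.DataFrame({
    "offender_age": rng.integers(12, 18, n),
    "n_victims": rng.integers(1, 5, n),
    "referral_delay_days": rng.integers(10, 400, n),
})
# Invented outcome: 1 = participated in mediation, 0 = did not.
logit = 0.3 * (cases["offender_age"] - 15) - 0.004 * cases["referral_delay_days"]
cases["participated"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(cases[["offender_age", "n_victims", "referral_delay_days"]])
model = sm.Logit(cases["participated"], X).fit(disp=0)
print(model.summary())   # coefficients and p-values for each predictor
```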
Abstract:
Software systems are increasingly complex, and their development is often carried out by dispersed, changing teams. Moreover, most software today is reused rather than developed from scratch. The comprehension task, inherent in maintenance tasks, consists of analyzing several dimensions of the software in parallel. The time dimension enters the software at two levels: software changes during its evolution and during its execution. These changes take on a particular meaning when they are analyzed together with other dimensions of the software. Multidimensional data analysis is a difficult problem, but some methods make it possible to work around this difficulty. Semi-automatic approaches, such as software visualization, let the user intervene during the analysis to explore and guide the search for information. In the first part of the thesis, we apply visualization techniques to better understand the dynamics of software during evolution and execution. Changes over time are represented by heat maps, so the same graphical representation is used to visualize changes during evolution and changes during execution. Another category of approaches to understanding certain dynamic aspects of software relies on heuristics. In the second part of the thesis, we address the identification of phases during evolution or during execution using a single approach. In this context, the premise is that there is an inherent coherence in the events that makes it possible to isolate subsets as phases. This coherence hypothesis is then defined specifically for code-change events (evolution) and state-change events (execution). The goal of the thesis is to study the unification of these two dimensions of time, evolution and execution. This reflects our aim to bring together two research fields that address the same category of problems, but from two different perspectives.
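As an illustration of the heat-map representation mentioned above, here is a minimal sketch in Python with matplotlib; it assumes change events have already been counted per software entity and per time interval (random counts stand in for real evolution or execution data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical change counts: rows = software entities (files or classes),
# columns = time intervals (commits during evolution, or time slices during
# execution). Real data would come from a VCS history or an execution trace.
rng = np.random.default_rng(0)
changes = rng.poisson(lam=2.0, size=(20, 50))

fig, ax = plt.subplots(figsize=(8, 4))
im = ax.imshow(changes, aspect="auto", cmap="hot")  # hotter = more changes
ax.set_xlabel("time interval")
ax.set_ylabel("software entity")
fig.colorbar(im, ax=ax, label="number of change events")
plt.show()
```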
Abstract:
The berth allocation problem (BAP) is one of the main decision problems at port terminals and has been widely studied. In previous research, the BAP was reformulated as a generalized set-partitioning problem (GSPP) and solved with a standard solver. The assignments (columns) were generated a priori, statically, and supplied as input to the optimization model. This method can provide an optimal solution for medium-sized instances. Its main drawback, however, is that the number of assignments explodes as the problem size grows, causing the optimization solver to run out of memory. In this thesis, we address the limits of the GSPP reformulation. We present a column generation framework in which assignments are generated dynamically in order to solve large instances of the BAP. We propose a column generation algorithm that can easily be adapted to solve any variant of the BAP based on different spatial and temporal attributes. We tested our method on an allocation model in which berths are discrete, vessel arrivals are dynamic, and handling times depend on the berths where the vessels are moored. Experimental results on a set of artificial instances indicate that the proposed method provides an optimal or near-optimal solution in only a few minutes, even for very large problems.
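To make the column generation scheme concrete, here is a minimal sketch in Python using PuLP. The restricted master problem is the LP relaxation of the set-partitioning model; the toy data and the enumeration-based pricing step stand in for the actual BAP pricing problem, which would generate assignments from spatial and temporal attributes:

```python
import pulp

# Toy data: 3 vessels; each column is a feasible single-vessel assignment
# (berth, start time) with a cost. In a real BAP the column pool is implicit
# and pricing searches it; here we price by enumeration purely to illustrate.
vessels = ["v1", "v2", "v3"]
pool = [  # (column id, covered vessel, cost)
    ("c1", "v1", 5.0), ("c2", "v1", 3.0),
    ("c3", "v2", 4.0), ("c4", "v2", 6.0),
    ("c5", "v3", 2.0),
]
active = [pool[0], pool[2], pool[4]]  # initial feasible set of columns

while True:
    # Restricted master problem: LP relaxation of the set-partitioning model.
    rmp = pulp.LpProblem("RMP", pulp.LpMinimize)
    x = {cid: pulp.LpVariable(cid, lowBound=0) for cid, _, _ in active}
    rmp += pulp.lpSum(cost * x[cid] for cid, _, cost in active)
    for v in vessels:  # each vessel must be covered exactly once
        rmp += pulp.lpSum(x[cid] for cid, vv, _ in active if vv == v) == 1, f"cover_{v}"
    rmp.solve(pulp.PULP_CBC_CMD(msg=0))
    duals = {v: rmp.constraints[f"cover_{v}"].pi for v in vessels}

    # Pricing: find columns with negative reduced cost (cost - dual of vessel).
    candidates = [c for c in pool if c not in active
                  and c[2] - duals[c[1]] < -1e-9]
    if not candidates:
        break  # LP optimal: no improving column exists
    active.append(min(candidates, key=lambda c: c[2] - duals[c[1]]))

print("objective:", pulp.value(rmp.objective))
```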
Abstract:
High school dropout is commonly seen as the result of a long-term process of failure and disengagement. As useful as it is, this view has obscured the heterogeneity of pathways leading to dropout. Research suggests, for instance, that some students leave school not as a result of protracted difficulties but in response to situations that emerge late in their schooling careers, such as health problems or severe peer victimization. Conversely, others with a history of early difficulties persevere when their circumstances improve during high school. Thus, an adequate understanding of why and when students drop out requires a consideration of both long-term vulnerabilities and proximal disruptive events and contingencies. The goal of this review is to integrate long-term and immediate determinants of dropout by proposing a stress process, life course model of dropout. This model is also helpful for understanding how the determinants of dropout vary across socioeconomic conditions and geographical and historical contexts.
Abstract:
Electron-phonon interaction is considered within the framework of the fluctuating valence of Cu atoms. Anderson's lattice Hamiltonian is suitably modified to take this into account. Using the Green's function technique, the possible quasiparticle excitations are determined. The quantity 2Δ_k(0)/(k_B T_c) is calculated for T_c = 40 K. The calculated values are in good agreement with the experimental results.
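For reference, the quantity in question is the superconducting gap ratio, whose weak-coupling BCS value provides the usual baseline for such comparisons (the baseline and the numerical conversion below are standard results, not values taken from this paper):

```latex
% Weak-coupling BCS baseline for the gap ratio:
\frac{2\Delta(0)}{k_B T_c} \approx 3.53
% so for T_c = 40\,\mathrm{K}:
\Delta(0) \approx 1.76\, k_B T_c \approx 6.1\,\mathrm{meV}
```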
Abstract:
Due to advances in mobile devices and wireless networks, mobile cloud computing, which combines mobile computing and cloud computing, has gained momentum since 2009. The characteristics of mobile devices and wireless networks make the implementation of mobile cloud computing more complicated than for fixed clouds. This paper lists some of the major issues in mobile cloud computing. One of the key issues is the end-to-end delay in servicing a request. Data caching is a technique widely used in wired and wireless networks to improve data access efficiency. In this paper we explore the possibility of a cooperative caching approach to enhance data access efficiency in mobile cloud computing. The proposed approach is based on cloudlets, one of the architectures designed for mobile cloud computing.
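A minimal sketch of how a cloudlet-based cooperative cache lookup could proceed, in Python; the three-tier lookup order (local cloudlet, neighbor cloudlets, distant cloud) and all names are assumptions made for illustration, not the protocol proposed in the paper:

```python
# Illustrative cooperative caching among cloudlets. The distant cloud is
# modeled as a plain dict; real systems would issue network requests.
CLOUD = {"item42": "payload-from-cloud"}

class Cloudlet:
    def __init__(self, name, neighbors=()):
        self.name = name
        self.cache = {}
        self.neighbors = list(neighbors)

    def get(self, key):
        if key in self.cache:               # 1. local hit: lowest latency
            return self.cache[key]
        for peer in self.neighbors:         # 2. cooperative hit at a peer
            if key in peer.cache:
                self.cache[key] = peer.cache[key]  # replicate locally
                return peer.cache[key]
        value = CLOUD[key]                  # 3. miss: fetch from the cloud
        self.cache[key] = value
        return value

a, b = Cloudlet("A"), Cloudlet("B")
a.neighbors.append(b)
b.cache["item42"] = "payload-from-B"
print(a.get("item42"))  # served by neighbor cloudlet B, then cached at A
```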
Abstract:
In this paper we describe the methodology and the structural design of a system that translates English into Malayalam using statistical models. A monolingual Malayalam corpus and a bilingual English/Malayalam corpus are the main resources in building this statistical machine translator. The training strategy adopted has been enhanced by PoS tagging, which helps to get rid of insignificant alignments. Moreover, incorporating units like a suffix separator and a stop word eliminator has proven effective in bringing about better training results. In the decoder, order conversion rules are applied to reduce the structural difference between the language pair. The quality of the statistical outcome of the decoder is further improved by applying mending rules. Experiments conducted on a sample corpus have generated reasonably good Malayalam translations, and the results are verified with the F measure, BLEU and WER evaluation metrics.
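Of the evaluation metrics named above, BLEU is the most common to script; a minimal sketch in Python with NLTK, using placeholder tokens rather than the paper's actual test data:

```python
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

# Illustrative BLEU scoring of a single decoder output against one
# reference translation; the tokens are placeholders, since real Malayalam
# test sentences from the paper are not reproduced here.
reference = [["token1", "token2", "token3", "token4"]]  # list of references
hypothesis = ["token1", "token2", "token4", "token3"]   # decoder output

smooth = SmoothingFunction().method1  # avoids zero scores on short sentences
score = sentence_bleu(reference, hypothesis, smoothing_function=smooth)
print(f"BLEU = {score:.3f}")
```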
Abstract:
A methodology for translating text from English into the Dravidian language Malayalam using statistical models is discussed in this paper. The translator utilizes a monolingual Malayalam corpus and a bilingual English/Malayalam corpus in the training phase and automatically generates the Malayalam translation of an unseen English sentence. Various techniques to improve the alignment model by incorporating morphological inputs into the bilingual corpus are discussed. Removing insignificant alignments from the sentence pairs by this approach has ensured better training results. Pre-processing techniques like suffix separation in the Malayalam corpus and stop word elimination in the bilingual corpus also proved effective in producing better alignments. Difficulties in the translation process that arise from the structural difference between the English-Malayalam pair are resolved in the decoding phase by applying order conversion rules. The handcrafted rules designed for the suffix separation process, which can serve as a guideline for implementing suffix separation in Malayalam, are also presented in this paper. Experiments conducted on a sample corpus have generated reasonably good Malayalam translations, and the results are verified with the F measure, BLEU and WER evaluation metrics.
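A minimal sketch of the two pre-processing steps described above, stop word elimination and rule-based suffix separation, in Python; the stop word list and the suffix rules are placeholders, not the paper's handcrafted rules:

```python
# Illustrative pre-processing: stop word elimination on the English side and
# rule-based suffix separation on the Malayalam side. Both rule sets below
# are invented placeholders for the paper's actual rules.
EN_STOPWORDS = {"the", "a", "an", "of", "to", "is", "in"}

def remove_stopwords(tokens):
    return [t for t in tokens if t.lower() not in EN_STOPWORDS]

def separate_suffix(word, suffixes=("yil", "kal")):  # placeholder suffixes
    for suf in suffixes:
        if word.endswith(suf) and len(word) > len(suf) + 2:
            return [word[:-len(suf)], "+" + suf]  # stem plus split-off suffix
    return [word]

english = "the boy is in the garden".split()
print(remove_stopwords(english))        # ['boy', 'garden']
print(separate_suffix("maratthil"))     # hypothetical romanized token
```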
Abstract:
Anticipating the future increase in video information, news archiving is an important activity in the visual media industry. As the volume of archives increases, it will be difficult for journalists to find the appropriate content using current search tools. This paper provides the details of the study we conducted on the news extraction systems used in different news channels in Kerala. Semantic web technologies can be used effectively here, since news archiving shares many of the characteristics and problems of the WWW. Since the visual news archives of different media resources follow different metadata standards, interoperability between the resources is also an issue. The World Wide Web Consortium has proposed a draft of an ontology framework for media resources which addresses these interoperability issues. In this paper, the W3C-proposed framework and its drawbacks are also discussed.
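To illustrate what annotation with the W3C Ontology for Media Resources might look like, a minimal sketch in Python with rdflib; the resource URI and values are invented, and the property names, while drawn from the ontology's published namespace, should be checked against the specification:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Illustrative annotation of a news clip with the W3C Ontology for Media
# Resources (namespace http://www.w3.org/ns/ma-ont#). All values below are
# invented examples, not entries from an actual archive.
MA = Namespace("http://www.w3.org/ns/ma-ont#")

g = Graph()
g.bind("ma", MA)
clip = URIRef("http://example.org/archive/news/clip-001")  # hypothetical URI

g.add((clip, RDF.type, MA.MediaResource))
g.add((clip, MA.title, Literal("Evening bulletin: assembly session")))
g.add((clip, MA.locator, Literal("http://example.org/media/clip-001.mp4")))

print(g.serialize(format="turtle"))  # interoperable RDF metadata record
```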
Abstract:
The tough competition in the global and national markets and new trends in consumerism have resulted in an increase in the volume of advertisements. Sometimes advertisers are successful in achieving their intended objectives with a particular advertisement and sometimes they are not. These factors have contributed to the decision-making problems of advertising agencies with regard to the selection of appropriate advertising strategies and tactics. The tough competition and the large volume of advertising confuse consumers and have even created doubts in their minds about the genuineness and reliability of manufacturers and products. These factors raise the question of the active role of the credibility element in advertising. The proposed study examines the effects of advertising credibility in consumer healthcare non-durable product advertising on communication effect, purchase behaviour and ad skepticism. This paper examines the need for the study of advertising credibility and reviews the advertising, consumer behaviour, credibility and healthcare theories which form a basis for the study. It identifies the different components and dimensions of advertising credibility and the importance of communication effect, purchase behaviour and ad skepticism. It also studies the relevance of credibility in consumer healthcare product advertising and suggests a theoretical framework for the proposed study.
Abstract:
This paper describes a novel framework for the automatic segmentation of primary tumors and their boundaries from brain MRIs using morphological filtering techniques. The method uses T2-weighted and T1 FLAIR images. This approach is very simple, more accurate and less time consuming than existing methods. The method was tested on fifty patients with different tumor types, shapes, image intensities and sizes, and produced better results. The results were validated against ground-truth images provided by the radiologist. Segmentation of the tumor and boundary detection are important because they can be used for surgical planning, treatment planning, textural analysis, 3-dimensional modeling and volumetric analysis.
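A generic sketch of morphological-filtering-based segmentation in Python with scikit-image, in the spirit of the pipeline described; the random array stands in for a real T2-weighted slice, and the thresholding choice and structuring-element sizes are illustrative, not the paper's parameters:

```python
import numpy as np
from skimage import filters, morphology, segmentation

# Stand-in for a real T2-weighted MRI slice; a real pipeline would load
# registered image data instead of random values.
slice_t2 = np.random.rand(256, 256)

mask = slice_t2 > filters.threshold_otsu(slice_t2)          # initial threshold
mask = morphology.binary_opening(mask, morphology.disk(2))  # remove speckle
mask = morphology.binary_closing(mask, morphology.disk(2))  # fill small gaps
mask = morphology.remove_small_objects(mask, min_size=64)   # keep large blobs

boundary = segmentation.find_boundaries(mask, mode="inner")  # lesion contour
print("boundary pixels:", int(boundary.sum()))
```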
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and resulting land use patterns. An essential methodology to study and quantify such interactions is provided by the adoption of land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land use changes has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or by using a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and support for analyzing simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period from 1981 to 2002. In parallel, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. This case study showed that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map comparison algorithm capable of comparing a simulation result to a reference map. Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, and corresponding reference land-use maps were compiled for it. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
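A minimal sketch of calibration by genetic algorithm against a figure-of-merit objective, as described above, in Python with NumPy; the land-use "model" and all parameter choices are toy stand-ins for SITE and the STORMA model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Reference change map standing in for the compiled 1981-2002 land-use maps.
REFERENCE = rng.random((30, 30)) < 0.3

def simulate(params):
    """Toy land-use model: two parameters shape a random change pattern."""
    threshold, bias = params
    return (rng.random((30, 30)) * threshold + bias) > 0.5

def figure_of_merit(sim, ref):
    """FoM = hits / (hits + misses + false alarms), a standard map measure."""
    hits = np.sum(sim & ref)
    misses = np.sum(~sim & ref)
    false_alarms = np.sum(sim & ~ref)
    return hits / max(hits + misses + false_alarms, 1)

pop = rng.random((20, 2))                    # 20 candidate parameter vectors
for generation in range(30):
    scores = np.array([figure_of_merit(simulate(p), REFERENCE) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]  # selection: keep the best half
    children = parents + rng.normal(0, 0.05, parents.shape)  # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([figure_of_merit(simulate(p), REFERENCE) for p in pop])]
print("calibrated parameters:", best)
```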