896 results for New career models


Relevance: 30.00%

Abstract:

Abstract taken from the publication. With financial support from the MIDE department of UNED.

Relevance: 30.00%

Abstract:

The high level of realism and interaction in many computer graphics applications requires techniques for processing complex geometric models. First, we present a method that builds an accurate low-resolution approximation of a multi-chart textured model while guaranteeing geometric fidelity and correct preservation of the appearance attributes. Then, we introduce a mesh structure called Compact Model that approximates dense triangular meshes while preserving sharp features, allowing adaptive reconstructions and supporting textured models. Next, we design a new space deformation technique called *Cages, based on a multi-level system of cages, that preserves the smoothness of the mesh between neighbouring cages and is extremely versatile, allowing heterogeneous sets of coordinates and different levels of deformation. Finally, we propose a hybrid method that allows any deformation technique to be applied to large models, obtaining high-quality results with a reduced memory footprint and high performance.
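The cage-binding idea behind such deformation techniques can be illustrated with a minimal 2D sketch: each point is expressed as a weighted sum of cage vertices, and moving the cage re-evaluates that sum. The inverse-square-distance weights below are a simple stand-in for the heterogeneous coordinate sets (mean-value, harmonic, Green) a real cage system would use; all names here are illustrative.

```python
import numpy as np

def cage_weights(point, cage):
    # Generalized barycentric weights of `point` w.r.t. the cage vertices.
    # Inverse-square-distance weights are a stand-in for mean-value or
    # harmonic coordinates; they are positive and sum to one.
    d = np.linalg.norm(cage - point, axis=1)
    w = 1.0 / np.maximum(d, 1e-12) ** 2
    return w / w.sum()

def deform(points, cage, deformed_cage):
    # Bind each point to the rest cage, then re-evaluate the weighted sum
    # with the moved cage vertices.
    return np.array([cage_weights(p, cage) @ deformed_cage for p in points])

cage = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
pts = np.array([[0.5, 0.5], [0.25, 0.75]])
# Because the weights sum to one, translating the whole cage translates
# every bound point by the same amount.
moved = deform(pts, cage, cage + np.array([1.0, 0.0]))
```

A multi-level system of the kind the thesis describes would nest several such cages and blend their influences across cage boundaries.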

Relevance: 30.00%

Abstract:

This thesis presents population dynamics models that can be applied to predict the rate of spread of the Neolithic transition (the change from hunter-gathering to farming economies) across the European continent, which took place about 9000 to 5000 years ago. The first models in this thesis provide predictions at a continental scale. We develop population dynamics models with explicit kernels and apply realistic data. We also derive a new time-delayed reaction-diffusion equation which yields speeds about 10% slower than previous models. We also deal with regional variability: the slowdown of the Neolithic front when reaching the north of Europe. We develop simple reaction-diffusion models that can predict the measured speeds in terms of the non-homogeneous distribution of pre-Neolithic (Mesolithic) populations in Europe, which were present at higher densities in the north of the continent. Such models can explain the observed speeds.
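A standard point of comparison for this family of models is Fisher's classical front speed; time-delayed (hyperbolic) reaction-diffusion models slow it by a delay-dependent factor. One common form from this literature (illustrative of the model class, not necessarily the thesis's new equation) is:

```latex
% a: initial growth rate, D: diffusivity, T: time delay (generation time).
v_{\text{Fisher}} = 2\sqrt{aD},
\qquad
v_{\text{delayed}} = \frac{2\sqrt{aD}}{1 + aT/2}.
```

For $aT \ll 1$ the delayed speed reduces to Fisher's, which is why the correction appears as a modest percentage slowdown.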

Relevance: 30.00%

Abstract:

This doctoral thesis offers a quantitative and qualitative analysis of the changes in the urban shape and landscape of the Girona Counties between 1979 and 2006. The theoretical part of the research lies within the framework of the dispersed-city phenomenon, and is based on the hypothesis of convergence towards a global urban model. The empirical part demonstrates this proposition with a study of 522 zone development plans in the Girona Counties. The results point to the consolidation of the dispersed-city phenomenon, as shown by the sudden increase in built-up space, the spread of urban development throughout the territory, and the emergence of a new, increasingly generic landscape comprising three major morphological types: urban extensions, low-density residential estates and industrial zones. This reveals shortcomings in the planning of urban growth, a weakening of the city as a public project, and a certain degradation of the Mediterranean city model.

Relevance: 30.00%

Abstract:

A design tool is developed for the analysis of damage tolerance in composites. The tool can predict the onset and propagation of interlaminar cracks. It can also be used to assess and plan the need to repair or replace components during their service life. The model developed can be used to simulate both static and fatigue loads. The proposed model is a thermodynamically consistent damage model that allows delamination in composites under variable loads to be simulated. The model is formulated within the framework of Damage Mechanics, using cohesive zone models. A methodology is presented for determining the parameters of the constitutive model that allows coarser finite element meshes to be used than are typically possible. Finally, the model is also capable of simulating delamination produced by fatigue loads.
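The cohesive-zone ingredient of such a delamination model can be sketched as a bilinear traction-separation law with an irreversible damage variable: the interface loads elastically up to an onset opening, softens linearly to full decohesion, and unloads along a damaged secant. The stiffness and opening values below are illustrative, not calibrated parameters from the thesis.

```python
def bilinear_cohesive(openings, K=1e5, delta0=1e-4, deltaf=1e-3):
    # Bilinear cohesive law with an irreversible damage variable d in [0, 1].
    # K: initial penalty stiffness; delta0: damage-onset opening;
    # deltaf: opening at full decohesion. Illustrative values only.
    d = 0.0
    tractions = []
    for delta in openings:
        if delta > delta0:
            # Standard bilinear damage evolution; d only ever grows
            # (irreversibility, as thermodynamic consistency requires).
            d_trial = deltaf * (delta - delta0) / (delta * (deltaf - delta0))
            d = min(1.0, max(d, d_trial))
        tractions.append((1.0 - d) * K * delta)  # damaged secant stiffness
    return tractions, d

virgin, _ = bilinear_cohesive([1e-4])         # loaded just to onset: peak traction
damaged, d = bilinear_cohesive([5e-4, 1e-4])  # load past onset, then unload
```

Reloading after damage follows the reduced secant stiffness `(1 - d) * K`, which is what lets coarser meshes dissipate the correct fracture energy when the parameters are chosen accordingly.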

Relevance: 30.00%

Abstract:

In recent years, Artificial Intelligence has helped solve problems encountered in the tasks performed by computing units, whether the computers are distributed so as to interact with one another or operate in any other environment (Distributed Artificial Intelligence). Information Technologies make it possible to create novel solutions to specific problems by applying findings from diverse research areas. Our work is aimed at creating user models through a multidisciplinary approach, drawing on principles from psychology, distributed artificial intelligence and machine learning to build user models in open environments; one such environment is Ambient Intelligence based on user models with incremental, distributed learning functions (known as Smart User Models). Building on these user models, we direct this research towards acquiring the user characteristics that matter most and that determine the user's dominant scale of values in the topics of greatest interest to them, developing a methodology to obtain the user's Scale of Human Values with respect to their objective, subjective and emotional characteristics (particularly in Recommender Systems). One area that has received little attention is the inclusion of the scale of human values in information systems. Recommender systems, user models and information systems take into account only the user's preferences and emotions [Velásquez, 1996, 1997; Goldspink, 2000; Conte and Paolucci, 2001; Urban and Schmidt, 2001; Dal Forno and Merlone, 2001, 2002; Berkovsky et al., 2007c]. Therefore, the main focus of our research is the creation of a methodology for generating a scale of human values for the user from the user model.
We present results obtained from a case study using objective, subjective and emotional characteristics in the areas of banking and restaurant services, where the methodology proposed in this research was put to the test. The main contributions of this thesis are: the development of a methodology that, given a user model with objective, subjective and emotional attributes, obtains the user's Scale of Human Values. The proposed methodology is based on the use of existing applications, in which all the connections between users, agents and domains are characterized by these particularities and attributes; therefore, no extra effort is required from the user.

Relevance: 30.00%

Abstract:

The thesis centres on a preliminary analysis of the implications of the life-cycle theory applied to tourist destinations by the Canadian geographer Richard W. Butler (1982), and on a study of the meaning of the emergence of postmodernism and its repercussions for tourism, while also tracing the history that made mass sun-and-beach tourism possible. This is the theoretical basis needed to study empirically the destinations of Saltburn (United Kingdom), l'Estartit (Costa Brava) and Cayo Coco (Cuba), using the life cycle to write their history and to examine the tourism development and regeneration policies they have followed, and whether these can be described as post-Fordist. The main conclusion is that the life-cycle concept is limited to being an a posteriori diagnostic tool. As a prescriptive instrument, the life cycle does not work: it is specific to each destination, with stages and turning points that can only be established in retrospect. The life cycle is a very useful descriptive tool for understanding the evolution of tourist destinations and their markets, provided one is careful not to fall into false extrapolations or dangerous determinisms.

Relevance: 30.00%

Abstract:

Declining grassland breeding bird populations have led to increased efforts to assess habitat quality, typically by estimating density or relative abundance. Because some grassland habitats may function as ecological traps, a more appropriate metric for determining quality may be breeding success. Between 1994 and 2003 we gathered data on the nest fates of Eastern Meadowlarks (Sturnella magna), Bobolinks (Dolichonyx oryzivorus), and Savannah Sparrows (Passerculus sandwichensis) in a series of fallow fields and pastures/hayfields in western New York State. We calculated daily survival probabilities using the Mayfield method, and used the logistic-exposure method to model effects of predictor variables on nest success. Nest survival probabilities were 0.464 for Eastern Meadowlarks (n = 26), 0.483 for Bobolinks (n = 91), and 0.585 for Savannah Sparrows (n = 152). Fledge dates for first clutches ranged between 14 June and 23 July. Only one obligate grassland bird nest was parasitized by Brown-headed Cowbirds (Molothrus ater), for an overall brood parasitism rate of 0.004. Logistic-exposure models indicated that daily nest survival probabilities were higher in pastures/hayfields than in fallow fields. Our results, and those from other studies in the Northeast, suggest that properly managed cool season grassland habitats in the region may not act as ecological traps, and that obligate grassland birds in the region may have greater nest survival probabilities, and lower rates of Brown-headed Cowbird parasitism, than in many parts of the Midwest.
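The Mayfield estimate used above is straightforward to compute: the daily survival rate (DSR) is one minus failures per exposure-day, and survival over the whole nesting period is the DSR raised to the period length. The numbers below are invented for illustration and are not the study's data.

```python
def mayfield_dsr(failed_nests, exposure_days):
    # Mayfield daily survival rate: failures divided by the total
    # nest-days over which nests were actually observed (exposure).
    return 1.0 - failed_nests / exposure_days

def nest_success(dsr, period_days):
    # Probability a nest survives the entire nesting period.
    return dsr ** period_days

# Hypothetical numbers: 12 failed nests over 400 exposure-days,
# with a 24-day laying-to-fledging period.
dsr = mayfield_dsr(12, 400.0)   # 0.97
p = nest_success(dsr, 24)       # ~0.48
```

Exposure-day accounting is what distinguishes Mayfield estimates from apparent success (fledged nests over nests found), which is biased high because failed nests are less likely to be discovered.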

Relevance: 30.00%

Abstract:

A new algorithm is described for refining the pose of a model of a rigid object, to conform more accurately to the image structure. Elemental 3D forces are considered to act on the model. These are derived from directional derivatives of the image local to the projected model features. The convergence properties of the algorithm are investigated and compared with a previous technique. Its use in a video sequence of a cluttered outdoor traffic scene is also illustrated and assessed.

Relevance: 30.00%

Abstract:

This workshop paper reports recent developments to a vision system for traffic interpretation which relies extensively on the use of geometrical and scene context. Firstly, a new approach to pose refinement is reported, based on forces derived from prominent image derivatives found close to an initial hypothesis. Secondly, a parameterised vehicle model is reported, able to represent different vehicle classes. This general vehicle model has been fitted to sample data, and subjected to a Principal Component Analysis to create a deformable model of common car types having 6 parameters. We show that the new pose recovery technique is also able to operate on the PCA model, to allow the structure of an initial vehicle hypothesis to be adapted to fit the prevailing context. We report initial experiments with the model, which demonstrate significant improvements to pose recovery.

Relevance: 30.00%

Abstract:

A new formulation of a pose refinement technique using "active" models is described. An error term derived from the detection of image derivatives close to an initial object hypothesis is linearised and solved by least squares. The method is particularly well suited to problems involving external geometrical constraints (such as the ground-plane constraint). We show that the method is able to recover both the pose of a rigid model and the structure of a deformable model. We report an initial assessment of the performance and cost of pose and structure recovery using the active model, in comparison with our previously reported "passive" model-based techniques, in the context of traffic surveillance. The new method is more stable and requires fewer iterations, especially when the number of free parameters increases, but shows somewhat poorer convergence.
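The linearise-and-solve step described above is a Gauss-Newton iteration: linearise the error term around the current parameters and take the least-squares step. The sketch below applies it to a toy translation-recovery problem standing in for the image-derivative error; the setup and names are illustrative, not the paper's formulation.

```python
import numpy as np

def gauss_newton(residual, jacobian, theta, iters=10):
    # Linearise the error term and solve each step by least squares:
    # theta <- theta - argmin_s || J s - r ||^2
    for _ in range(iters):
        r = residual(theta)
        J = jacobian(theta)
        step, *_ = np.linalg.lstsq(J, r, rcond=None)
        theta = theta - step
    return theta

# Toy stand-in for the image-derived error: recover the 2D translation t
# that aligns model points to observations (observations = model + [1, 2]).
model = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
obs = model + np.array([1.0, 2.0])

def residual(t):
    return (model + t - obs).ravel()

def jacobian(t):
    # d(residual)/dt is a 2x2 identity per point, stacked vertically.
    return np.tile(np.eye(2), (len(model), 1))

t_hat = gauss_newton(residual, jacobian, np.zeros(2))
```

In the pose case the parameter vector would hold pose (and structure) parameters, and external constraints such as the ground plane simply remove columns from the Jacobian.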

Relevance: 30.00%

Abstract:

A new technique is described for the analysis of cloud-resolving model simulations, which allows one to investigate the statistics of the lifecycles of cumulus clouds. Clouds are tracked from timestep to timestep within the model run. This allows for a very simple method of tracking, but one which is both comprehensive and robust. An approach for handling cloud splits and mergers is described which allows clouds with simple and complicated time histories to be compared within a single framework. This is found to be important for the analysis of an idealized simulation of radiative-convective equilibrium, in which the moist, buoyant updrafts (i.e., the convective cores) were tracked. Around half of all such cores were subject to splits and mergers during their lifecycles. For cores without any such events, the average lifetime is 30 min, but events can lengthen the typical lifetime considerably.
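Timestep-to-timestep tracking with split/merge handling can be sketched directly: represent each cloud as the set of grid cells it occupies, link clouds in consecutive timesteps that share cells, and flag one-to-many links as splits or mergers. All names here are illustrative, not the paper's implementation.

```python
def link_clouds(prev, curr):
    # `prev` and `curr` map cloud id -> set of occupied grid cells at two
    # consecutive timesteps. A link exists wherever the cell sets overlap.
    edges = [(p, c) for p, cells_p in prev.items()
                    for c, cells_c in curr.items() if cells_p & cells_c]
    # One previous cloud linked to several current clouds: a split.
    splits = {p for p in prev if sum(1 for e in edges if e[0] == p) > 1}
    # Several previous clouds linked to one current cloud: a merger.
    mergers = {c for c in curr if sum(1 for e in edges if e[1] == c) > 1}
    return edges, splits, mergers

# One cloud splitting into two between timesteps:
prev = {1: {(0, 0), (0, 1), (0, 2)}}
curr = {1: {(0, 0)}, 2: {(0, 2), (0, 3)}}
edges, splits, mergers = link_clouds(prev, curr)
```

Chaining these links across all timesteps yields each cloud's lifecycle graph, from which lifetime statistics for simple and complicated histories can be compared in one framework.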

Relevance: 30.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them also involves complicated workflows implemented as shell scripts. A new grid middleware system that is well suited to climate modelling applications is presented in this paper. Grid Remote Execution (G-Rex) allows climate models to be deployed as Web services on remote computer systems and then launched and controlled as if they were running on the user's own computer. Output from the model is transferred back to the user while the run is in progress to prevent it from accumulating on the remote system and to allow the user to monitor the model. G-Rex has a REST architectural style, featuring a Java client program that can easily be incorporated into existing scientific workflow scripts. Some technical details of G-Rex are presented, with examples of its use by climate modellers.

Relevance: 30.00%

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they were running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) Output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) The client component is a command-line program that can easily be incorporated into existing model workflow scripts.
G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat, and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre’s HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world’s coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid’s Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with “mpirun”) are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run.
(5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
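The transfer-during-run behaviour described above can be sketched in a few lines. Local temporary directories stand in for the remote compute resource and the user's machine, and the function name is invented for illustration; this is a sketch of the idea, not the G-Rex protocol or API.

```python
import os
import shutil
import tempfile

def drain_outputs(remote_dir, local_dir):
    # Mimic G-Rex's transfer-during-run idea: copy each output file back to
    # the user's machine and delete it from the "remote" side, so output
    # never accumulates there and the user can monitor the run as it goes.
    moved = []
    for name in sorted(os.listdir(remote_dir)):
        shutil.copy(os.path.join(remote_dir, name),
                    os.path.join(local_dir, name))
        os.remove(os.path.join(remote_dir, name))
        moved.append(name)
    return moved

remote = tempfile.mkdtemp()   # stands in for the remote compute resource
local = tempfile.mkdtemp()    # stands in for the user's own machine
for i in range(3):            # the "model" writes output files as it runs
    with open(os.path.join(remote, "out%d.nc" % i), "w") as f:
        f.write("model output")
moved = drain_outputs(remote, local)
```

In the real system a polling loop would call this repeatedly while the job runs, which is what keeps 50,000-file NEMO runs from filling the remote disk.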

Relevance: 30.00%

Abstract:

Despite its relevance to a wide range of technological and fundamental areas, a quantitative understanding of protein surface clustering dynamics is often lacking. In inorganic crystal growth, surface clustering of adatoms is well described by diffusion-aggregation models. In such models, the statistical properties of the aggregate arrays often reveal the molecular-scale aggregation processes. We investigate the potential of these theories to reveal hitherto hidden facets of protein clustering by carrying out concomitant observations of lysozyme adsorption onto mica surfaces, using atomic force microscopy, and Monte Carlo simulations of cluster nucleation and growth. We find that lysozyme clusters diffuse across the substrate at a rate that varies inversely with size. This result suggests which molecular-scale mechanisms are responsible for the mobility of the proteins on the substrate. In addition, the surface diffusion coefficient of the monomer can also be extracted from the comparison between experiments and simulations. While concentrating on a model system of lysozyme-on-mica, this 'proof of concept' study successfully demonstrates the potential of our approach to understand and influence more biomedically applicable protein-substrate couples.
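The size-dependent mobility described above can be illustrated with a toy 1D diffusion-aggregation Monte Carlo, in which a cluster of size s attempts a unit hop accepted with probability 1/s and coalesces with anything it lands on. This is a sketch of the model class, not the paper's simulation, and all names are illustrative.

```python
import random

def step(clusters, rng):
    # One Monte Carlo sweep over a 1D lattice. `clusters` maps lattice
    # position -> cluster size. Mobility is inversely proportional to size:
    # a size-s cluster's hop is accepted with probability 1/s.
    for pos in list(clusters):
        size = clusters.get(pos)
        if size is None:          # this cluster was absorbed earlier in the sweep
            continue
        if rng.random() < 1.0 / size:
            new = pos + rng.choice((-1, 1))
            clusters.pop(pos)
            # Landing on an occupied site coalesces the two clusters
            # (diffusion-aggregation).
            clusters[new] = clusters.get(new, 0) + size
    return clusters

rng = random.Random(0)
clusters = {0: 1, 1: 1, 10: 4}    # two adjacent monomers and one size-4 cluster
for _ in range(100):
    step(clusters, rng)
# Total mass is conserved throughout; the size-4 cluster accepts hops only a
# quarter as often as a monomer, mirroring the rate ~ 1/size observed by AFM.
```

Comparing the statistics of such simulated aggregate arrays with the observed ones is what allows the monomer surface diffusion coefficient to be extracted.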