902 results for Unstructured Grids


Relevance:

10.00%

Publisher:

Abstract:

For the manager, decision-making is one of the greatest challenges and responsibilities, since it requires identifying the most appropriate path among countless alternatives while taking into account the social, political and economic obstacles of the business environment. To reach the right decision, the stated objectives and goals must not be lost from view, and the logical process must be kept in mind, detecting, analysing and demonstrating the reasons behind the choice. Consequently, the analysis proposed in this research will give the manager knowledge about the types of logic used in strategic decision-making, so as to satisfy the demands associated with marketing and thereby efficiently generate and broaden the competencies the manager needs for international insertion into an ever-larger labour market (Valero, 2011). Throughout the research, the aim is to develop a theoretical study that explains the relationship between logic and strategic marketing decision-making and how these concepts combine to reach a final result. This will be carried out through an analysis of marketing plans, starting from basic concepts such as marketing, logic, strategic decisions and marketing management, followed by the logical principles and the contradictions that can arise between the theoretical foundation

Relevance:

10.00%

Publisher:

Abstract:

To study personal and group perceptions of assumed norms and values, stereotyped images and beliefs, crystallised social codes and schemas (individual and group attitudes), and particular routes and trajectories in relation to juvenile out-group violence. To validate qualitatively the Theory of Planned Behaviour and, where appropriate, to propose theoretical modifications whose explanatory power can be tested. The sample comprised 19 young people (17 male, 2 female) aged between 15 and 25 and resident in Madrid. All of them met the selection criteria: during the previous year they had physically assaulted, on two or more occasions and as members of a group, one or more people belonging to other groups. In-depth, semi-structured individual interviews with a low level of directivity were conducted over two sessions. The protocol for the first session included the variables and factors most frequently cited in the specialised literature, structured on three levels: macrosocial (culture and socio-economics), mesosocial (social identity) and microsocial (personal identity and the phenomenology of the behaviour). For the second session, a personalised interview guide was drawn up from the analysis of the first interview, with a common part and a part adapted to the particular circumstances of the informant and his or her group. The interviews were recorded and later transcribed. A content analysis of the resulting material was carried out to operationalise the variables included in the causal model developed. A mixed qualitative analysis procedure was also developed with the aim of formulating, refuting and modifying hypotheses about violent out-group behaviour. The subjects' statements were categorised and coded with the ARS-NUDIST program (Non-Numerical Unstructured Data Indexing Searching and Theorizing). Interviews were used. The processual and systemic character of the acquisition and evolution of juvenile out-group violence was evidenced. It should not be inferred from this premise that the behaviour results on every occasion from the influence of all the factors analysed; rather, a progressive, convergent influence of macro-, meso- and microsocial variables is postulated, which predisposes the young person to conform to violence and then, on occasion, to internalise it as a basic element of his or her social and/or personal identity. Social support and individual and social self-esteem are concentrated mainly in the peer group, which appears to replace or complement the scarce or inadequate socialising influence of other people, groups and institutions; conversely, loneliness and isolation are valued very negatively. Preventive and intervention programmes should avoid the temptation to psychologise the problem by focusing on the individual, and should instead, or in addition, foster the development of prosocial norms and behaviours that can be generalised to different settings. The main objective of programmes to reduce out-group violence is the promotion of positive personal and social identities through socially valued behaviour.

Relevance:

10.00%

Publisher:

Abstract:

To determine whether single-sex secondary schools offer an alternative to coeducation that is more effective in achieving an optimal, integral education. The sample comprised 1,532 second- and third-year ESO pupils aged between 13 and 15. The schools belong to three autonomous communities (Madrid, Catalonia and Murcia): 12 schools in total, 5 mixed, 3 girls' and 4 boys' schools. Whole classes were selected, attempting to match the groups to be compared on socioeconomic level, age, school year and sex. Each pupil completed the following questionnaires: a self-esteem questionnaire (Coopersmith, 1986); the Risk Behaviour Questionnaire (González Molina, 2006); the Values Scale (Mitchell, 1984); and the Eating Disorder Inventory (EDI; Garner et al., 1983). A descriptive and correlational analysis of the results was performed, looking for significant differences between the samples. The qualitative analysis was based on a series of semi-structured interviews with second- and third-year ESO pupils, teachers of those same years and parents of the pupils; 48 subjects were interviewed in total. The interviews were analysed with the NUDIST program (Non-numerical Unstructured Data, Indexing, Searching and Theorising) for processing qualitative data, a program that goes beyond the filing-cabinet model limited to coding and retrieving text. The results of the quantitative and qualitative analyses are compared. Coeducation favours boys more, although this finding cannot be related to academic success. The lowest self-concept is found among girls in coeducational schools, with differences with respect to girls educated in single-sex schools, from which it can be deduced that the problems adolescent girls face at this stage of their lives are heightened in coeducational schools. Pupils in single-sex schools behave more sincerely, more readily regret what they consider they have done wrong, and are more able to acknowledge that they would like to change aspects of their character. With regard to values, single-sex pupils differ clearly from coeducated pupils in scoring higher on religious values and lower on hedonistic and individualistic values. From a methodological point of view, it is very difficult in Spain to find single-sex schools that are non-denominational. Private boys' schools rank second on religious values but first in valuing social status. There is greater fear of maturity in coeducational schools and greater perfectionism in single-sex schools. Personal and ethical aspects must form an integral part of education, since it is the individual's autonomy that best expresses his or her rationality. The approach of this research was intended to confront the established prejudices against single-sex education, which may well be an effective option worth advocating, even if only on an experimental basis.

Relevance:

10.00%

Publisher:

Abstract:

The work developed in this thesis presents an in-depth study and provides innovative solutions in the field of recommender systems. The methods these systems use to generate recommendations, such as Content-Based Filtering (CBF), Collaborative Filtering (CF) and Knowledge-Based Filtering (KBF), require information about users in order to predict their preferences for certain products. This information may be demographic (gender, age, address, etc.), ratings given to products bought in the past, or information about their interests. There are two ways to obtain this information: users provide it explicitly, or the system acquires the implicit information available in users' transactions or search histories. For example, the movie recommender MovieLens (http://movielens.umn.edu/login) asks users to rate at least 15 movies on a scale from * to ***** (awful, ..., must see), and generates recommendations based on these ratings. When users are not registered and the system has no information about them, some systems make recommendations based on the browsing history. Amazon.com (http://www.amazon.com) makes recommendations based on the searches a user has made, or recommends the best-selling product. However, these systems suffer from a certain lack of information. This problem is generally solved by acquiring additional information, either by asking users about their interests or by seeking the information in additional sources. The solution proposed in this thesis is to look for this information in several sources, specifically those containing implicit information about user preferences. These sources can be structured, such as databases of purchase records, or unstructured, such as web pages where users leave their opinion about a product they bought or own. Three fundamental problems must be solved to achieve this goal: 1. The identification of sources with information suitable for recommender systems. 2. The definition of criteria that allow the most suitable sources to be compared and selected. 3. The retrieval of information from unstructured sources. To this end, the thesis develops: 1. A methodology for identifying and selecting the most suitable sources; criteria based on the characteristics of the sources and a trust measure are used to solve the identification and selection problem. 2. A mechanism for retrieving the unstructured user information available on the web; text-mining techniques and ontologies are used to extract the information and structure it appropriately so that recommenders can use it. The contributions of the work developed in this doctoral thesis are: 1. The definition of a set of characteristics for classifying sources relevant to recommender systems. 2. The development of a source relevance measure computed from the defined characteristics. 3. The application of a trust measure to obtain the most reliable sources, where trust is defined from the perspective of improving the recommendation: a trustworthy source is one that improves the recommendations. 4. The development of an algorithm that selects, from a set of candidate sources, the most relevant and reliable ones using the measures described in the previous points. 5. The definition of an ontology for structuring the information about user preferences that is available on the Internet. 6. The creation of a mapping process that automatically extracts information about user preferences available on the web and places it in the ontology. These contributions achieve two important goals: 1. Improving recommendations by using alternative information sources that are relevant and reliable. 2. Obtaining implicit user information available on the Internet.
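To make the selection step concrete, here is a minimal Python sketch of relevance- and trust-based source selection in the spirit of contributions 2-4. The source characteristics, the weights, and the way trust is computed from a recommendation-error baseline are illustrative assumptions, not values taken from the thesis.

```python
from dataclasses import dataclass

# Hypothetical characteristics of a candidate information source; the actual
# feature set defined in the thesis is not listed in the abstract above.
@dataclass
class Source:
    name: str
    coverage: float      # fraction of catalogue items the source mentions (0..1)
    user_overlap: float  # fraction of system users identifiable in the source (0..1)
    freshness: float     # recency of the information (0..1)
    structured: bool     # structured database vs. unstructured web pages

def relevance(src: Source, weights=(0.4, 0.4, 0.2)) -> float:
    """Relevance computed as a weighted sum of source characteristics."""
    w_cov, w_usr, w_fresh = weights
    return w_cov * src.coverage + w_usr * src.user_overlap + w_fresh * src.freshness

def trust(baseline_error: float, error_with_source: float) -> float:
    """Trust from the recommendation-improvement perspective: a source is
    trustworthy if adding it reduces the recommendation error."""
    return max(0.0, (baseline_error - error_with_source) / baseline_error)

def select_sources(candidates, errors, baseline_error, min_trust=0.05, top_k=3):
    """Keep sources whose trust exceeds a threshold, ranked by relevance."""
    trusted = [s for s in candidates if trust(baseline_error, errors[s.name]) >= min_trust]
    return sorted(trusted, key=relevance, reverse=True)[:top_k]

if __name__ == "__main__":
    candidates = [
        Source("purchase_db", 0.9, 0.8, 0.6, True),
        Source("review_pages", 0.5, 0.3, 0.9, False),
    ]
    errors = {"purchase_db": 0.70, "review_pages": 0.78}  # error with each source added
    print([s.name for s in select_sources(candidates, errors, baseline_error=0.80)])
```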

Relevance:

10.00%

Publisher:

Abstract:

Objective: To identify aspects that affect the quality of life of nursing caregivers and their relationship with care in an Intensive Care Unit for Adults (A-ICU). Methods: This was a descriptive study with a qualitative approach, whose subjects were the 21 professionals who constitute the nursing staff of the A-ICU of a teaching hospital in Maringá-PR. Unstructured interviews were used to collect data between May and June 2009. Data analysis was based on the content analysis method. The categories identified were: overlooking improvement in quality of life related to the resources in an A-ICU; quality of life influencing the way of caring; and interpersonal relationships within the health team reflected in quality of life and care. Results: The analysis of the caregivers' statements and the results of the observation showed a correlation between the aspects they consider influential in their quality of life and the way they care for patients in an A-ICU. Conclusion: The findings indicate that, among the influential aspects, the stressful factors outweigh the enhancing ones. From this perspective, addressing the caregivers' suffering might be the starting point for improving the quality of care in an A-ICU.

Relevance:

10.00%

Publisher:

Abstract:

Childhood obesity is an epidemic affecting thousands of children around the world, and advertising has been identified by numerous authors as the main cause of this global problem. The objective of this study was to analyse whether advertising has any effect on childhood obesity, given how marketing strategies stimulate the consumption of a wide variety of high-calorie foods. To frame the study, we sought to identify the role of advertising in children's lives and in their leisure activities, the influence of lifestyle on physical activity, and, finally, the role that advertising, leisure time and lifestyle play in childhood obesity. The investigation is a qualitative exploratory case study in which individual structured and unstructured interviews were used, as well as group interviews using the focus group technique, at a private nursery school in the municipality of Cascais. The sample comprised 12 children aged between 2 and 5, and 10 parents or guardians (3 men and 7 women) aged between 27 and 45. A nursery teacher and the nursery's director were also included in the sample, which made it possible to establish relationships between the various variables under study. The results indicate, first, that children in this age group are exposed to a very large number of advertising messages and that advertising is already an integral part of their lives. We also found that some parents try to control the number of hours of exposure to advertising on television, as well as the channels their children watch, but fail to do so because children come into contact with advertising through a wide variety of media. Children were thus found to already recognise many brands, packages and their mascots, and therefore to ask for them to be bought. Packaging reinforced its importance in this study, since in some cases it is the main attraction at the point of sale regardless of the type of product. Regarding leisure time, there is a pattern in which children are more active in the summer and at weekends, owing to their parents' availability, whereas for the rest of the time sedentary behaviour is very high and exposure to television is very marked. Economic, social and technological factors emerge as the main influences on lifestyle. In conclusion, we consider childhood obesity to be multifaceted: advertising is not its main driver, but merely one more factor that may aggravate the situation of many children when adequate family supervision is lacking.

Relevance:

10.00%

Publisher:

Abstract:

The impact of humidity observations on forecast skill is explored by producing a series of global forecasts using initial data derived from the ERA-40 reanalysis system, in which all humidity data have been removed during the data assimilation. The new forecasts have been compared with the original ERA-40 analyses and with forecasts made from them. Both sets of forecasts show virtually identical prediction skill in the extratropics and the tropics. Differences between the forecasts are small and grow at a characteristic amplification rate. There are larger differences in temperature and geopotential in the tropics, but these differences are small-scale and unstructured and have no noticeable effect on the skill of the wind forecasts. The results highlight the currently very limited impact on the forecasts of the humidity observations used to produce the initial state.

Relevance:

10.00%

Publisher:

Abstract:

Flood modelling of urban areas is still at an early stage, partly because until recently topographic data of sufficiently high resolution and accuracy have been lacking in urban areas. However, Digital Surface Models (DSMs) generated from airborne scanning laser altimetry (LiDAR) having sub-metre spatial resolution have now become available, and these are able to represent the complexities of urban topography. The paper describes the development of a LiDAR post-processor for urban flood modelling based on the fusion of LiDAR and digital map data. The map data are used in conjunction with LiDAR data to identify different object types in urban areas, though pattern recognition techniques are also employed. Post-processing produces a Digital Terrain Model (DTM) for use as model bathymetry, and also a friction parameter map for use in estimating spatially-distributed friction coefficients. In vegetated areas, friction is estimated from LiDAR-derived vegetation height, and (unlike most vegetation removal software) the method copes with short vegetation less than ~1m high, which may occupy a substantial fraction of even an urban floodplain. The DTM and friction parameter map may also be used to help to generate an unstructured mesh of a vegetated urban floodplain for use by a 2D finite element model. The mesh is decomposed to reflect floodplain features having different frictional properties to their surroundings, including urban features such as buildings and roads as well as taller vegetation features such as trees and hedges. This allows a more accurate estimation of local friction. The method produces a substantial node density due to the small dimensions of many urban features.
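The fusion step that turns object classes and LiDAR vegetation height into a friction parameter map can be sketched as follows. This is a hedged illustration only: the object classes, the height-to-Manning's-n relation, and the coefficient values are placeholders, not the calibrated values used by the post-processor described above.

```python
import numpy as np

# Illustrative Manning's n values per object class; the paper derives its own
# spatially distributed coefficients, so these numbers are placeholders only.
BASE_N = {"road": 0.02, "building": 0.03}

def friction_map(object_class: np.ndarray, veg_height: np.ndarray) -> np.ndarray:
    """Build a friction parameter map from a classified raster (map/LiDAR fusion)
    and LiDAR-derived vegetation height in metres."""
    n = np.full(object_class.shape, BASE_N["road"], dtype=float)
    n[object_class == "building"] = BASE_N["building"]
    veg = object_class == "vegetation"
    # Assumed monotone relation: friction grows with vegetation height, with a
    # floor so that short (<1 m) vegetation still contributes, and a cap for trees.
    n[veg] = np.clip(0.03 + 0.05 * veg_height[veg], 0.03, 0.15)
    return n

if __name__ == "__main__":
    classes = np.array([["road", "vegetation"], ["vegetation", "building"]], dtype=object)
    heights = np.array([[0.0, 0.4], [6.0, 0.0]])
    print(friction_map(classes, heights))
```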

Relevance:

10.00%

Publisher:

Abstract:

The P-1(NC)-P-1 and RT0 finite element schemes are among the most promising low-order elements for use in unstructured mesh marine and lake models. They are both free of spurious elevation modes, have good dispersive properties and have a relatively low computational cost. In this paper, we derive both finite element schemes in the same unified framework and discuss their respective qualities in terms of conservation, consistency, propagation factor and convergence rate. We also highlight the impact that the placement of the local variables can have on the model solution. The main conclusion we can draw is that the choice between the elements is highly application dependent. We suggest that the P-1(NC)-P-1 element is better suited to purely hydrodynamical applications, while the RT0 element might perform better for hydrological applications that require scalar transport calculations.
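For reference, dispersion and propagation-factor analyses of element pairs such as these are usually carried out on the linearized rotating shallow-water equations. The abstract does not spell them out, but a standard form is

```latex
% Linearized rotating shallow-water equations (f-plane), the usual setting for
% dispersion / propagation-factor analyses of finite element pairs:
\begin{aligned}
  \frac{\partial \mathbf{u}}{\partial t} + f\,\mathbf{k}\times\mathbf{u}
      &= -g\,\nabla \eta,\\
  \frac{\partial \eta}{\partial t} + H\,\nabla\cdot\mathbf{u} &= 0 ,
\end{aligned}
```

where u is the depth-averaged velocity, η the surface elevation, H the mean depth, f the Coriolis parameter and g the gravitational acceleration.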

Relevance:

10.00%

Publisher:

Abstract:

Compute grids are used widely in many areas of environmental science, but there has been limited uptake of grid computing by the climate modelling community, partly because the characteristics of many climate models make them difficult to use with popular grid middleware systems. In particular, climate models usually produce large volumes of output data, and running them usually involves complicated workflows implemented as shell scripts. For example, NEMO (Smith et al. 2008) is a state-of-the-art ocean model that is currently used for operational ocean forecasting in France, and will soon be used in the UK for both ocean forecasting and climate modelling. On a typical modern cluster, a one-year global ocean simulation at 1-degree resolution takes about three hours when running on 40 processors, and produces roughly 20 GB of output as 50000 separate files. 50-year simulations are common, during which the model is resubmitted as a new job after each year. Running NEMO relies on a set of complicated shell scripts and command utilities for data pre-processing and post-processing prior to job resubmission. Grid Remote Execution (G-Rex) is a pure Java grid middleware system that allows scientific applications to be deployed as Web services on remote computer systems, and then launched and controlled as if they are running on the user's own computer. Although G-Rex is general-purpose middleware, it has two key features that make it particularly suitable for remote execution of climate models: (1) Output from the model is transferred back to the user while the run is in progress, to prevent it from accumulating on the remote system and to allow the user to monitor the model; (2) The client component is a command-line program that can easily be incorporated into existing model workflow scripts. G-Rex has a REST (Fielding, 2000) architectural style, which allows client programs to be very simple and lightweight and allows users to interact with model runs using only a basic HTTP client (such as a Web browser or the curl utility) if they wish. This design also allows new client interfaces to be developed in other programming languages with relatively little effort. The G-Rex server is a standard Web application that runs inside a servlet container such as Apache Tomcat and is therefore easy for system administrators to install and maintain. G-Rex is employed as the middleware for the NERC Cluster Grid, a small grid of HPC clusters belonging to collaborating NERC research institutes. Currently the NEMO (Smith et al. 2008) and POLCOMS (Holt et al., 2008) ocean models are installed, and there are plans to install the Hadley Centre's HadCM3 model for use in the decadal climate prediction project GCEP (Haines et al., 2008). The science projects involving NEMO on the Grid have a particular focus on data assimilation (Smith et al. 2008), a technique that involves constraining model simulations with observations. The POLCOMS model will play an important part in the GCOMS project (Holt et al., 2008), which aims to simulate the world's coastal oceans. A typical use of G-Rex by a scientist to run a climate model on the NERC Cluster Grid proceeds as follows: (1) The scientist prepares input files on his or her local machine. (2) Using information provided by the Grid's Ganglia monitoring system, the scientist selects an appropriate compute resource. (3) The scientist runs the relevant workflow script on his or her local machine. This is unmodified except that calls to run the model (e.g. with "mpirun") are simply replaced with calls to "GRexRun". (4) The G-Rex middleware automatically handles the uploading of input files to the remote resource, and the downloading of output files back to the user, including their deletion from the remote system, during the run. (5) The scientist monitors the output files, using familiar analysis and visualization tools on his or her own local machine. G-Rex is well suited to climate modelling because it addresses many of the middleware usability issues that have led to limited uptake of grid computing by climate scientists. It is a lightweight, low-impact and easy-to-install solution that is currently designed for use in relatively small grids such as the NERC Cluster Grid. A current topic of research is the use of G-Rex as an easy-to-use front-end to larger-scale Grid resources such as the UK National Grid service.
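Step (3) above is the only change made to an existing workflow; a toy Python sketch of that substitution is shown below. The script names are hypothetical, and because the abstract does not describe GRexRun's argument syntax, only the command name is swapped.

```python
import re
from pathlib import Path

def grexify(script_path: str, out_path: str) -> None:
    """Copy a model workflow script, replacing calls to 'mpirun' with 'GRexRun'.
    Everything else in the script is left unmodified, as described in step (3)."""
    text = Path(script_path).read_text()
    # Replace the command name only; the rest of each command line is kept as-is,
    # since GRexRun's argument syntax is not described in the abstract.
    rewritten = re.sub(r"\bmpirun\b", "GRexRun", text)
    Path(out_path).write_text(rewritten)

if __name__ == "__main__":
    grexify("run_nemo.sh", "run_nemo_grex.sh")  # hypothetical script names
```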

Relevance:

10.00%

Publisher:

Abstract:

The performance of a 2D numerical model of flood hydraulics is tested for a major event in Carlisle, UK, in 2005. This event is associated with a unique data set, with GPS surveyed wrack lines and flood extent surveyed 3 weeks after the flood. The Simple Finite Volume (SFV) model is used to solve the 2D Saint-Venant equations over an unstructured mesh of 30000 elements representing channel and floodplain, and allowing detailed hydraulics of flow around bridge piers and other influential features to be represented. The SFV model is also used to corroborate flows recorded for the event at two gauging stations. Calibration of Manning's n is performed with a two stage strategy, with channel values determined by calibration of the gauging station models, and floodplain values determined by optimising the fit between model results and observed water levels and flood extent for the 2005 event. RMS error for the calibrated model compared with surveyed water levels is ~±0.4m, the same order of magnitude as the estimated error in the survey data. The study demonstrates the ability of unstructured mesh hydraulic models to represent important hydraulic processes across a range of scales, with potential applications to flood risk management.
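The two-stage calibration of Manning's n can be sketched as follows, with the channel value already fixed from the gauging-station models and the floodplain value found by minimising the RMS error against the surveyed levels. The model wrapper, the candidate values and the synthetic survey data are hypothetical stand-ins, not the SFV set-up used in the paper.

```python
import numpy as np

def rms_error(modelled: np.ndarray, surveyed: np.ndarray) -> float:
    """RMS difference between modelled and GPS-surveyed water levels (m)."""
    return float(np.sqrt(np.mean((modelled - surveyed) ** 2)))

def calibrate_floodplain_n(run_model, surveyed_levels, channel_n, candidates):
    """Stage two of the calibration: channel_n is already fixed from the gauging
    station models; search floodplain Manning's n for the best fit to the
    surveyed wrack-line levels. `run_model` is a hypothetical wrapper around
    the hydraulic model returning water levels at the survey points."""
    best_n, best_err = None, np.inf
    for n_fp in candidates:
        err = rms_error(run_model(channel_n, n_fp), surveyed_levels)
        if err < best_err:
            best_n, best_err = n_fp, err
    return best_n, best_err

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    surveyed = rng.normal(10.0, 0.5, size=50)                      # placeholder survey data
    fake_model = lambda n_ch, n_fp: surveyed + (n_fp - 0.06) * 5   # stand-in for the SFV model
    print(calibrate_floodplain_n(fake_model, surveyed, channel_n=0.03,
                                 candidates=np.arange(0.02, 0.12, 0.01)))
```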

Relevance:

10.00%

Publisher:

Abstract:

The particle size distributions of surface soils from two cultivated silty fields (Moorfield and Railway South) in Herefordshire, UK, were assessed by sampling on 20-m grids across the fields. Moorfield (8 ha) had a uniform landscape sloping mainly in a north-south direction, while Railway South (12 ha) had complex undulating landscape characteristics. Samples from 3 surficial layers were also taken at 3 landscape positions at Moorfield to investigate recent (within-season) soil particle redistribution. Size fractions were determined using chemical dispersion, wet sieving (to separate the sand fractions) and laser granulometry (for the finer fractions). The distribution of the various fractions and the relationships between elevation and the various fractions suggest preferential detachment and movement of coarse to very coarse silt fractions (16-63 µm), which were found mostly in downslope or depositional areas. Upper slope samples had higher clay to fine silt (< 16 µm) contents than bottom slope samples. The upslope-downslope patterns of size fractions in the 2 fields, particularly on uniformly sloping areas, were similar, and their deposited sediments were dominated by coarse silt fractions. Samples from the 3 landscape positions at Moorfield became coarser from the less eroded summit, through the eroding side-slope, to the bottom-slope depositional area. Within each of these landscape positions the top 0-2.5 cm layers were more enriched in coarse silt fractions than the bottom layers. The spatial patterns of soil particle size distributions in the 2 fields may be a result of sediment detachment and deposition caused by water erosion and tillage operations. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

In molecular biology, it is often desirable to find common properties in large numbers of drug candidates. One family of methods stems from the data mining community, where algorithms to find frequent graphs have received increasing attention over the past years. However, the computational complexity of the underlying problem and the large amount of data to be explored essentially render sequential algorithms useless. In this paper, we present a distributed approach to the frequent subgraph mining problem to discover interesting patterns in molecular compounds. This problem is characterized by a highly irregular search tree, for which no reliable workload prediction is available. We describe the three main aspects of the proposed distributed algorithm, namely, a dynamic partitioning of the search space, a distribution process based on a peer-to-peer communication framework, and a novel receiver-initiated load balancing algorithm. The effectiveness of the distributed method has been evaluated on the well-known National Cancer Institute's HIV-screening data set, where we were able to show close-to-linear speedup in a network of workstations. The proposed approach also allows for dynamic resource aggregation in a non-dedicated computational environment. These features make it suitable for large-scale, multi-domain, heterogeneous environments, such as computational grids.
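One way to picture the dynamic partitioning of such an irregular search tree is a donor splitting its local pool of unexplored nodes when a peer asks for work. The sketch below is a simplified illustration of that idea, not the partitioning scheme of the paper, and the job names are placeholders.

```python
from collections import deque

def split_pool(jobs: deque, keep_ratio: float = 0.5) -> deque:
    """Dynamically partition the local pool of unexplored search-tree nodes:
    the donor keeps part of the pool and hands the rest to a requesting peer.
    Donating the oldest (shallowest) nodes tends to give away large subtrees,
    which helps when subtree sizes are unpredictable."""
    n_keep = max(1, int(len(jobs) * keep_ratio))
    donated = deque()
    while len(jobs) > n_keep:
        donated.append(jobs.popleft())  # oldest (shallowest) nodes are donated
    return donated

if __name__ == "__main__":
    local_pool = deque(f"subtree_{i}" for i in range(8))  # placeholder jobs
    for_peer = split_pool(local_pool)
    print(list(for_peer), list(local_pool))
```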

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we present a distributed computing framework for problems characterized by a highly irregular search tree, for which no reliable workload prediction is available. The framework is based on a peer-to-peer computing environment and dynamic load balancing. The system allows for dynamic resource aggregation, does not depend on any specific meta-computing middleware and is suitable for large-scale, multi-domain, heterogeneous environments, such as computational Grids. Dynamic load balancing policies based on global statistics are known to provide optimal load balancing performance, while randomized techniques provide high scalability. The proposed method combines both advantages, adopting distributed job pools and a randomized polling technique. The framework has been successfully adopted in a parallel search algorithm for subgraph mining and evaluated on a molecular compounds dataset. The parallel application has shown good scalability and close-to-linear speedup in a distributed network of workstations.
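A receiver-initiated scheme with randomized polling can be sketched as a work-stealing loop in which an idle worker polls randomly chosen peers for jobs. The callbacks and the termination heuristic below are illustrative assumptions, not the framework's actual peer-to-peer protocol.

```python
import random

def worker_loop(my_pool, peers, mine_one_job, request_work):
    """Receiver-initiated load balancing with randomized polling: a worker
    processes its own job pool and, only when the pool runs dry, polls
    randomly chosen peers for a share of their jobs. `mine_one_job` and
    `request_work` are hypothetical callbacks standing in for the search
    step and the peer-to-peer communication layer."""
    idle_polls = 0
    while idle_polls < 3 * len(peers):       # crude stand-in for global termination
        if my_pool:
            mine_one_job(my_pool.pop())       # expand one search-tree node
            idle_polls = 0
        else:
            victim = random.choice(peers)     # randomized polling
            donated = request_work(victim)    # receiver initiates the transfer
            if donated:
                my_pool.extend(donated)
            else:
                idle_polls += 1
```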

Relevance:

10.00%

Publisher:

Abstract:

The P-1-P-1 finite element pair is known to allow the existence of spurious pressure (surface elevation) modes for the shallow water equations and to be unstable for mixed formulations. We show that this behavior is strongly influenced by the strong or the weak enforcement of the impermeability boundary conditions. A numerical analysis of the Stommel model is performed for both P-1-P-1 and P-1(NC)-P-1 mixed formulations. Steady and transient test cases are considered. We observe that the P-1-P-1 element exhibits stable discrete solutions with weak boundary conditions or with fully unstructured meshes. (c) 2005 Elsevier Ltd. All rights reserved.
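For readers unfamiliar with the distinction, the strong versus weak enforcement of impermeability can be illustrated on the weak form of the linear continuity equation; this generic statement is not necessarily the exact formulation analysed in the paper.

```latex
% Continuity equation tested against a scalar function \varphi and integrated
% by parts over the domain \Omega with boundary \partial\Omega:
\int_\Omega \frac{\partial \eta}{\partial t}\,\varphi \, d\Omega
  - \int_\Omega H\,\mathbf{u}\cdot\nabla\varphi \, d\Omega
  + \int_{\partial\Omega} H\,(\mathbf{u}\cdot\mathbf{n})\,\varphi \, ds = 0 .
% Strong enforcement: impose \mathbf{u}\cdot\mathbf{n} = 0 in the discrete
% velocity space itself, so the boundary integral vanishes identically.
% Weak enforcement: leave the velocity space unconstrained and drop the
% boundary integral, so \mathbf{u}\cdot\mathbf{n} = 0 holds only in an
% integral (weak) sense.
```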