902 results for Unstructured Grids
Abstract:
Fluid that fills boreholes in crosswell electrical resistivity investigations provides the necessary electrical contact between the electrodes and the rock formation, but it is also the source of image artifacts in standard inversions that do not account for the effects of the boreholes. The image distortions can be severe for large resistivity contrasts between the rock formation and borehole fluid and for large borehole diameters. We have carried out 3D finite-element modeling using an unstructured-grid approach to quantify the magnitude of borehole effects for different resistivity contrasts, borehole diameters, and electrode configurations. Relatively common resistivity contrasts of 100:1 and borehole diameters of 10 and 20 cm yielded, for a bipole length of 5 m, apparent resistivity underestimates of approximately 12% and 32% when using AB-MN configurations and apparent resistivity overestimates of approximately 24% and 95% when using AM-BN configurations. Effects are generally more severe at shorter bipole spacings. We report the results obtained by either including or ignoring the boreholes in inversions of 3D field data from a test site in Switzerland, where approximately 10,000 crosswell resistivity-tomography measurements were made across six acquisition planes among four boreholes. Inversions of raw data that ignored the boreholes filled with low-resistivity fluid paradoxically produced high-resistivity artifacts around the boreholes. Including correction factors based on the modeling results for a 1D model with and without the boreholes did not markedly improve the images. The only satisfactory approach was to use a 3D inversion code that explicitly incorporated the boreholes in the actual inversion. This new approach yielded an electrical resistivity image that was devoid of artifacts around the boreholes and that correlated well with coincident crosswell radar images.
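For reference, the apparent resistivity quoted in the AB-MN and AM-BN comparisons above is the measured voltage scaled by a geometric factor computed from the electrode positions. A minimal full-space sketch follows; the electrode coordinates and the homogeneous-medium self-check are illustrative assumptions, not taken from the survey described in the abstract:

```python
import math

def geometric_factor(A, B, M, N):
    # Full-space geometric factor for a four-electrode array
    # (full space rather than half space, since crosswell electrodes are at depth).
    return 4 * math.pi / (1 / math.dist(A, M) - 1 / math.dist(A, N)
                          - 1 / math.dist(B, M) + 1 / math.dist(B, N))

def apparent_resistivity(A, B, M, N, delta_v, current):
    return geometric_factor(A, B, M, N) * delta_v / current

def potential(rho, I, src, obs):
    # Potential of a point current source in a homogeneous full space.
    return rho * I / (4 * math.pi * math.dist(src, obs))

# Self-check: in a homogeneous medium the apparent resistivity must
# recover the true resistivity. Illustrative AB-MN crosswell layout:
# current bipole in one borehole, potential bipole in another, 10 m apart.
rho_true, I = 100.0, 1.0
A, B = (0, 0, 10), (0, 0, 15)     # 5 m current bipole
M, N = (10, 0, 10), (10, 0, 15)   # 5 m potential bipole
dv = (potential(rho_true, I, A, M) - potential(rho_true, I, B, M)) \
   - (potential(rho_true, I, A, N) - potential(rho_true, I, B, N))
rho_a = apparent_resistivity(A, B, M, N, dv, I)
```

The borehole effects quantified in the paper appear as systematic departures of `rho_a` from this homogeneous-medium baseline.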
Abstract:
In a genome-wide screen for alpha-helical coiled coil motifs aimed at structurally defined vaccine candidates, we identified PFF0165c. This protein is exported in the trophozoite stage and was accordingly named Trophozoite exported protein 1 (Tex1). In an extensive preclinical evaluation of its coiled coil peptides, Tex1 was identified as a promising novel malaria vaccine candidate, providing the rationale for a comprehensive cell biological characterization of Tex1. Antibodies generated against an intrinsically unstructured N-terminal region of Tex1 and against a coiled coil domain were used to investigate its cytological localization, solubility and expression profile. Co-localization experiments revealed that Tex1 is exported across the parasitophorous vacuole membrane and localizes to Maurer's clefts. The change in location is accompanied by a change in solubility: from a soluble state within the parasite to a membrane-associated state after export to Maurer's clefts. No classical export motifs, such as a PEXEL, a signal sequence/anchor or a transmembrane domain, were identified for Tex1.
Abstract:
The tourism consumer’s purchase decision process is, to a great extent, conditioned by the image the tourist has of the different destinations that make up his or her choice set. In a highly competitive international tourist market, those responsible for destinations’ promotion and development policies seek differentiation strategies so that they may position the destinations in the market segments most suitable for their product, in order to improve their attractiveness to visitors and to increase or consolidate the economic benefits that tourism activity generates in their territory. To this end, the main objective we set ourselves in this paper is the empirical analysis of the factors that determine the image formation of the city of Tarragona as a cultural heritage destination. Without a doubt, UNESCO’s declaration of Tarragona’s artistic and monumental legacies as a World Heritage site in the year 2000 meant important international recognition of the quality of the cultural and patrimonial elements the city offers to the visitors who choose it as a tourist destination. It also represents a strategic opportunity to boost the city’s promotion of tourism and its consolidation as a unique destination given its cultural and patrimonial characteristics. Our work is based on the use of structured and unstructured techniques to identify the factors that determine Tarragona’s tourist destination image and that have a decisive influence on visitors’ destination choice process. In addition to ascertaining Tarragona’s global tourist image, we consider that the heterogeneity of its visitors requires a more detailed study that enables us to segment visitor typology. We consider that the information provided by these results may prove of great interest to those responsible for local tourism policy, both when designing products and when promoting the destination.
Abstract:
Customer Experience Management (CEM) has become a key success factor for companies. CEM manages all the experiences a customer has with a provider of services or products. It is very important to know how a customer feels at each contact and then to be able to automatically suggest the next task to perform, simplifying tasks currently carried out by people. This project develops a solution for evaluating experiences. First, web services are created that classify experiences into emotional states depending on the level of satisfaction, interest, … This is done through text mining: unstructured information (text documents) that represents or describes the experiences is processed and classified using supervised learning methods. This part is developed with a service-oriented architecture (SOA) to ensure the use of standards and to make the services accessible from any application. These services are deployed on an application server. In the second part, two applications based on real cases are developed. In this phase, cloud computing is key: an online development platform is used to create the whole application, including tables, objects, business logic and user interfaces. Finally, the classification services are integrated into the platform, ensuring that experiences are evaluated and that follow-up tasks are created automatically.
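The classification step described above is supervised text mining over unstructured documents. The abstract does not name the learning method, so the following is only an illustrative sketch using a multinomial Naive Bayes classifier with Laplace smoothing; the toy texts and the labels "satisfied"/"dissatisfied" are invented for the example:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesTextClassifier:
    """Minimal multinomial Naive Bayes with Laplace (add-one) smoothing."""

    def fit(self, texts, labels):
        self.word_counts = defaultdict(Counter)  # per-label word frequencies
        self.label_counts = Counter(labels)      # class priors
        self.vocab = set()
        for text, label in zip(texts, labels):
            words = text.lower().split()
            self.word_counts[label].update(words)
            self.vocab.update(words)
        return self

    def predict(self, text):
        words = text.lower().split()
        total = sum(self.label_counts.values())
        best, best_lp = None, -math.inf
        for label, count in self.label_counts.items():
            lp = math.log(count / total)  # log prior
            n = sum(self.word_counts[label].values())
            v = len(self.vocab)
            for w in words:
                # smoothed log likelihood of each word under this label
                lp += math.log((self.word_counts[label][w] + 1) / (n + v))
            if lp > best_lp:
                best, best_lp = label, lp
        return best

# Toy training data (invented for illustration)
texts = ["great service very happy", "terrible support very unhappy",
         "happy with the fast response", "unhappy slow and terrible"]
labels = ["satisfied", "dissatisfied", "satisfied", "dissatisfied"]
clf = NaiveBayesTextClassifier().fit(texts, labels)
```

In the project's architecture, a classifier like this would sit behind a web service endpoint so any application can submit an experience description and receive an emotional-state label.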
Abstract:
This article uses a mixed methods design to investigate the effects of social influence on family formation in a sample of eastern and western German young adults at an early stage of their family formation. Theoretical propositions on the importance of informal interaction for fertility and family behavior are still rarely supported by systematic empirical evidence. Major problems are the correct identification of salient relationships and the comparability of social networks across population subgroups. This article addresses the two issues through a combination of qualitative and quantitative data collection and analysis. In-depth interviewing, network charts, and network grids are used to map individual personal relationships and their influence on family formation decisions. In addition, an analysis of friendship dyads is provided.
Abstract:
This article presents recent WMR (wheeled mobile robot) navigation experiments using local perception knowledge provided by monocular and odometer systems. A local, narrow perception horizon is used to plan safe trajectories towards the objective. Monocular data are therefore proposed as a way to obtain real-time local information by building two-dimensional occupancy grids through time integration of the frames. Path planning is accomplished using attraction potential fields, while trajectory tracking is performed using model predictive control techniques. The results are validated in indoor situations using the available lab platform, a differentially driven mobile robot.
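The potential-field planning step over an occupancy grid can be sketched minimally. This is not the authors' implementation (which couples the planner with model predictive control); it is a greedy descent over an attractive goal term plus a repulsive obstacle term, with the grid, gains, and influence radius chosen for illustration:

```python
import math

def plan_path(grid, start, goal, max_steps=200):
    """Greedy descent over an attractive + repulsive potential field.

    grid: 2D list of ints, 1 = occupied cell, 0 = free.
    Attractive term: Euclidean distance to the goal.
    Repulsive term: penalty from occupied cells closer than 2 cells.
    """
    rows, cols = len(grid), len(grid[0])

    def potential(r, c):
        att = math.hypot(r - goal[0], c - goal[1])
        rep = 0.0
        for i in range(rows):
            for j in range(cols):
                if grid[i][j]:
                    d = math.hypot(r - i, c - j)
                    if d < 2.0:
                        rep += 10.0 / (d + 0.1)
        return att + rep

    path, cur = [start], start
    for _ in range(max_steps):
        if cur == goal:
            break
        # 8-connected free neighbours not yet visited
        cand = [(cur[0] + dr, cur[1] + dc)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if not (dr == 0 and dc == 0)]
        cand = [(r, c) for r, c in cand
                if 0 <= r < rows and 0 <= c < cols
                and not grid[r][c] and (r, c) not in path]
        if not cand:
            break  # stuck in a local minimum of the field
        cur = min(cand, key=lambda p: potential(*p))
        path.append(cur)
    return path

# 5x5 occupancy grid with a small obstacle block between start and goal
grid = [[0, 0, 0, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 0, 0, 0],
        [0, 0, 0, 0, 0]]
path = plan_path(grid, (0, 0), (4, 4))
```

Local minima are the classic weakness of pure potential-field planners; coupling with a predictive controller, as the article does for tracking, does not remove that limitation at the planning level.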
Abstract:
Drug addiction is associated with impaired judgment in unstructured situations, in which success depends on self-regulation of behavior according to internal goals (adaptive decision-making). However, most executive measures are aimed at assessing decision-making in structured scenarios, in which success is determined by external criteria inherent to the situation (veridical decision-making). The aim of this study was to examine the performance of Substance Abusers (SA, n = 97) and Healthy Comparison participants (HC, n = 81) in two behavioral tasks that mimic the uncertainty inherent in real-life decision-making: the Cognitive Bias Task (CB) and the Iowa Gambling Task (IGT) (administered only to SA). A related goal was to study the interdependence between performance on the two tasks. We conducted univariate analyses of variance (ANOVAs) to contrast the decision-making performance of the two groups, and used correlation analyses to study the relationship between the tasks. SA showed a markedly context-independent decision-making strategy on the CB's adaptive condition, but no differences were found on the veridical conditions in a subsample of SA (n = 34) and HC (n = 22). A high percentage of SA (75%) also showed impaired performance on the IGT. The two tasks were correlated only when impaired participants were excluded. Results indicate that SA show abnormal decision-making performance in unstructured, but not in veridical, situations.
Abstract:
Spatial data on species distributions are available in two main forms, point locations and distribution maps (polygon ranges and grids). The first are often temporally and spatially biased, and too discontinuous, to be useful (untransformed) in spatial analyses. A variety of modelling approaches are used to transform point locations into maps. We discuss the attributes that point location data and distribution maps must satisfy in order to be useful in conservation planning. We recommend that before point location data are used to produce and/or evaluate distribution models, the dataset should be assessed under a set of criteria, including sample size, age of data, environmental/geographical coverage, independence, accuracy, time relevance and (often forgotten) representation of areas of permanent and natural presence of the species. Distribution maps must satisfy additional attributes if used for conservation analyses and strategies, including minimizing commission and omission errors, credibility of the source/assessors and availability for public screening. We review currently available databases for mammals globally and show that they are highly variable in complying with these attributes. The heterogeneity and weakness of spatial data seriously constrain their utility to global and also sub-global scale conservation analyses.
Abstract:
Objectives: We are interested in the numerical simulation of the anastomotic region between the outflow cannula of an LVAD and the aorta. Segmentation, geometry reconstruction and grid generation from patient-specific data remain an issue because of the variable quality of DICOM images, in particular CT scans (e.g. metallic noise from the device, non-aortic contrast phase). We propose a general framework to overcome this problem and create grids suitable for numerical simulations.
Methods: Preliminary treatment of the images is performed by reducing the level window and enhancing the contrast of the greyscale image using contrast-limited adaptive histogram equalization. A gradient anisotropic diffusion filter is applied to reduce the noise. Watershed segmentation algorithms and mathematical morphology filters then allow the patient geometry to be reconstructed. This is done using the InsightToolKit library (www.itk.org). Finally, the Vascular Modeling ToolKit (www.vmtk.org) and gmsh (www.geuz.org/gmsh) are used to create the meshes for the fluid (blood) and the structure (arterial wall, outflow cannula) and to identify the boundary layers a priori. The method is tested on five patients with left ventricular assistance who underwent a CT-scan exam.
Results: The method produced good results in four patients: the anastomosis area is recovered and the generated grids are suitable for numerical simulations. In one patient the method failed to produce a good segmentation because of the small dimension of the aortic arch with respect to the image resolution.
Conclusions: The described framework allows the use of data that could not otherwise be segmented by standard automatic segmentation tools. In particular, the computational grids that have been generated are suitable for simulations that take fluid-structure interactions into account. Finally, the presented method features good reproducibility and fast application.
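The gradient anisotropic diffusion step in the pipeline above is, in ITK, a Perona-Malik-type scheme: diffusion is strong in homogeneous regions and shuts down across strong gradients, so noise is smoothed while vessel edges survive. A minimal pure-Python sketch of that scheme (not the ITK filter itself; the conduction constant `kappa`, step `lam`, and toy image are illustrative) is:

```python
import math

def anisotropic_diffusion(img, n_iter=10, kappa=30.0, lam=0.2):
    """Perona-Malik diffusion on a 2D grayscale image (list of lists).

    Each interior pixel moves toward its 4-neighbours, weighted by a
    conduction coefficient g() that vanishes for large gradients, so
    edges are preserved while flat regions are smoothed. lam <= 0.25
    keeps the explicit scheme stable.
    """
    rows, cols = len(img), len(img[0])
    g = lambda d: math.exp(-(d / kappa) ** 2)  # conduction coefficient
    out = [row[:] for row in img]
    for _ in range(n_iter):
        nxt = [row[:] for row in out]
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                n = out[r - 1][c] - out[r][c]
                s = out[r + 1][c] - out[r][c]
                e = out[r][c + 1] - out[r][c]
                w = out[r][c - 1] - out[r][c]
                nxt[r][c] = out[r][c] + lam * (g(n) * n + g(s) * s
                                               + g(e) * e + g(w) * w)
        out = nxt
    return out

# Toy image: a sharp step edge (0 vs 100) with one noisy pixel in the
# flat region. The noise should be smoothed, the edge preserved.
img = [[0.0] * 4 + [100.0] * 4 for _ in range(8)]
img[4][1] = 20.0
smoothed = anisotropic_diffusion(img)
```

This edge-preserving behaviour is what makes the filter a good preprocessor before watershed segmentation, which is notoriously sensitive to noise.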
Abstract:
Implementation and evaluation of a hybrid algorithm that selects the lowest-cost set of nodes on which a service can be deployed, with a given availability, in a volunteer computing environment.
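The abstract does not detail the hybrid algorithm, but the underlying optimization problem can be sketched: a replicated service is available unless every selected node is down, so (assuming independent node failures, which this sketch takes for granted) the goal is the cheapest node set whose combined availability meets the target. A simple greedy heuristic, with invented node data:

```python
def deployment_availability(avails):
    """A replicated service is up unless every node fails
    (node failures assumed independent)."""
    p_all_down = 1.0
    for a in avails:
        p_all_down *= (1.0 - a)
    return 1.0 - p_all_down

def select_nodes(nodes, target):
    """Greedy heuristic: repeatedly add the node with the best marginal
    availability gain per unit cost until the target is met.

    nodes: list of (name, cost, availability).
    Returns (selected names, total cost), or None if unreachable.
    """
    selected, avails, total_cost = [], [], 0.0
    remaining = list(nodes)
    while deployment_availability(avails) < target:
        if not remaining:
            return None
        def gain_per_cost(n):
            _, cost, a = n
            gain = (deployment_availability(avails + [a])
                    - deployment_availability(avails))
            return gain / cost
        best = max(remaining, key=gain_per_cost)
        remaining.remove(best)
        selected.append(best[0])
        avails.append(best[2])
        total_cost += best[1]
    return selected, total_cost

# Invented example: three nodes, target availability 99%
nodes = [("n1", 1.0, 0.90), ("n2", 2.0, 0.95), ("n3", 1.0, 0.80)]
result = select_nodes(nodes, 0.99)
```

A greedy pass like this is only a heuristic; an exact solution to this set-selection problem would need search or integer programming, which is presumably where the "hybrid" aspect of the evaluated algorithm comes in.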
Abstract:
In this issue of Genes & Development, Revyakin and colleagues (pp. 1691-1702) measure the relation between individual RNA polymerase II transcription events and transcription factor assembly by counting RNA transcripts retained on the template DNA using single-molecule fluorescence.
Abstract:
One of the important questions in biological evolution is whether certain changes along protein-coding genes have contributed to the adaptation of species. This problem is known to be biologically complex and computationally very expensive, and it therefore requires efficient Grid or cluster solutions to overcome the computational challenge. We have developed a Grid-enabled tool (gcodeml) that relies on the PAML (codeml) package to help analyse large phylogenetic datasets on both Grids and computational clusters. Although we report on results for gcodeml, our approach is applicable and customisable to related problems in biology or other scientific domains.
Abstract:
The object of game theory lies in the analysis of situations where different social actors have conflicting requirements and where their individual decisions will all influence the global outcome. In this framework, several games have been invented to capture the essence of various dilemmas encountered in many common important socio-economic situations. Even though these games often succeed in helping us understand human or animal behavior in interactive settings, some experiments have shown that people tend to cooperate with each other in situations for which classical game theory strongly recommends them to do the exact opposite. Several mechanisms have been invoked to try to explain the emergence of this unexpected cooperative attitude. Among them, repeated interaction, reputation, and belonging to a recognizable group have often been mentioned. However, the work of Nowak and May (1992) showed that the simple fact of arranging the players according to a spatial structure and only allowing them to interact with their immediate neighbors is sufficient to sustain a certain amount of cooperation even when the game is played anonymously and without repetition. Nowak and May's study and much of the following work was based on regular structures such as two-dimensional grids. Axelrod et al. (2002) showed that by randomizing the choice of neighbors, i.e. by actually giving up a strictly local geographical structure, cooperation can still emerge, provided that the interaction patterns remain stable in time. This is a first step towards a social network structure. However, following pioneering work by sociologists in the sixties such as that of Milgram (1967), in the last few years it has become apparent that many social and biological interaction networks, and even some technological networks, have particular, and partly unexpected, properties that set them apart from regular or random graphs.
Among other things, they usually display broad degree distributions and show small-world topological structure. Roughly speaking, a small-world graph is a network where any individual is relatively close, in terms of social ties, to any other individual, a property also found in random graphs but not in regular lattices. However, in contrast with random graphs, small-world networks also have a certain amount of local structure, as measured, for instance, by a quantity called the clustering coefficient. In the same vein, many real conflicting situations in economics and sociology are well described neither by a fixed geographical position of the individuals in a regular lattice nor by a random graph. Furthermore, it is a known fact that network structure can strongly influence dynamical phenomena such as the way diseases spread across a population and ideas or information get transmitted. Therefore, in the last decade, research attention has naturally shifted from random and regular graphs towards better models of social interaction structures. The primary goal of this work is to discover whether or not the underlying graph structure of real social networks could explain why one finds higher levels of cooperation in populations of human beings or animals than what is prescribed by classical game theory. To meet this objective, I start by thoroughly studying a real scientific coauthorship network and showing how it differs from biological or technological networks using diverse statistical measurements. Furthermore, I extract and describe its community structure, taking into account the intensity of a collaboration. Finally, I investigate the temporal evolution of the network, from its inception to its state at the time of the study in 2006, suggesting also an effective view of it as opposed to a historical one.
Thereafter, I combine evolutionary game theory with several network models along with the studied coauthorship network in order to highlight which specific network properties foster cooperation and shed some light on the various mechanisms responsible for the maintenance of this same cooperation. I point out the fact that, to resist defection, cooperators take advantage, whenever possible, of the degree-heterogeneity of social networks and their underlying community structure. Finally, I show that the level and stability of cooperation depend not only on the game played, but also on the evolutionary dynamic rules used and the individual payoff calculations.
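The Nowak and May (1992) result cited in this abstract — that spatial structure alone can sustain cooperation — can be reproduced in a few lines. The sketch below is an illustrative reimplementation, not code from the thesis: strategies live on a toroidal grid, each site plays the weak prisoner's dilemma (C-C pays 1 each, a defector exploiting a cooperator pays b, all else 0) with its 8 neighbours, then imitates its highest-scoring neighbour; grid size, seed, and b = 1.3 are arbitrary choices:

```python
import random

def play_round(grid, b):
    """One synchronous step of the Nowak-May spatial game.

    Strategies: 1 = cooperate, 0 = defect. Payoffs: C-C gives 1 each,
    a defector meeting a cooperator gets b (1 < b < 2), otherwise 0.
    Each site then imitates its highest-scoring neighbour (self included).
    """
    n = len(grid)

    def neighbors(r, c):
        return [((r + dr) % n, (c + dc) % n)
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)]

    payoff = [[0.0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            for rr, cc in neighbors(r, c):
                if grid[r][c] and grid[rr][cc]:
                    payoff[r][c] += 1.0          # mutual cooperation
                elif not grid[r][c] and grid[rr][cc]:
                    payoff[r][c] += b            # defector exploits cooperator

    new = [[0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            best_r, best_c = r, c
            for rr, cc in neighbors(r, c):
                if payoff[rr][cc] > payoff[best_r][best_c]:
                    best_r, best_c = rr, cc
            new[r][c] = grid[best_r][best_c]
    return new

random.seed(1)
n = 20
grid = [[1 if random.random() < 0.9 else 0 for _ in range(n)] for _ in range(n)]
for _ in range(20):
    grid = play_round(grid, b=1.3)
coop_fraction = sum(map(sum, grid)) / (n * n)
```

Replacing the grid's neighbour function with edges from an empirical network is essentially how the thesis moves from Nowak-May lattices to real social structures.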
Abstract:
OBJECTIVE: To compare the total sleep time of premature infants in the presence or absence of reduced sensory and environmental stimuli in the neonatal unit.
METHOD: Longitudinal study in a Neonatal Intermediate Care Unit of a public hospital in Sao Paulo. The sample consisted of 13 premature infants. We used polysomnography and unstructured observation for data collection. We analyzed 240 and 1200 minutes, corresponding to periods with and without environmental management, respectively. Data were compared as proportions of total sleep time in the two periods proposed by the study.
RESULTS: Total sleep time in periods without environmental management averaged 696.4 (± 112.1) minutes, and in periods with management 168.5 (± 27.9) minutes; proportionally, premature infants slept an average of 70.2% of the time during periods with environmental management and 58.0% without it (p=0.002).
CONCLUSION: Periods of reduced environmental stimulation and handling of premature infants were effective in providing greater total sleep time.
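The reported percentages follow directly from the minute totals: 240 observed minutes with environmental management versus 1200 without, so the proportions of time asleep can be checked by simple division:

```python
# Proportion of the observed window spent asleep, from the abstract's totals:
# 168.5 sleep minutes out of 240 observed with environmental management,
# 696.4 sleep minutes out of 1200 observed without it.
with_mgmt = 168.5 / 240 * 100      # -> about 70.2%
without_mgmt = 696.4 / 1200 * 100  # -> about 58.0%
```

This confirms that the higher proportion (70.2%) belongs to the managed (reduced-stimulation) periods, consistent with the study's conclusion.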