897 results for Dominic Interactive
Abstract:
A major challenge in this era of rapid climate change is to predict changes in species distributions and their impacts on ecosystems, and, if necessary, to recommend management strategies for maintenance of biodiversity or ecosystem services. Biological invasions, studied in most biomes of the world, can provide useful analogs for some of the ecological consequences of species distribution shifts in response to climate change. Invasions illustrate the adaptive and interactive responses that can occur when species are confronted with new environmental conditions. Invasion ecology complements climate change research and provides insights into the following questions: i) how will species distributions respond to climate change? ii) how will species movement affect recipient ecosystems? and iii) should we, and if so how can we, manage species and ecosystems in the face of climate change? Invasion ecology demonstrates that a trait-based approach can help to predict spread speeds and impacts on ecosystems, and has the potential to predict climate change impacts on species ranges and recipient ecosystems. However, there is a need to analyse traits in the context of life-history and demography, the stage in the colonisation process (e.g., spread, establishment or impact), the distribution of suitable habitats in the landscape, and the novel abiotic and biotic conditions under which those traits are expressed. As is the case with climate change, invasion ecology is embedded within complex societal goals. Both disciplines converge on similar questions of "when to intervene?" and "what to do?" which call for a better understanding of the ecological processes and social values associated with changing ecosystems.
Abstract:
This thesis concerns organizing workshops about interaction in the various communities represented by Helsinki city's social welfare department. There were seventeen workshops altogether and they were organized in different communities; for example, in a children's daycare centre. The aim was to gain experience in the planning and organizing of these kinds of workshops. The workshops focused on dealing with interactive questions arising out of the very community which was taking part in the workshop. These questions were discussed and handled using a technique called forum statues. This means that the problems arising from the community were presented as living pictures to the group. There was also a short theoretical element concerning interactions between people, self-esteem, the different phases in developing a group, and the effects of conflicts in groups. There was a high degree of interest in the research and places were soon filled. The workshops consisted of a warming-up part with a lot of playing, a deepening part with questions arising out of the group, and a relaxation and feedback part at the end. The basis of the workshop was similar for all seventeen workshops. The atmosphere in the workshops was conversational and open. The participants were encouraged to express their opinions and points of view. Feedback from the participants was very positive. The participants obtained new points of view, according to their fellow workers, and the community spirit improved. Shortage of time was, unfortunately, a problem. With more time, it would have been possible to go deeper into the problems of interaction within the community. Certainly, the research proved that there would be great demand for this kind of workshop in the future.
Abstract:
Abstract The object of game theory lies in the analysis of situations where different social actors have conflicting requirements and where their individual decisions will all influence the global outcome. In this framework, several games have been invented to capture the essence of various dilemmas encountered in many common important socio-economic situations. Even though these games often succeed in helping us understand human or animal behavior in interactive settings, some experiments have shown that people tend to cooperate with each other in situations for which classical game theory strongly recommends the exact opposite. Several mechanisms have been invoked to try to explain the emergence of this unexpected cooperative attitude. Among them, repeated interaction, reputation, and belonging to a recognizable group have often been mentioned. However, the work of Nowak and May (1992) showed that the simple fact of arranging the players according to a spatial structure and only allowing them to interact with their immediate neighbors is sufficient to sustain a certain amount of cooperation even when the game is played anonymously and without repetition. Nowak and May's study and much of the following work was based on regular structures such as two-dimensional grids. Axelrod et al. (2002) showed that by randomizing the choice of neighbors, i.e. by actually giving up a strictly local geographical structure, cooperation can still emerge, provided that the interaction patterns remain stable in time. This is a first step towards a social network structure. However, following pioneering work by sociologists in the sixties such as that of Milgram (1967), in the last few years it has become apparent that many social and biological interaction networks, and even some technological networks, have particular, and partly unexpected, properties that set them apart from regular or random graphs.
Among other things, they usually display broad degree distributions, and show small-world topological structure. Roughly speaking, a small-world graph is a network where any individual is relatively close, in terms of social ties, to any other individual, a property also found in random graphs but not in regular lattices. However, in contrast with random graphs, small-world networks also have a certain amount of local structure, as measured, for instance, by a quantity called the clustering coefficient. In the same vein, many real conflicting situations in economics and sociology are well described neither by a fixed geographical position of the individuals in a regular lattice, nor by a random graph. Furthermore, it is a known fact that network structure can strongly influence dynamical phenomena such as the way diseases spread across a population and ideas or information get transmitted. Therefore, in the last decade, research attention has naturally shifted from random and regular graphs towards better models of social interaction structures. The primary goal of this work is to discover whether or not the underlying graph structure of real social networks could give explanations as to why one finds higher levels of cooperation in populations of human beings or animals than what is prescribed by classical game theory. To meet this objective, I start by thoroughly studying a real scientific coauthorship network and showing how it differs from biological or technological networks using diverse statistical measurements. Furthermore, I extract and describe its community structure taking into account the intensity of a collaboration. Finally, I investigate the temporal evolution of the network, from its inception to its state at the time of the study in 2006, suggesting also an effective view of it as opposed to a historical one.
Thereafter, I combine evolutionary game theory with several network models along with the studied coauthorship network in order to highlight which specific network properties foster cooperation and shed some light on the various mechanisms responsible for the maintenance of this same cooperation. I point out the fact that, to resist defection, cooperators take advantage, whenever possible, of the degree-heterogeneity of social networks and their underlying community structure. Finally, I show that cooperation level and stability depend not only on the game played, but also on the evolutionary dynamic rules used and the individual payoff calculations.
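The Nowak and May (1992) mechanism this abstract builds on is easy to reproduce in simulation: players on a lattice play a one-shot prisoner's dilemma with their neighbours, then imitate the most successful strategy in sight. Below is a minimal sketch of that spatial game; the grid size, number of rounds and temptation payoff are illustrative choices, not values taken from the thesis.

```python
import numpy as np

def play_round(strat, b):
    """One synchronous Nowak-May update on a toroidal grid.
    strat: 2D array with 1 = cooperate, 0 = defect; b: temptation payoff."""
    shifts = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    # Payoffs: C vs C -> 1 each, D vs C -> b for the defector, otherwise 0.
    payoff = (strat == 1).astype(float)  # self-interaction, as in Nowak-May
    for di, dj in shifts:
        nb = np.roll(np.roll(strat, di, axis=0), dj, axis=1)
        payoff += np.where(strat == 1, nb * 1.0, nb * b)
    # Each site imitates the best scorer in its neighbourhood (including itself).
    best_pay, best_strat = payoff.copy(), strat.copy()
    for di, dj in shifts:
        nb_pay = np.roll(np.roll(payoff, di, axis=0), dj, axis=1)
        nb_strat = np.roll(np.roll(strat, di, axis=0), dj, axis=1)
        better = nb_pay > best_pay
        best_pay = np.where(better, nb_pay, best_pay)
        best_strat = np.where(better, nb_strat, best_strat)
    return best_strat

n = 21
grid = np.ones((n, n), dtype=int)   # everyone cooperates...
grid[n // 2, n // 2] = 0            # ...except a single central defector
for _ in range(20):
    grid = play_round(grid, b=1.85)
print("cooperator fraction:", grid.mean())
```

In Nowak and May's version of this model, temptation payoffs between roughly 1.8 and 2 produce shifting patches of cooperators and defectors that coexist indefinitely, which is the qualitative starting point for the network-based analysis described here.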
Abstract:
This dissertation focuses on the strategies consumers use when making purchase decisions. It is organized in two main parts, one centering on descriptive and the other on applied decision making research. In the first part, a new process tracing tool called InterActive Process Tracing (IAPT) is presented, which I developed to investigate the nature of consumers' decision strategies. This tool is a combination of several process tracing techniques, namely Active Information Search, Mouselab, and retrospective verbal protocol. To validate IAPT, two experiments on mobile phone purchase decisions were conducted where participants first repeatedly chose a mobile phone and then were asked to formalize their decision strategy so that it could be used to make choices for them. The choices made by the identified strategies correctly predicted the observed choices in 73% (Experiment 1) and 67% (Experiment 2) of the cases. Moreover, in Experiment 2, Mouselab and eye tracking were directly compared with respect to their impact on information search and strategy description. Only minor differences were found between these two methods. I conclude that IAPT is a useful research tool to identify choice strategies, and that using eye tracking technology did not increase its validity beyond that gained with Mouselab. In the second part, a prototype of a decision aid is introduced that was developed building in particular on the knowledge about consumers' decision strategies gained in Part I. This decision aid, which is called the InterActive Choice Aid (IACA), systematically assists consumers in their purchase decisions. To evaluate the prototype regarding its perceived utility, an experiment was conducted where IACA was compared to two other prototypes that were based on real-world consumer decision aids.
All three prototypes differed in the number and type of tools they provided to facilitate the process of choosing, ranging from low (Amazon) to medium (Sunrise/dpreview) to high functionality (IACA). Overall, participants slightly preferred the prototype of medium functionality and this prototype was also rated best on the dimensions of understandability and ease of use. IACA was rated best regarding the two dimensions of ease of elimination and ease of comparison of alternatives. Moreover, participants' choices were more in line with the normatively oriented weighted additive strategy when they used IACA than when they used the medium functionality prototype. The low functionality prototype was the least preferred overall. It is concluded that consumers can and will benefit from highly functional decision aids like IACA, but only when these systems are easy to understand and to use.
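The weighted additive strategy mentioned above has a simple normative form: score every alternative as the weighted sum of its attribute values and choose the maximum. A minimal sketch follows; the phone names, attributes and weights are invented for illustration and are not taken from the dissertation.

```python
# Weighted additive (WADD) choice rule: score = sum(weight_i * value_i).
def wadd_choice(alternatives, weights):
    """Return the name of the alternative with the highest weighted sum."""
    best_name, _ = max(
        alternatives,
        key=lambda item: sum(weights[a] * v for a, v in item[1].items()),
    )
    return best_name

# Hypothetical alternatives with attribute values normalized to [0, 1].
phones = [
    ("Phone A", {"battery": 0.9, "camera": 0.4, "price": 0.8}),
    ("Phone B", {"battery": 0.5, "camera": 0.9, "price": 0.3}),
    ("Phone C", {"battery": 0.7, "camera": 0.7, "price": 0.6}),
]
weights = {"battery": 0.5, "camera": 0.2, "price": 0.3}

print(wadd_choice(phones, weights))  # -> Phone A (score 0.77)
```

Because WADD integrates all attributes rather than eliminating alternatives on a single dimension, choices that agree with it are commonly treated as the normative benchmark in this literature.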
Abstract:
OBJECTIVES: The reconstruction of the right ventricular outflow tract (RVOT) with valved conduits remains a challenge. The reoperation rate at 5 years can be as high as 25% and depends on age, type of conduit, conduit diameter and principal heart malformation. The aim of this study is to provide a bench model with computational fluid dynamics to analyse the haemodynamics of the RVOT, pulmonary artery, its bifurcation, and left and right pulmonary arteries that in the future may serve as a tool for analysis and prediction of outcome following RVOT reconstruction. METHODS: Pressure, flow and diameter at the RVOT, pulmonary artery, bifurcation of the pulmonary artery, and left and right pulmonary arteries were measured in five normal pigs with a mean weight of 24.6 ± 0.89 kg. Data obtained were used for a 3D computational fluid dynamics simulation of flow conditions, focusing on the pressure, flow and shear stress profile of the pulmonary trunk to the level of the left and right pulmonary arteries. RESULTS: Three inlet steady flow profiles were obtained at 0.2, 0.29 and 0.36 m/s that correspond to the flow rates of 1.5, 2.0 and 2.5 l/min flow at the RVOT. The flow velocity profile was constant at the RVOT down to the bifurcation and decreased at the left and right pulmonary arteries. In all three inlet velocity profiles, low shear stress and low-velocity areas were detected along the left wall of the pulmonary artery, at the pulmonary artery bifurcation and at the ostia of both pulmonary arteries. CONCLUSIONS: This real-time computational fluid model provides us with a realistic picture of fluid dynamics in the pulmonary tract area. Low shear stress areas correspond to a turbulent flow profile that is a predictive factor for the development of vessel wall arteriosclerosis. We believe that this bench model may be a useful tool for further evaluation of RVOT pathology following surgical reconstructions.
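As a plausibility check on the reported numbers, inlet velocity and flow rate are linked by Q = v·A for a given cross-section, so each velocity/flow pair implies a vessel diameter. The sketch below is purely illustrative arithmetic based on the figures quoted in the abstract, not a computation from the paper itself.

```python
import math

def implied_diameter_mm(flow_l_per_min, velocity_m_per_s):
    """Diameter of the circular cross-section implied by Q = v * A."""
    q = flow_l_per_min / 1000.0 / 60.0               # l/min -> m^3/s
    area = q / velocity_m_per_s                      # A = Q / v, in m^2
    return 2.0 * math.sqrt(area / math.pi) * 1000.0  # m -> mm

for q, v in [(1.5, 0.20), (2.0, 0.29), (2.5, 0.36)]:
    print(f"{q} l/min at {v} m/s -> {implied_diameter_mm(q, v):.1f} mm")
```

All three pairs imply a lumen of roughly 12-13 mm, i.e. the velocity and flow figures are mutually consistent with a single pulmonary trunk diameter of plausible size for ~25 kg pigs.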
Abstract:
This paper identifies selected issues and lessons learned from the implementation of a national program of prevention and control of non-communicable diseases (NCD) during the past 20 years in the Seychelles, a small island state in the African region. As early as 1989, population-based surveys demonstrated high levels of several cardiovascular risk factors, which prompted an organized response by the government. The early creation of a NCD unit within the Ministry of Health, coupled with cooperation with international partners, enabled incremental capacity building and coherent development of NCD programs and policy. Information campaigns and screening for hypertension and diabetes in workplaces and public places raised awareness and rallied increasingly broad support for NCD prevention and control. A variety of interventions were organized for tobacco control and comprehensive tobacco control legislation was enacted in 2009 (including total bans on tobacco advertising and on smoking in all enclosed public and work places). A recent School Nutrition Policy prohibits the sale of soft drinks in schools. At primary health care level, guidelines were developed for the management of hypertension and diabetes (these conditions are managed in all health centers within a national health system); regular interactive education sessions were organized for groups of high-risk patients ("heart health club"); and specialized "NCD nurses" were trained. Decreasing prevalence of smoking is evidence of success, but the rising "diabesity" epidemic calls for strengthened health care for high-risk patients and broader multisectoral policy to mould an environment conducive to healthy behaviors.
Key components of NCD prevention and control in Seychelles include effective surveillance mechanisms supplemented by focused research; generating broad interest and consensus on the need for prevention and control of NCD; mobilizing leadership and commitment at all levels; involving local and international expertise; building on existing efforts; and seeking integrated, multi-disciplinary and multisectoral approaches.
Abstract:
This paper presents a pilot project to reinforce participatory practices in standardization. The INTERNORM project is funded by the University of Lausanne, Switzerland. It aims to create an interactive knowledge center based on the sharing of academic skills and the experiences accumulated by civil society, especially consumer associations, environmental associations and trade unions, to strengthen the participatory process of standardization. The first objective of the project is action-oriented: INTERNORM provides a common knowledge pool supporting the participation of civil society actors in international standard-setting activities by bringing them together with academic experts in working groups and by providing logistic and financial support for their participation in meetings of national and international technical committees. The second objective of the project is analytical: the standardization action initiated through INTERNORM provides a research field for a better understanding of the participatory dynamics underpinning international standardization. The paper presents three incentives, going beyond conventional resource-based hypotheses, that explain civil society (non-)involvement in standardization: an operational incentive, related to the use of standards in the selective goods provided by associations to their membership; a thematic incentive, provided by the setting of priorities by strategic committees created in some standardization organizations; and a rhetorical incentive, related to the discursive resource that civil society concerns offer to the different stakeholders.
Abstract:
The Microbe browser is a web server providing comparative microbial genomics data. It offers comprehensive, integrated data from GenBank, RefSeq, UniProt, InterPro, Gene Ontology and the Orthologs Matrix Project (OMA) database, displayed along with gene predictions from five software packages. The Microbe browser is updated daily from the source databases and includes all completely sequenced bacterial and archaeal genomes. The data are displayed in an easy-to-use, interactive website based on Ensembl software. The Microbe browser is available at http://microbe.vital-it.ch/. Programmatic access is available through the OMA application programming interface (API) at http://microbe.vital-it.ch/api.
Abstract:
"Beauty-contest" is a game in which participants have to choose, typically, a number in [0,100], the winner being the person whose number is closest to a proportion of the average of all chosen numbers. We describe and analyze Beauty-contest experiments run in newspapers in UK, Spain, and Germany and find stable patterns of behavior across them, despite the uncontrollability of these experiments. These results are then compared with lab experiments involving undergraduates and game theorists as subjects, in what must be one of the largest empirical corroborations of interactive behavior ever tried. We claim that all observed behavior, across a wide variety of treatments and subject pools, can be interpretedas iterative reasoning. Level-1 reasoning, Level-2 reasoning and Level-3 reasoning are commonly observed in all the samples, while the equilibrium choice (Level-Maximum reasoning) is only prominently chosen by newspaper readers and theorists. The results show the empirical power of experiments run with large subject-pools, and open the door for more experimental work performed on the rich platform offered by newspapers and magazines.
Abstract:
With extremely diverse morphological and edaphoclimatic characteristics, the island of Santo Antão in Cape Verde presents a recognized environmental vulnerability together with a marked shortage of scientific studies that address this reality and provide a basis for an integrated understanding of its phenomena. Digital cartography and geographic information technologies have brought technological advances in the collection, storage and processing of spatial data. Various tools currently available make it possible to model a multiplicity of factors, to locate and quantify phenomena, and to define the contribution of different factors to the final result. The present study, developed within the postgraduate and master's programme in Geographic Information Systems of the Universidade de Trás-os-Montes e Alto Douro, aims to help reduce the deficit of information on the biophysical characteristics of the island by applying geographic information technologies and remote sensing together with multivariate statistical analysis. In this context, thematic maps were produced and analysed and an integrated data-analysis model was developed. Indeed, the multiplicity of spatial variables produced, among them 29 continuously varying variables capable of influencing the biophysical characteristics of the region, and the possible occurrence of mutually antagonistic or synergistic effects, make interpretation directly from the original data relatively complex. To overcome this problem, a systematic sampling network totalling 921 points (repetitions) was used to extract the data corresponding to the 29 variables at the sampling points, followed by the application of multivariate statistical techniques, namely principal component analysis.
The application of these techniques made it possible to simplify and interpret the original variables, normalizing them and summarizing the information contained in the diversity of mutually correlated original variables into a set of orthogonal (uncorrelated) variables of decreasing importance, the principal components. A target was set of concentrating 75% of the variance of the original data in the first three principal components, and an iterative multi-stage process was developed, successively eliminating the least representative variables. In the final stage, the first three PCs explained 74.54% of the variance of the original data, which the subsequent phase showed to be insufficient to portray reality. The fourth PC (PC4) was therefore included, raising the explained variance to 84% and representing eight biophysical variables: altitude, drainage density, density of geological fracturing, precipitation, vegetation index, temperature, water resources and distance to the drainage network. The subsequent interpolation of the first principal component (PC1), with the main variables associated with PC2, PC3 and PC4 as auxiliary variables, using geostatistical techniques in ArcGIS, yielded a map representing 84% of the variation of the biophysical characteristics of the territory. Cluster analysis, validated by Student's t-test, allowed the territory to be reclassified into six homogeneous biophysical units.
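The dimension-reduction step used here (standardize the correlated variables, extract orthogonal principal components, keep enough of them to reach a target share of variance) can be sketched as follows. The synthetic data below merely stand in for the study's 29 biophysical variables at 921 sample points; the 75% threshold is the one quoted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 921 sample points, 10 variables driven by 3 latent factors.
latent = rng.normal(size=(921, 3))
data = latent @ rng.normal(size=(3, 10)) + 0.1 * rng.normal(size=(921, 10))

# Standardize, then obtain principal components via SVD of the z-scores.
z = (data - data.mean(axis=0)) / data.std(axis=0)
_, s, _ = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance share of each PC, in decreasing order
cumulative = np.cumsum(explained)

# Smallest number of PCs whose cumulative share reaches the 75% target.
k = int(np.searchsorted(cumulative, 0.75)) + 1
print(k, cumulative[:4].round(3))
```

If the leading components fall just short of the target, as happened in the study (74.54% with three PCs), the natural remedy is exactly the one taken there: include the next component and re-examine which original variables it represents.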
It is concluded that currently available geographic information technologies facilitate interactive and flexible analyses, allowing themes and criteria to be varied, new information to be integrated, and improvements to be introduced into models built on the information available in a given context. Combined with multivariate statistical techniques, they make it possible, on the basis of scientific criteria, to develop an integrated analysis of multiple biophysical variables whose mutual correlation otherwise makes an integrated understanding of the phenomena complex.
Abstract:
At a time when the Cape Verdean education system is undergoing reforms, it becomes necessary to examine teacher training so as to prepare teachers for the role they have to perform. This study aims to diagnose teachers' training paths and training needs in the area of Information and Communication Technologies (ICT), and to describe teachers' sense of competence (self-efficacy) in the use of technologies, specifically their mobilization in the teaching and learning process. Given the objectives of the study, a quantitative methodology was adopted. The study involved the participation of 87 teachers. A survey technique was chosen: a questionnaire with closed questions administered to teachers of the Abílio Duarte Secondary School located in Praia, on the island of Santiago, Cape Verde. The literature review showed that initiatives have already been developed at the national level to implement ICT in schools. Currently, the Mundu Novu programme of the government of Cape Verde stands out; coordinated by the Ministry of Education, it aims to modernize the teaching process through the use of ICT, creating a new paradigm of interactive teaching. The results point to the progressive use of ICT in the activities of teachers, who show a moderate sense of self-efficacy in the use of ICT. Teacher training is identified as the main obstacle to the integration and educational use of ICT.
Abstract:
Adequate in-vitro training in valved stent deployment, as well as testing of such devices, requires compliant real-size models of the human aortic root. The casting methods utilized up to now are multi-step, time consuming and complicated. We pursued the goal of building a flexible 3D model in a single-step procedure. We created a precise 3D CAD model of a human aortic root using previously published anatomical and geometrical data and printed it using a novel rapid prototyping system developed by the Fab@Home project. As the material for 3D fabrication we used common household silicone, and afterwards dip-coated several models with dispersion silicone one or two times. To assess the production precision we compared the size of the final product with the CAD model. Compliance of the models was measured and compared with native porcine aortic root. Total fabrication time was 3 h and 20 min. Dip-coating one or two times with dispersion silicone, if applied, took one or two extra days, respectively. The error in dimensions of the non-coated aortic root model compared to the CAD design was <3.0% along the X and Y axes and 4.1% along the Z axis. Compliance of a non-coated model, as judged by the change of radius in the radial direction (16.39%), is significantly different (P<0.001) from that of native aortic tissue (23.54%) at a pressure of 80-100 mmHg. Rapid prototyping of compliant, life-size anatomical models with the Fab@Home 3D printer is feasible; it is very quick compared to previous casting methods.
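The two headline figures above are simple percentage computations: the dimensional error compares a printed dimension against the CAD reference, and compliance is the relative radial expansion under pressure. A sketch with the standard definitions follows; the millimetre values are hypothetical illustrations, since the abstract reports only the resulting percentages.

```python
def percent_error(measured, reference):
    """Relative deviation of a printed dimension from the CAD reference."""
    return abs(measured - reference) / reference * 100.0

def radial_compliance(r_low, r_high):
    """Percent increase in radius between the low- and high-pressure states."""
    return (r_high - r_low) / r_low * 100.0

# Hypothetical example: a 30.0 mm CAD dimension printed as 31.2 mm.
print(round(percent_error(31.2, 30.0), 1))        # -> 4.0 (% error)
# A radius expanding from 15.0 mm to 17.46 mm gives ~16.4% compliance,
# matching the order of the non-coated model figure reported above.
print(round(radial_compliance(15.0, 17.46), 1))   # -> 16.4
```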
Abstract:
Schizotypy refers to a constellation of personality traits that are believed to mirror the subclinical expression of schizophrenia in the general population. Evidence from pharmacological studies indicates that dopamine is involved in the aetiology of schizophrenia. Based on the assumption of a continuum between schizophrenia and schizotypy, researchers have begun investigating the association between dopamine and schizotypy using a wide range of methods. In this article, we review published studies on this association from the following areas of work: (1) experimental investigations of the interactive effects of dopaminergic challenges and schizotypy on cognition, motor control and behaviour, (2) dopaminergically supported cognitive functions, (3) studies of associations between schizotypy and polymorphisms in genes involved in dopaminergic neurotransmission, and (4) molecular imaging studies of the association between schizotypy and markers of the dopamine system. Together, data from these lines of evidence suggest that dopamine is important to the expression and experience of schizotypy and associated behavioural biases. An important observation is that the experimental designs, methods, and manipulations used in this research are highly heterogeneous. Future studies are required to replicate individual observations, to elucidate the link between dopamine and different schizotypy dimensions (positive, negative, cognitive disorganisation), and to guide the search for solid dopamine-sensitive behavioural markers. Such studies are important in order to clarify inconsistencies between studies. More work is also needed to identify differences between dopaminergic alterations in schizotypy compared to the dysfunctions observed in schizophrenia.
Abstract:
OBJECTIVES: A new caval tree system was designed for realistic in vitro simulation. The objective of our study was to assess cannula performance for virtually wall-less versus standard percutaneous thin-walled venous cannulas in a setting of venous collapse in case of negative pressure. METHODS: For a collapsible caval model, a very flexible plastic material was selected, and a model with nine afferent veins was designed according to the anatomy of the vena cava. A flow bench was built including a lower reservoir holding the caval tree, built by taking into account the main afferent vessels and their flow provided by a reservoir 6 cm above. A cannula was inserted in this caval tree and connected to a centrifugal pump that, in turn, was connected to a reservoir positioned 83 cm above the second lower reservoir (after-load = 60 mmHg). Using the same pre-load, the simulated venous drainage for cardiopulmonary bypass was performed using a 24 F wall-less cannula (Smartcanula) and a 25 F percutaneous cannula (Biomedicus), with stepwise augmentation of venous drainage (1500, 2000 and 2500 RPM). RESULTS: For the thin-wall and the wall-less cannulas, 36 pairs of flow and pressure measurements were obtained for three different RPM values. The mean Q-values at 1500, 2000 and 2500 RPM were: 3.98 ± 0.01, 6.27 ± 0.02 and 9.81 ± 0.02 l/min for the wall-less cannula (P <0.0001), versus 2.74 ± 0.02, 3.06 ± 0.05 and 6.78 ± 0.02 l/min for the thin-wall cannula (P <0.0001). The corresponding inlet pressure values were: -8.88 ± 0.01, -23.69 ± 0.81 and -70.22 ± 0.18 mmHg for the wall-less cannula (P <0.0001), versus -36.69 ± 1.88, -80.85 ± 1.71 and -101.83 ± 0.45 mmHg for the thin-wall cannula (P <0.0001). The thin-wall cannula showed mean Q-values 37% less and mean P values 26% more when compared with the wall-less cannula (P <0.0001).
CONCLUSIONS: Our in vitro water test was able to mimic a negative pressure situation, where the wall-less cannula design performs better compared with the traditional thin-wall cannula.
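The summary statement that the thin-wall cannula drained about 37% less flow on average follows directly from the per-RPM means reported in the results. A quick arithmetic check:

```python
# Mean flows (l/min) reported at 1500, 2000 and 2500 RPM.
wall_less = [3.98, 6.27, 9.81]   # 24 F wall-less cannula (Smartcanula)
thin_wall = [2.74, 3.06, 6.78]   # 25 F thin-wall cannula (Biomedicus)

mean_wl = sum(wall_less) / len(wall_less)
mean_tw = sum(thin_wall) / len(thin_wall)
deficit = (1 - mean_tw / mean_wl) * 100
print(f"thin-wall drains {deficit:.0f}% less on average")  # -> 37% less
```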