79 results for interactive referents
Abstract:
This article reviews the benefits and pitfalls of adolescents' use of new technologies. Among the positive aspects are instant access to educational information, the possibility to connect with peers and develop social exchanges, and access to preventive content in the fields of health, substance use and sexuality. The risks are linked to inexperience and to adolescents' difficulty in assessing the validity of information collected on the Internet, as well as a lack of insight into the consequences of posting information or images on the web. The threat of Internet addiction or of violence induced by the Internet is probably overestimated by many adults but should be taken into account. Parents should monitor their adolescents' activity in an open and interactive way. Moreover, any adolescent medical check-up should touch on the use of new technologies.
Abstract:
We diagnosed non-small cell lung carcinoma in a 49-year-old female patient, with a histopathological diagnosis of stage IIIB mixed bronchioloalveolar and papillary adenocarcinoma with an extensive micropapillary component that was not visualized on preoperative multimodality imaging with positron emission tomography (PET) and computed tomography (CT). The micropapillary component, characterized by a unique growth pattern with particular morphological features, can be observed in all subtypes of lung adenocarcinoma. It is increasingly recognized as a distinct entity associated with higher aggressiveness. Even the most modern multimodality PET/CT imaging technology may fail to adequately visualize this important component, which has highly relevant prognostic implications. Thus, the pathologist needs to consciously look for a micropapillary component in the surgical specimen or in preoperative biopsies or cytology. This may have future treatment implications, as adjuvant or neoadjuvant chemotherapy may be of relevance even in the early stages of the disease.
Abstract:
A traditional photonic-force microscope (PFM) produces huge sets of data, which require tedious numerical analysis. In this paper, we propose instead an analog signal processor to attain real-time capabilities while retaining the richness of the traditional PFM data. Our system is devoted to intracellular measurements and is fully interactive through the use of a haptic joystick. Using our specialized analog hardware along with a dedicated algorithm, we can extract the full 3D stiffness matrix of the optical trap in real time, including the off-diagonal cross-terms. Our system is also capable of simultaneously recording data for subsequent offline analysis. This allows us to check that a good correlation exists between the classical analysis of stiffness and our real-time measurements. We monitor the PFM beads using an optical microscope. The force-feedback mechanism of the haptic joystick helps us interactively guide the bead inside living cells and collect information from its (possibly anisotropic) environment. The instantaneous stiffness measurements are also displayed in real time on a graphical user interface. The whole system has been built and is operational; here we present early results that confirm the consistency of the real-time measurements with offline computations.
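[Editorial note] A minimal offline sketch of the stiffness estimation described above, assuming the standard equipartition relation K = kB*T*C^-1, where C is the covariance matrix of bead displacements; this illustrates the quantity being measured, not the analog processor the paper proposes, and all numerical values are invented.

    import numpy as np

    kB, T = 1.380649e-23, 298.0  # Boltzmann constant (J/K), temperature (K)

    # Simulated bead displacements from the trap centre (m); a real PFM would
    # record these optically. The covariance values below are invented.
    rng = np.random.default_rng(0)
    positions = rng.multivariate_normal(
        mean=np.zeros(3),
        cov=np.diag([4e-16, 4e-16, 9e-16]),  # anisotropic trap, variances in m^2
        size=100_000,
    )

    C = np.cov(positions, rowvar=False)  # 3x3 covariance of displacements (m^2)
    K = kB * T * np.linalg.inv(C)        # stiffness matrix (N/m), incl. cross-terms
    print(np.round(K * 1e6, 2))          # printed in pN/um (1 N/m = 1e6 pN/um)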
Abstract:
Forensic science casework involves making a series of choices.
The difficulty in making these choices lies in the inevitable presence of uncertainty, the unique context of circumstances surrounding each decision and, in some cases, the complexity due to numerous, interrelated random variables. Given that these decisions can lead to serious consequences in the administration of justice, forensic decision making should be supported by a robust framework that makes inferences under uncertainty and decisions based on these inferences. The objective of this thesis is to respond to this need by presenting a framework for making rational choices in decision problems encountered by scientists in forensic science laboratories. Bayesian inference and decision theory meet the requirements for such a framework. To attain its objective, this thesis consists of three propositions, advocating the use of (1) decision theory, (2) Bayesian networks, and (3) influence diagrams for handling forensic inference and decision problems. The results present a uniform and coherent framework for making inferences and decisions in forensic science using the above theoretical concepts. They describe how to organize each type of problem by breaking it down into its different elements, and how to find the most rational course of action by distinguishing between one-stage and two-stage decision problems and applying the principle of expected utility maximization. To illustrate the framework's application to the problems encountered by scientists in forensic science laboratories, theoretical case studies apply decision theory, Bayesian networks and influence diagrams to a selection of different types of inference and decision problems dealing with different categories of trace evidence. Two studies of the two-trace problem illustrate how the construction of Bayesian networks can handle complex inference problems, and thus overcome the hurdle of complexity that can be present in decision problems. Three studies explain the application of decision theory and influence diagrams to each of these decisions: one on what to conclude when a database search provides exactly one hit, one on what genotype to search for in a database based on observations made on DNA typing results, and one on whether to submit a fingermark to the process of comparing it with prints of its potential sources. The results of the theoretical case studies support the thesis's three propositions. Hence, this thesis presents a uniform framework for organizing and finding the most rational course of action in decision problems encountered by scientists in forensic science laboratories. The proposed framework is an interactive and exploratory tool for better understanding a decision problem so that this understanding may lead to better informed choices.
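[Editorial note] A minimal sketch of the principle of expected utility maximization that the framework applies to one-stage decision problems; the decision problem, action names, probabilities and utilities below are invented for illustration and are not taken from the thesis.

    # Posterior probabilities of two hypothetical states of nature.
    posterior = {"source": 0.9, "not_source": 0.1}

    # Utility of each (action, state) pair on an arbitrary 0-1 scale (invented).
    utility = {
        ("report_match", "source"): 1.0,
        ("report_match", "not_source"): 0.0,   # false attribution: worst outcome
        ("report_inconclusive", "source"): 0.4,
        ("report_inconclusive", "not_source"): 0.8,
    }

    def expected_utility(action):
        return sum(posterior[s] * utility[(action, s)] for s in posterior)

    actions = ["report_match", "report_inconclusive"]
    for a in actions:
        print(a, round(expected_utility(a), 3))
    print("best action:", max(actions, key=expected_utility))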
Abstract:
Objective: To test the efficacy of teaching motivational interviewing (MI) to medical students. Methods: Thirteen 4th-year medical students volunteered to participate. Seven days before and 7 days after an 8-hour interactive MI training workshop, each student performed a video-recorded interview with two standardized patients: a 60-year-old alcohol-dependent woman and a 50-year-old cigarette-smoking man. Students' counseling skills were coded by two blinded clinicians using the Motivational Interviewing Treatment Integrity 3.0 (MITI). Inter-rater reliability was calculated for all interviews, and a test-retest was completed in a sub-sample of 10 consecutive interviews three days apart. Differences between MITI scores before and after training were calculated and tested using non-parametric tests. Effect size was approximated by calculating the probability that post-test scores are greater than pre-test scores (P* = P(Pre<Post) + 1/2 P(Pre=Post)), with P* > 1/2 indicating greater post-test scores, P* = 1/2 no effect, and P* < 1/2 smaller post-test scores. Results: Median differences between MITI scores before and after MI training indicated a general progression in MI skills: MI spirit global score (median difference = 1.5, inter-quartile range = 1.5, p<0.001, P* = 0.90); Empathy global score (med diff = 1, IQR = 0.5, p<0.001, P* = 0.85); percentage of MI-adherent skills (med diff = 36.6, IQR = 50.5, p<0.001, P* = 0.85); percentage of open questions (med diff = 18.6, IQR = 21.6, p<0.001, P* = 0.96); reflections/questions ratio (med diff = 0.2, IQR = 0.4, p<0.001, P* = 0.81). Only the Direction global score and the percentage of complex reflections were not significantly improved (med diff = 0, IQR = 1, p = 0.53, P* = 0.44, and med diff = 4.3, IQR = 24.8, p = 0.48, P* = 0.62, respectively). For inter-rater reliability, weighted kappa ranged from 0.14 (Direction) to 0.51 (Collaboration), and ICC ranged from 0.28 (Simple reflection) to 0.95 (Closed question). For test-retest, weighted kappa ranged from 0.27 (Direction) to 0.80 (Empathy), and ICC ranged from 0.87 (Complex reflection) to 0.98 (Closed question). Conclusion: This pilot study indicated that an 8-hour MI training for volunteer 4th-year medical students resulted in significant improvement of MI skills. Larger samples of unselected medical students should be studied to generalize the benefit of MI training to medical students. Inter-rater reliability and test-retest results suggested that coders' training should be intensified.
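[Editorial note] A short sketch of the effect-size measure P* defined above, computed from paired pre/post scores; the score values are invented for illustration.

    # P* = P(Pre < Post) + 0.5 * P(Pre = Post), estimated from paired scores.
    pre  = [2.0, 2.5, 3.0, 2.0, 3.5, 2.5, 3.0]   # invented pre-training scores
    post = [3.5, 4.0, 3.0, 3.5, 4.5, 3.0, 4.0]   # invented post-training scores

    def p_star(pre, post):
        pairs = list(zip(pre, post))
        greater = sum(1 for a, b in pairs if a < b)
        equal = sum(1 for a, b in pairs if a == b)
        return (greater + 0.5 * equal) / len(pairs)

    print(round(p_star(pre, post), 2))  # > 0.5 indicates higher post-test scores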
Abstract:
A major challenge in this era of rapid climate change is to predict changes in species distributions and their impacts on ecosystems, and, if necessary, to recommend management strategies for maintenance of biodiversity or ecosystem services. Biological invasions, studied in most biomes of the world, can provide useful analogs for some of the ecological consequences of species distribution shifts in response to climate change. Invasions illustrate the adaptive and interactive responses that can occur when species are confronted with new environmental conditions. Invasion ecology complements climate change research and provides insights into the following questions: i) how will species distributions respond to climate change? ii) how will species movement affect recipient ecosystems? and iii) should we, and if so how can we, manage species and ecosystems in the face of climate change? Invasion ecology demonstrates that a trait-based approach can help to predict spread speeds and impacts on ecosystems, and has the potential to predict climate change impacts on species ranges and recipient ecosystems. However, there is a need to analyse traits in the context of life-history and demography, the stage in the colonisation process (e.g., spread, establishment or impact), the distribution of suitable habitats in the landscape, and the novel abiotic and biotic conditions under which those traits are expressed. As is the case with climate change, invasion ecology is embedded within complex societal goals. Both disciplines converge on similar questions of "when to intervene?" and "what to do?" which call for a better understanding of the ecological processes and social values associated with changing ecosystems.
Abstract:
The object of game theory lies in the analysis of situations where different social actors have conflicting requirements and where their individual decisions will all influence the global outcome. In this framework, several games have been invented to capture the essence of various dilemmas encountered in many common important socio-economic situations. Even though these games often succeed in helping us understand human or animal behavior in interactive settings, some experiments have shown that people tend to cooperate with each other in situations for which classical game theory strongly recommends them to do the exact opposite. Several mechanisms have been invoked to try to explain the emergence of this unexpected cooperative attitude. Among them, repeated interaction, reputation, and belonging to a recognizable group have often been mentioned. However, the work of Nowak and May (1992) showed that the simple fact of arranging the players according to a spatial structure and only allowing them to interact with their immediate neighbors is sufficient to sustain a certain amount of cooperation even when the game is played anonymously and without repetition. Nowak and May's study and much of the following work were based on regular structures such as two-dimensional grids. Axelrod et al. (2002) showed that by randomizing the choice of neighbors, i.e. by actually giving up a strictly local geographical structure, cooperation can still emerge, provided that the interaction patterns remain stable in time. This is a first step towards a social network structure. However, following pioneering work by sociologists in the sixties such as that of Milgram (1967), in the last few years it has become apparent that many social and biological interaction networks, and even some technological networks, have particular, and partly unexpected, properties that set them apart from regular or random graphs. Among other things, they usually display broad degree distributions and show small-world topological structure. Roughly speaking, a small-world graph is a network where any individual is relatively close, in terms of social ties, to any other individual, a property also found in random graphs but not in regular lattices. However, in contrast with random graphs, small-world networks also have a certain amount of local structure, as measured, for instance, by a quantity called the clustering coefficient. In the same vein, many real conflicting situations in economics and sociology are well described neither by a fixed geographical position of the individuals in a regular lattice, nor by a random graph. Furthermore, it is well known that network structure can strongly influence dynamical phenomena such as the way diseases spread across a population and ideas or information get transmitted. Therefore, in the last decade, research attention has naturally shifted from random and regular graphs towards better models of social interaction structures. The primary goal of this work is to discover whether or not the underlying graph structure of real social networks could explain why one finds higher levels of cooperation in populations of human beings or animals than what classical game theory prescribes. To meet this objective, I start by thoroughly studying a real scientific coauthorship network and showing how it differs from biological or technological networks using diverse statistical measures.
Furthermore, I extract and describe its community structure, taking into account the intensity of collaborations. I then investigate the temporal evolution of the network, from its inception to its state at the time of the study in 2006, suggesting an effective view of it as opposed to a purely historical one. Thereafter, I combine evolutionary game theory with several network models, along with the studied coauthorship network, in order to highlight which specific network properties foster cooperation and to shed light on the mechanisms that maintain it. I point out that, to resist defection, cooperators take advantage, whenever possible, of the degree heterogeneity of social networks and of their underlying community structure. Finally, I show that the level and stability of cooperation depend not only on the game played, but also on the evolutionary dynamic rules used and on how individual payoffs are calculated.
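[Editorial note] A minimal simulation sketch of the spatial-cooperation effect discussed above: a prisoner's dilemma on a Watts-Strogatz small-world graph with an "imitate the best neighbor" update rule. The payoff values and update rule are common textbook choices, not the thesis's exact model.

    import random
    import networkx as nx

    random.seed(1)
    G = nx.watts_strogatz_graph(n=200, k=4, p=0.1)   # small-world topology
    R, S, T, P = 3, 0, 5, 1                          # standard PD payoffs (T > R > P > S)
    strategy = {v: random.choice(["C", "D"]) for v in G}

    def payoff(a, b):
        return {("C", "C"): R, ("C", "D"): S, ("D", "C"): T, ("D", "D"): P}[(a, b)]

    for generation in range(50):
        # Each player plays the PD once with every neighbor and sums the payoffs.
        score = {v: sum(payoff(strategy[v], strategy[u]) for u in G[v]) for v in G}
        # Synchronous update: copy the strategy of the best-scoring neighbor
        # (including oneself), one of several common evolutionary rules.
        strategy = {v: strategy[max(list(G[v]) + [v], key=score.get)] for v in G}

    print("fraction of cooperators:",
          sum(s == "C" for s in strategy.values()) / len(G))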
Abstract:
This dissertation focuses on the strategies consumers use when making purchase decisions. It is organized in two main parts, one centering on descriptive and the other on applied decision making research. In the first part, a new process tracing tool called InterActive Process Tracing (IAPT) is presented, which I developed to investigate the nature of consumers' decision strategies. This tool is a combination of several process tracing techniques, namely Active Information Search, Mouselab, and retrospective verbal protocol. To validate IAPT, two experiments on mobile phone purchase decisions were conducted in which participants first repeatedly chose a mobile phone and then were asked to formalize their decision strategy so that it could be used to make choices for them. The choices made by the identified strategies correctly predicted the observed choices in 73% (Experiment 1) and 67% (Experiment 2) of the cases. Moreover, in Experiment 2, Mouselab and eye tracking were directly compared with respect to their impact on information search and strategy description. Only minor differences were found between these two methods. I conclude that IAPT is a useful research tool to identify choice strategies, and that using eye tracking technology did not increase its validity beyond that gained with Mouselab. In the second part, a prototype of a decision aid is introduced that was developed building in particular on the knowledge about consumers' decision strategies gained in Part I. This decision aid, called the InterActive Choice Aid (IACA), systematically assists consumers in their purchase decisions. To evaluate the prototype's perceived utility, an experiment was conducted in which IACA was compared to two other prototypes that were based on real-world consumer decision aids. All three prototypes differed in the number and type of tools they provided to facilitate the process of choosing, ranging from low (Amazon) to medium (Sunrise/dpreview) to high functionality (IACA). Overall, participants slightly preferred the prototype of medium functionality, and this prototype was also rated best on the dimensions of understandability and ease of use. IACA was rated best regarding the two dimensions of ease of elimination and ease of comparison of alternatives. Moreover, participants' choices were more in line with the normatively oriented weighted additive strategy when they used IACA than when they used the medium-functionality prototype. The low-functionality prototype was the least preferred overall. It is concluded that consumers can and will benefit from highly functional decision aids like IACA, but only when these systems are easy to understand and to use.
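[Editorial note] A minimal sketch of the normatively oriented weighted additive (WADD) strategy mentioned above: each alternative is scored as the weighted sum of its attribute values. The attribute names, weights and values are invented for illustration.

    weights = {"price": 0.5, "camera": 0.3, "battery": 0.2}   # invented importance weights
    phones = {
        "phone_A": {"price": 0.9, "camera": 0.4, "battery": 0.7},
        "phone_B": {"price": 0.6, "camera": 0.8, "battery": 0.5},
    }

    def wadd(option):
        # Weighted additive score: sum of weight * normalized attribute value.
        return sum(weights[a] * option[a] for a in weights)

    scores = {name: round(wadd(attrs), 2) for name, attrs in phones.items()}
    print(scores, "->", max(scores, key=scores.get))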
Abstract:
OBJECTIVES: The reconstruction of the right ventricular outflow tract (RVOT) with valved conduits remains a challenge. The reoperation rate at 5 years can be as high as 25% and depends on age, type of conduit, conduit diameter and principal heart malformation. The aim of this study is to provide a bench model with computational fluid dynamics to analyse the haemodynamics of the RVOT, pulmonary artery, its bifurcation, and left and right pulmonary arteries, which in the future may serve as a tool for analysis and prediction of outcome following RVOT reconstruction. METHODS: Pressure, flow and diameter at the RVOT, pulmonary artery, bifurcation of the pulmonary artery, and left and right pulmonary arteries were measured in five normal pigs with a mean weight of 24.6 ± 0.89 kg. The data obtained were used for a 3D computational fluid-dynamics simulation of flow conditions, focusing on the pressure, flow and shear-stress profile from the pulmonary trunk to the level of the left and right pulmonary arteries. RESULTS: Three steady inlet flow profiles were obtained at 0.2, 0.29 and 0.36 m/s, corresponding to flow rates of 1.5, 2.0 and 2.5 l/min at the RVOT. The flow velocity profile was constant from the RVOT down to the bifurcation and decreased at the left and right pulmonary arteries. In all three inlet velocity profiles, low shear-stress and low-velocity areas were detected along the left wall of the pulmonary artery, at the pulmonary artery bifurcation and at the ostia of both pulmonary arteries. CONCLUSIONS: This real-time computational fluid model provides us with a realistic picture of fluid dynamics in the pulmonary tract area. Low shear-stress areas correspond to a turbulent flow profile, which is a predictive factor for the development of vessel-wall arteriosclerosis. We believe that this bench model may be a useful tool for further evaluation of RVOT pathology following surgical reconstruction.
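[Editorial note] A small sanity-check sketch (not part of the study) relating the reported flow rates to the inlet velocities via v = Q/A for a circular cross-section; the vessel diameter is an assumed, illustrative value chosen to roughly reproduce the first data point.

    import math

    diameter_m = 0.0126                        # assumed RVOT diameter (m), invented
    area_m2 = math.pi * (diameter_m / 2) ** 2  # circular cross-sectional area (m^2)

    for q_lpm in (1.5, 2.0, 2.5):
        q_m3s = q_lpm / 1000 / 60              # l/min -> m^3/s
        v = q_m3s / area_m2                    # mean velocity v = Q / A
        print(f"{q_lpm} l/min -> {v:.2f} m/s")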
Abstract:
This paper identifies selected issues and lessons learned from the implementation of a national program of prevention and control of non-communicable diseases (NCD) during the past 20 years in the Seychelles, a small island state in the African region. As early as 1989, population-based surveys demonstrated high levels of several cardiovascular risk factors, which prompted an organized response by the government. The early creation of an NCD unit within the Ministry of Health, coupled with cooperation with international partners, enabled incremental capacity building and coherent development of NCD programs and policy. Information campaigns and screening for hypertension and diabetes in workplaces and public places raised awareness and rallied increasingly broad support for NCD prevention and control. A variety of interventions were organized for tobacco control, and comprehensive tobacco control legislation was enacted in 2009 (including total bans on tobacco advertising and on smoking in all enclosed public and work places). A recent School Nutrition Policy prohibits the sale of soft drinks in schools. At the primary health care level, guidelines were developed for the management of hypertension and diabetes (these conditions are managed in all health centers within a national health system); regular interactive education sessions were organized for groups of high-risk patients ("heart health club"); and specialized "NCD nurses" were trained. The decreasing prevalence of smoking is evidence of success, but the rising "diabesity" epidemic calls for strengthened health care for high-risk patients and broader multisectoral policy to mould an environment conducive to healthy behaviors. Key components of NCD prevention and control in the Seychelles include effective surveillance mechanisms supplemented by focused research; generating broad interest in and consensus on the need for prevention and control of NCD; mobilizing leadership and commitment at all levels; involving local and international expertise; building on existing efforts; and seeking integrated, multi-disciplinary and multisectoral approaches.
Abstract:
This paper presents a pilot project to reinforce participatory practices in standardization. The INTERNORM project is funded by the University of Lausanne, Switzerland. It aims to create an interactive knowledge center based on the sharing of academic skills and the experience accumulated by civil society, especially consumer associations, environmental associations and trade unions, to strengthen the participatory process of standardization. The first objective of the project is action-oriented: INTERNORM provides a common knowledge pool supporting the participation of civil society actors in international standard-setting activities by bringing them together with academic experts in working groups and by providing logistic and financial support for their participation in meetings of national and international technical committees. The second objective of the project is analytical: the standardization action initiated through INTERNORM provides a research field for a better understanding of the participatory dynamics underpinning international standardization. The paper presents three incentives that explain civil society (non-)involvement in standardization and that go beyond conventional resource-based hypotheses: an operational incentive, related to the use of standards in the selective goods provided by associations to their membership; a thematic incentive, provided by the setting of priorities by strategic committees created in some standardization organizations; and a rhetorical incentive, related to the discursive resource that civil society concerns offer to the different stakeholders.
Abstract:
The Microbe browser is a web server providing comparative microbial genomics data. It offers comprehensive, integrated data from GenBank, RefSeq, UniProt, InterPro, Gene Ontology and the Orthologs Matrix Project (OMA) database, displayed along with gene predictions from five software packages. The Microbe browser is updated daily from the source databases and includes all completely sequenced bacterial and archaeal genomes. The data are displayed in an easy-to-use, interactive website based on Ensembl software. The Microbe browser is available at http://microbe.vital-it.ch/. Programmatic access is available through the OMA application programming interface (API) at http://microbe.vital-it.ch/api.
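[Editorial note] A hypothetical sketch of programmatic access via the API base URL given in the abstract; the endpoint path and the JSON response format are assumptions made for illustration, not documented behaviour of the service.

    import json
    import urllib.request

    BASE = "http://microbe.vital-it.ch/api"  # API base URL from the abstract

    def fetch(path):
        # Plain HTTP GET returning parsed JSON; response format is an assumption.
        with urllib.request.urlopen(BASE + path) as resp:
            return json.loads(resp.read().decode())

    # "/genomes" is an invented endpoint used purely for illustration;
    # consult the OMA API documentation for the real paths.
    print(fetch("/genomes"))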
Abstract:
Adequate in-vitro training in valved stent deployment, as well as testing of such devices, requires compliant real-size models of the human aortic root. The casting methods used up to now are multi-step, time-consuming and complicated. We pursued the goal of building a flexible 3D model in a single-step procedure. We created a precise 3D CAD model of a human aortic root using previously published anatomical and geometrical data and printed it using a novel rapid prototyping system developed by the Fab@Home project. As the material for 3D fabrication we used common household silicone, and afterwards dip-coated several models with dispersion silicone one or two times. To assess production precision, we compared the size of the final product with the CAD model. Compliance of the models was measured and compared with native porcine aortic root. Total fabrication time was 3 h and 20 min. Dip-coating one or two times with dispersion silicone, if applied, took one or two extra days, respectively. The error in dimensions of the non-coated aortic root model compared to the CAD design was <3.0% along the X- and Y-axes and 4.1% along the Z-axis. Compliance of a non-coated model, as judged by the change of radius in the radial direction (16.39%), is significantly different (P<0.001) from that of native aortic tissue (23.54%) at a pressure of 80-100 mmHg. Rapid prototyping of compliant, life-size anatomical models with the Fab@Home 3D printer is feasible; it is very quick compared to previous casting methods.
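[Editorial note] A minimal sketch of the two simple metrics reported above, dimensional error against the CAD design and compliance as percent radial distension over a pressure step; all input numbers are invented example values.

    def dimension_error_pct(printed, cad):
        # Relative deviation of the printed model from the CAD design, in percent.
        return abs(printed - cad) / cad * 100.0

    def radial_compliance_pct(r_low, r_high):
        # Percent increase of radius over a pressure step (e.g. 80-100 mmHg).
        return (r_high - r_low) / r_low * 100.0

    print(round(dimension_error_pct(31.2, 30.0), 1), "% dimensional error")
    print(round(radial_compliance_pct(12.2, 14.2), 2), "% radial distension")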
Abstract:
Schizotypy refers to a constellation of personality traits that are believed to mirror the subclinical expression of schizophrenia in the general population. Evidence from pharmacological studies indicates that dopamine is involved in the aetiology of schizophrenia. Based on the assumption of a continuum between schizophrenia and schizotypy, researchers have begun investigating the association between dopamine and schizotypy using a wide range of methods. In this article, we review published studies on this association from the following areas of work: (1) experimental investigations of the interactive effects of dopaminergic challenges and schizotypy on cognition, motor control and behaviour, (2) dopaminergically supported cognitive functions, (3) studies of associations between schizotypy and polymorphisms in genes involved in dopaminergic neurotransmission, and (4) molecular imaging studies of the association between schizotypy and markers of the dopamine system. Together, data from these lines of evidence suggest that dopamine is important to the expression and experience of schizotypy and associated behavioural biases. An important observation is that the experimental designs, methods, and manipulations used in this research are highly heterogeneous. Future studies are required to replicate individual observations, to elucidate the link between dopamine and different schizotypy dimensions (positive, negative, cognitive disorganisation), and to guide the search for solid dopamine-sensitive behavioural markers. Such studies are important in order to clarify inconsistencies between existing studies. More work is also needed to identify differences between the dopaminergic alterations seen in schizotypy and the dysfunctions observed in schizophrenia.
Abstract:
OBJECTIVES: A new caval tree system was designed for realistic in vitro simulation. The objective of our study was to assess cannula performance for virtually wall-less versus standard percutaneous thin-walled venous cannulas in a setting of venous collapse under negative pressure. METHODS: For a collapsible caval model, a very flexible plastic material was selected, and a model with nine afferent veins was designed according to the anatomy of the vena cava. A flow bench was built, including a lower reservoir holding the caval tree, which was constructed by taking into account the main afferent vessels and their flow, supplied by a reservoir 6 cm above. A cannula was inserted in this caval tree and connected to a centrifugal pump that, in turn, was connected to a reservoir positioned 83 cm above the second, lower reservoir (after-load = 60 mmHg). Using the same pre-load, simulated venous drainage for cardiopulmonary bypass was realized using a 24 F wall-less cannula (Smartcanula) and a 25 F percutaneous cannula (Biomedicus), with stepwise augmentation of venous drainage (1500, 2000 and 2500 RPM). RESULTS: For the thin-wall and the wall-less cannulas, 36 pairs of flow and pressure measurements were recorded at the three RPM settings. The mean Q values at 1500, 2000 and 2500 RPM were 3.98 ± 0.01, 6.27 ± 0.02 and 9.81 ± 0.02 l/min for the wall-less cannula (P <0.0001), versus 2.74 ± 0.02, 3.06 ± 0.05 and 6.78 ± 0.02 l/min for the thin-wall cannula (P <0.0001). The corresponding inlet pressure values were -8.88 ± 0.01, -23.69 ± 0.81 and -70.22 ± 0.18 mmHg for the wall-less cannula (P <0.0001), versus -36.69 ± 1.88, -80.85 ± 1.71 and -101.83 ± 0.45 mmHg for the thin-wall cannula (P <0.0001). Overall, the thin-wall cannula showed mean Q values 37% lower and mean P values 26% higher than the wall-less cannula (P <0.0001). CONCLUSIONS: Our in vitro water test was able to mimic a negative-pressure situation in which the wall-less cannula design performs better than the traditional thin-wall cannula.
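[Editorial note] A quick sketch (not from the paper) that recomputes the flow comparison from the mean Q values quoted above; the average reduction across the three pump speeds comes out near the reported 37%.

    rpm = (1500, 2000, 2500)
    q_wallless = (3.98, 6.27, 9.81)   # mean flow, wall-less cannula (l/min)
    q_thinwall = (2.74, 3.06, 6.78)   # mean flow, thin-wall cannula (l/min)

    reductions = [1 - t / w for t, w in zip(q_thinwall, q_wallless)]
    for r, red in zip(rpm, reductions):
        print(r, "RPM:", round(red * 100, 1), "% less flow")
    print("mean:", round(sum(reductions) / len(reductions) * 100, 1), "% less flow")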