994 results for Computer Experiments
Abstract:
A sample of Biomphalaria amazonica from Porto Velho, Rondônia state, was exposed to miracidia of Schistosoma mansoni (SJ2 strain) from São José dos Campos, São Paulo state (five miracidia per snail). Water freshly collected from the snails' breeding site was used to ensure that water quality was compatible with hatching of the miracidia and their penetration into the snails. The resulting infection rate was 3.5%, as against 45% in B. tenagophila controls. In comparison with the controls, B. amazonica showed not only a lower infection rate but also a longer prepatent period and a lower cercarial production. These characteristics seem to indicate that it is a poor host of S. mansoni, like B. straminea; it should be noted, however, that the latter is nevertheless a recognized vector of the parasite in hyperendemic areas of northeastern Brazil. These results point to the possibility of introduction of schistosomiasis mansoni into the western Amazonian region, where B. amazonica is widespread.
Abstract:
The aim of this retrospective study was to compare the clinical and radiographic results after TKA (PFC, DePuy) performed either with computer-assisted navigation (CAS, Brainlab, Johnson&Johnson) or by conventional means.

Material and methods: Between May and December 2006 we reviewed 36 conventional TKAs performed between 2002 and 2003 (group A) and 37 navigated TKAs performed between 2005 and 2006 (group B) by the same experienced surgeon. The mean age in group A was 74 years (range 62-90) and 73 years (range 58-85) in group B, with a similar age distribution. The preoperative mechanical axes in group A ranged from -13° varus to +13° valgus (mean absolute deviation 6.83°, SD 3.86°); in group B they ranged from -13° to +16° (mean absolute deviation 5.35°, SD 4.29°). Patients with a previous tibial osteotomy or revision arthroplasty were excluded from the study. Examination was done by an experienced orthopedic resident independent of the surgeon. All patients had pre- and postoperative long standing radiographs. The IKSS and the WOMAC were used to determine the clinical outcome. Patients' degree of satisfaction was assessed on a visual analogue scale (VAS).

Results: 32 of the 37 navigated TKAs (86.5%) showed a postoperative mechanical axis within 3 degrees of valgus or varus deviation, compared with only 24 (66%) of the 36 conventional TKAs; this difference was significant (p = 0.045). The mean absolute deviation from the neutral axis was 3.00° (range -5° to +9°, SD 1.75°) in group A compared with 1.54° (range -5° to +4°, SD 1.41°) in group B, a highly significant difference (p < 0.001). Furthermore, both groups showed a significant postoperative improvement of their mean IKSS values (group A: 89 preoperatively to 169 postoperatively; group B: 88 to 176) without a significant difference between the two groups. Neither the WOMAC nor the patients' degree of satisfaction, as assessed by VAS, showed significant differences. Operating time was significantly longer in group B (mean 119.9 min) than in group A (mean 99.6 min, p < 0.001).

Conclusion: Our study showed a consistent, significant improvement of postoperative frontal alignment in TKA performed with computer-assisted navigation (CAS) compared with standard methods, even in the hands of a surgeon well experienced in standard TKA implantation. However, the follow-up time of this study was not long enough to judge differences in clinical outcome. Thus, the relevance of computer navigation for clinical outcome and survival of TKA remains to be proved in long-term studies to justify the longer operating time.
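To make the alignment comparison above concrete, here is a minimal sketch of how the reported 2x2 outcome (32 of 37 navigated versus 24 of 36 conventional knees within 3° of neutral) could be tested. The use of Fisher's exact test and of SciPy is an assumption for illustration; the abstract does not state which test produced p = 0.045.

```python
# Minimal sketch: comparing the proportion of well-aligned knees (within +/-3 degrees)
# between navigated and conventional TKA groups. The choice of Fisher's exact test is
# illustrative, not necessarily the test used in the study.
from scipy.stats import fisher_exact

# rows: group B (navigated), group A (conventional); columns: within 3 deg, outside 3 deg
table = [[32, 5],
         [24, 12]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```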
Abstract:
With the aid of the cobalt labelling technique, frog spinal cord motor neuron dendrites of the subpial dendritic plexus have been identified in serial electron micrographs. Computer reconstructions of various lengths (2.5-9.8 micron) of dendritic segments showed the contours of these dendrites to be highly irregular, and to present many thorn-like projections 0.4-1.8 micron long. Number, size and distribution of synaptic contacts were also determined. Almost half of the synapses occurred at the origins of the thorns and these synapses had the largest contact areas. Only 8 out of 54 synapses analysed were found on thorns and these were the smallest. For the total length of reconstructed dendrites there was, on average, one synapse per 1.2 micron, while 4.4% of the total dendritic surface was covered with synaptic contacts. The functional significance of these distal dendrites and their capacity to influence the soma membrane potential is discussed.
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold-standard TDM approach but require computational assistance. In recent decades computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information-system interfacing, user friendliness, data storage capability and report generation.
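As an illustration of the Bayesian a posteriori adjustment that most of the reviewed tools perform, the following sketch computes a maximum a posteriori estimate of clearance for a hypothetical one-compartment model at steady state and derives a new infusion rate for a target concentration. The drug model, prior and numbers are invented for illustration and are not taken from any of the reviewed programs.

```python
# Minimal sketch of Bayesian (a posteriori) dose individualization, assuming a
# one-compartment model at steady state under continuous infusion: Css = rate / CL.
# The prior on clearance (CL) and the residual error model are hypothetical.
import numpy as np

rate = 100.0              # current infusion rate (mg/h), hypothetical
observed_css = 18.0       # measured steady-state concentration (mg/L)
target_css = 12.0         # desired concentration (mg/L)

cl_pop, cl_omega = 6.0, 0.3     # population clearance (L/h) and log-SD (prior)
sigma = 0.15                    # proportional residual error (log scale)

cl_grid = np.linspace(1.0, 20.0, 2000)
pred = rate / cl_grid
# log-posterior = log-likelihood (lognormal residual) + log-prior (lognormal on CL)
log_lik = -0.5 * ((np.log(observed_css) - np.log(pred)) / sigma) ** 2
log_prior = -0.5 * ((np.log(cl_grid) - np.log(cl_pop)) / cl_omega) ** 2
cl_map = cl_grid[np.argmax(log_lik + log_prior)]

new_rate = target_css * cl_map  # rate needed to reach the target concentration
print(f"MAP clearance = {cl_map:.2f} L/h, suggested rate = {new_rate:.1f} mg/h")
```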
Abstract:
Report for the scientific sojourn at the Swiss Federal Institute of Technology Zurich, Switzerland, between September and December 2007. In order to make robots useful assistants in our everyday life, the ability to learn and recognize objects is of essential importance. However, object recognition in real scenes is one of the most challenging problems in computer vision, as it requires dealing with many difficulties. Furthermore, mobile robotics adds a new challenge to the list: computational complexity. In a dynamic world, information about the objects in the scene can become obsolete before it is ready to be used if the detection algorithm is not fast enough. Two recent object recognition techniques have achieved notable results: the constellation approach proposed by Lowe and the bag-of-words approach proposed by Nistér and Stewénius. The Lowe constellation approach is the one currently used for robot localization within the COGNIRON project. This report is divided into two main sections. The first section is devoted to a brief review of the object recognition system currently in use, the Lowe approach, and brings to light the drawbacks found for object recognition in the context of indoor mobile robot navigation; the proposed improvements to the algorithm are also described. In the second section the alternative bag-of-words method is reviewed, together with several experiments conducted to evaluate its performance on our own object databases, and some modifications to the original algorithm are proposed to make it suitable for object detection in unsegmented images.
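For orientation, here is a minimal sketch of the bag-of-visual-words idea reviewed in the second section: local descriptors are clustered into a visual vocabulary and each image is represented by a histogram of visual-word occurrences. Random vectors stand in for real SIFT-like descriptors, and the vocabulary size is an arbitrary choice, not one taken from the report.

```python
# Minimal sketch of the bag-of-visual-words pipeline: cluster local descriptors into a
# visual vocabulary, then describe each image as a histogram of visual-word counts.
# Random vectors stand in for real SIFT-like descriptors; the vocabulary size is arbitrary.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
train_descriptors = rng.normal(size=(5000, 128))   # pooled descriptors from training images
vocabulary = KMeans(n_clusters=100, n_init=5, random_state=0).fit(train_descriptors)

def bow_histogram(image_descriptors: np.ndarray) -> np.ndarray:
    """Quantize one image's descriptors against the vocabulary; return a normalized histogram."""
    words = vocabulary.predict(image_descriptors)
    hist = np.bincount(words, minlength=vocabulary.n_clusters).astype(float)
    return hist / hist.sum()

query = rng.normal(size=(300, 128))                # descriptors from one query image
print(bow_histogram(query)[:10])
```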
Abstract:
This study investigates the issue of self-selection of stakeholders into participation and collaboration in policy-relevant experiments. We document and test the implications of self-selection in the context of a randomised policy experiment we conducted in primary schools in the UK. The main questions we ask are (1) is there evidence of selection on key observable characteristics likely to matter for the outcome of interest, and (2) does selection matter for the estimates of treatment effects. The experimental work consists of testing the effects of an intervention aimed at encouraging children to make more healthy choices at lunch. We recruited schools through local authorities and randomised schools across two incentive treatments and a control group. We document the selection taking place both at the level of local authorities and at the school level. Overall, we find mild evidence of selection on key observables such as obesity levels and socio-economic characteristics. We find evidence of selection along indicators of involvement in healthy-lifestyle programmes at the school level, but the magnitude is small. Moreover, we do not find significant differences in the treatment effects of the experiment across variables which, albeit to a mild degree, are correlated with selection into the experiment. To our knowledge, this is the first study providing direct evidence on the magnitude of self-selection in field experiments.
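A minimal sketch of the kind of balance check used to probe selection on observables is given below: a school-level covariate is compared between schools that joined the experiment and schools that declined. The data, the covariate and the test choice (Welch's t-test) are hypothetical and purely illustrative.

```python
# Minimal sketch of a balance check for selection on observables: compare a school-level
# covariate (e.g., baseline obesity rate) between schools that joined the experiment and
# schools that declined. The data below are simulated, not from the study.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
obesity_participating = rng.normal(0.18, 0.04, size=40)   # participating schools
obesity_declining = rng.normal(0.19, 0.04, size=60)       # schools that declined

t_stat, p_value = ttest_ind(obesity_participating, obesity_declining, equal_var=False)
print(f"difference in means = {obesity_participating.mean() - obesity_declining.mean():+.3f}, "
      f"p = {p_value:.3f}")
```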
Abstract:
How preferences respond to the varying stress of economic environments is a key question for behavioral economics and public policy. We conducted a laboratory experiment to investigate the effects of stress on financial decision making among individuals aged 50 and older. Using the cold pressor task as a physiological stressor, and a series of intelligence tests as cognitive stressors, we find that stress increases subjective discounting rates, has no effect on the degree of risk-aversion, and substantially lowers the effort individuals make to learn about financial decisions.
Abstract:
Type 2 diabetes mellitus (T2DM) is a major disease affecting nearly 280 million people worldwide. Whilst the pathophysiological mechanisms leading to disease are poorly understood, dysfunction of the insulin-producing pancreatic beta-cells is a key event in disease development. Monitoring the gene expression profiles of pancreatic beta-cells under several genetic or chemical perturbations has shed light on genes and pathways involved in T2DM. The EuroDia database has been established to build a unique collection of gene expression measurements performed on beta-cells of three organisms, namely human, mouse and rat. The Gene Expression Data Analysis Interface (GEDAI) has been developed to support this database. The quality of each dataset is assessed by a series of quality-control procedures to detect putative hybridization outliers. The system integrates a web interface to several standard analysis functions from R/Bioconductor to identify differentially expressed genes and pathways. It also allows the combination of multiple experiments performed on different array platforms of the same technology. The design of this system enables each user to rapidly design a custom analysis pipeline and thus produce their own list of genes and pathways. Raw and normalized data can be downloaded for each experiment. The flexible engine of this database (GEDAI) is currently used to handle gene expression data from several laboratory-run projects dealing with different organisms and platforms. Database URL: http://eurodia.vital-it.ch.
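Although GEDAI delegates its statistics to R/Bioconductor, the following Python sketch illustrates the kind of differential-expression analysis it orchestrates: a per-gene test between perturbed and control beta-cell samples followed by Benjamini-Hochberg correction. The expression values are simulated and the test choice is an assumption, not the pipeline actually used by the system.

```python
# Minimal sketch of a differential-expression analysis of the kind delegated to
# R/Bioconductor: a per-gene Welch t-test between perturbed and control samples,
# with Benjamini-Hochberg FDR correction. Expression values are simulated.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(42)
n_genes = 1000
control = rng.normal(8.0, 1.0, size=(n_genes, 5))   # log2 expression, 5 control arrays
treated = rng.normal(8.0, 1.0, size=(n_genes, 5))   # 5 perturbed arrays
treated[:50] += 1.5                                  # simulate 50 up-regulated genes

_, p = ttest_ind(treated, control, axis=1, equal_var=False)

# Benjamini-Hochberg adjustment: p_(i) * m / i, enforcing monotonicity from the top
order = np.argsort(p)
ranked = p[order] * n_genes / (np.arange(n_genes) + 1)
adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
significant = order[adjusted < 0.05]
print(f"{significant.size} genes called differentially expressed at FDR < 0.05")
```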
Abstract:
Freehand positioning of the femoral drill guide is difficult during hip resurfacing and the surgeon is often unsure of the implant position achieved peroperatively. The purpose of this study was to find out whether, by using a navigation system, acetabular and femoral component positioning could be made easier and more precise. Eighteen patients operated on by the same surgeon were matched by sex, age, BMI, diagnosis and ASA score (nine patients with computer assistance, nine with conventional instrumentation). Pre-operative planning was done on standard AP and axial radiographs, with CT scan views for the computer-assisted operations. The final position of implants was evaluated on the same radiographs for all patients. The follow-up was at least 1 year. No difference between the groups in terms of femoral component position was observed (p > 0.05). There was also no difference in femoral notching. A trend towards better cup position was observed for the navigated hips, especially for cup anteversion. There was no additional operating time for the navigated hips. Hip navigation for resurfacing surgery may allow improved visualisation and hip implant positioning, but its advantage will probably be more obvious with mini-incisions than with regular incision surgery.
Abstract:
The objective of this project is the study, simulation and deployment of a set of applications that provide control over potential problems that may occur on our network. This project is the solution to the problem of detecting faults in the operation of the networking infrastructures used by our clients.
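As an illustration of what such a fault-detection application might do at its simplest, the sketch below checks TCP reachability of a list of monitored hosts. The host names and ports are placeholders, and real deployments typically rely on SNMP/ICMP probes and dedicated monitoring tools rather than this minimal check.

```python
# Minimal sketch of a reachability check of the kind such monitoring applications perform:
# try a TCP connection to each monitored host/port and report failures. Hosts and ports
# are hypothetical placeholders.
import socket

monitored = [("router.example.local", 22), ("web.example.local", 80)]  # hypothetical targets

def is_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for host, port in monitored:
    status = "OK" if is_reachable(host, port) else "DOWN"
    print(f"{host}:{port} {status}")
```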
Abstract:
SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences based on steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unknown artifacts of the method. The sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some of the important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription, and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with the DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin-barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
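The following sketch illustrates, on invented data, the two analytical ideas described above: discarding sequence tags that fall into known artifactual hot-spot regions, and drawing an unbiased random subsample of the remaining tags. The blacklist intervals and tag positions are hypothetical and do not correspond to the genomic loci identified in the thesis.

```python
# Minimal sketch: (1) discard sequence tags that fall in known artifactual hot-spot
# regions before peak calling, and (2) draw an unbiased random subsample of the rest.
# Tag positions and the blacklist are simulated placeholders.
import random

random.seed(0)

# (chromosome, start, end) intervals flagged as artifactual tag accumulations (hypothetical)
blacklist = [("chr1", 1_000_000, 1_005_000), ("chr2", 500_000, 502_000)]

def keep_tag(chrom: str, pos: int) -> bool:
    """Return True if the tag lies outside every blacklisted hot-spot."""
    return not any(c == chrom and start <= pos < end for c, start, end in blacklist)

# simulated tag positions standing in for mapped sequence reads
tags = [("chr1", random.randint(0, 2_000_000)) for _ in range(100_000)]
filtered = [t for t in tags if keep_tag(*t)]

# unbiased random subsample of the filtered tags
sample = random.sample(filtered, k=min(10_000, len(filtered)))
print(f"{len(tags) - len(filtered)} tags removed, {len(sample)} tags kept in the subsample")
```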
Abstract:
Images obtained from high-throughput mass spectrometry (MS) contain information that remains hidden when looking at a single spectrum at a time. Image processing of liquid chromatography-MS datasets can be extremely useful for quality control, experimental monitoring and knowledge extraction. The importance of imaging in differential analysis of proteomic experiments has already been established through two-dimensional gels and can now be foreseen with MS images. We present MSight, new software designed to construct and manipulate MS images, as well as to facilitate their analysis and comparison.
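As a rough illustration of how an LC-MS dataset becomes an image, the sketch below bins simulated peak lists on the m/z axis and stacks the spectra along retention time into a 2-D intensity map. The axis ranges and peak lists are invented and do not reflect MSight's actual internals.

```python
# Minimal sketch of building an LC-MS "image": bin each spectrum's peaks on the m/z axis
# and stack the spectra along retention time into a 2-D intensity map. Peak lists are simulated.
import numpy as np

rng = np.random.default_rng(3)
mz_bins = np.linspace(400.0, 1600.0, 1201)         # m/z axis (1 Da bins), hypothetical range
n_spectra = 500                                     # one spectrum per retention-time step

image = np.zeros((n_spectra, mz_bins.size - 1))
for rt_index in range(n_spectra):
    mz = rng.uniform(400.0, 1600.0, size=200)       # peak m/z values of this spectrum
    intensity = rng.exponential(1.0, size=200)      # peak intensities
    hist, _ = np.histogram(mz, bins=mz_bins, weights=intensity)
    image[rt_index] = hist

# image[i, j] is the summed intensity at retention-time step i and m/z bin j
print(image.shape, image.max())
```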
Abstract:
The CIPA programme is a collaborative project involving two entomologists from France and seven South and Central American countries. Its objective is the development of an expert system for computer-aided identification of phlebotomine sandflies from the Americas. It also includes the creation of databases for bibliographic, taxonomic and biogeographic data. Participant consensus on taxonomic prerequisites, standardization of bibliographic data collections and selection of descriptive variables for the final programme has been established through continuous communication among participants and annual meetings. The adopted check-list of American sandflies presented here includes 386 specific taxa, ordered into genera and 28 sub-genera or species groups.
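A minimal sketch of matrix-based computer-aided identification is shown below: a specimen's coded character states are scored against a table of descriptive variables and candidate taxa are ranked by the number of matching states. The taxa, characters and states are hypothetical and are not taken from the CIPA check-list or expert system.

```python
# Minimal sketch of matrix-based computer-aided identification: score each candidate
# taxon by how many of the specimen's coded character states it matches.
# Taxa and descriptive variables below are hypothetical placeholders.
trait_matrix = {
    "Lutzomyia taxon A": {"spermatheca": "segmented", "wing_length": "short", "setae": "present"},
    "Lutzomyia taxon B": {"spermatheca": "smooth",    "wing_length": "long",  "setae": "absent"},
    "Lutzomyia taxon C": {"spermatheca": "segmented", "wing_length": "long",  "setae": "present"},
}

specimen = {"spermatheca": "segmented", "wing_length": "long", "setae": "present"}

def score(taxon_traits: dict) -> int:
    """Number of character states shared between the specimen and a candidate taxon."""
    return sum(taxon_traits.get(k) == v for k, v in specimen.items())

ranking = sorted(trait_matrix, key=lambda t: score(trait_matrix[t]), reverse=True)
for taxon in ranking:
    print(taxon, score(trait_matrix[taxon]))
```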
Abstract:
We aimed to determine whether human subjects' reliance on different sources of spatial information encoded in different frames of reference (i.e., egocentric versus allocentric) affects their performance, decision time and memory capacity in a short-term spatial memory task performed in the real world. Subjects were asked to play the Memory game (a.k.a. the Concentration game) without an opponent, in four different conditions that controlled for the subjects' reliance on egocentric and/or allocentric frames of reference for the elaboration of a spatial representation of the image locations enabling maximal efficiency. We report experimental data from young adult men and women, and describe a mathematical model to estimate human short-term spatial memory capacity. We found that short-term spatial memory capacity was greatest when an egocentric spatial frame of reference enabled subjects to encode and remember the image locations. However, when egocentric information was not reliable, short-term spatial memory capacity was greater and decision time shorter when an allocentric representation of the image locations with respect to distant objects in the surrounding environment was available, as compared to when only a spatial representation encoding the relationships between the individual images, independent of the surrounding environment, was available. Our findings thus further demonstrate that changes in viewpoint produced by the movement of images placed in front of a stationary subject are not equivalent to the movement of the subject around stationary images. We discuss possible limitations of classical neuropsychological and virtual reality experiments on spatial memory, which typically restrict the sensory information normally available to human subjects in the real world.
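Because the abstract does not specify the form of the estimation model, the sketch below gives only an illustrative simulation of the Memory (Concentration) game with a player whose spatial memory holds a fixed number of card locations; it shows how a capacity parameter relates to the number of turns needed, but it is not the model described in the study.

```python
# Minimal sketch: simulate the Concentration game with a player whose short-term spatial
# memory holds at most `capacity` previously seen card locations. Purely illustrative.
import random

def play(n_pairs: int, capacity: int, rng: random.Random) -> int:
    """Simulate one game and return the number of turns needed to clear the board."""
    cards = list(range(n_pairs)) * 2
    rng.shuffle(cards)
    remembered = {}                      # position -> card value, at most `capacity` entries
    remaining = set(range(len(cards)))
    turns = 0
    while remaining:
        turns += 1
        # if two remembered positions hold the same value, flip that pair
        seen, pair = {}, None
        for pos, val in remembered.items():
            if val in seen:
                pair = (seen[val], pos)
                break
            seen[val] = pos
        if pair is None:                 # otherwise flip two positions at random
            pair = tuple(rng.sample(sorted(remaining), 2))
        for pos in pair:                 # memorize what was seen, forgetting the oldest entry
            remembered[pos] = cards[pos]
            if len(remembered) > capacity:
                remembered.pop(next(iter(remembered)))
        if cards[pair[0]] == cards[pair[1]]:
            remaining -= set(pair)
            remembered.pop(pair[0], None)
            remembered.pop(pair[1], None)
    return turns

rng = random.Random(0)
for capacity in (2, 4, 8):
    mean_turns = sum(play(8, capacity, rng) for _ in range(200)) / 200
    print(f"capacity {capacity}: {mean_turns:.1f} turns on average")
```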
Abstract:
Parallel I/O is a research area of growing importance in High Performance Computing. Although for years it has been the bottleneck of parallel computers, today, owing to the large increase in computing power, the I/O problem has become even more pronounced, and the High Performance Computing community considers that the I/O system of parallel computers must be improved in order to meet the demands of the scientific applications that use HPC. The configuration of parallel input/output (I/O) has a strong influence on performance and availability; it is therefore important to analyse parallel I/O configurations in order to identify the key factors that influence the performance and availability of the I/O of scientific applications running on a cluster. To carry out this analysis of I/O configurations, we propose a methodology, consisting of three phases (Characterization, Configuration and Evaluation), that makes it possible to identify the I/O factors and evaluate their influence for different I/O configurations. The methodology allows the parallel computer to be analysed at the level of the scientific application, the I/O libraries and the I/O architecture, all from the point of view of I/O. The experiments carried out for different I/O configurations and the results obtained indicate the complexity of analysing the I/O factors and their different degrees of influence on the performance of the I/O system. Finally, future work is described: the design of a model to support the process of configuring the parallel I/O system for scientific applications. In addition, to identify and evaluate the I/O factors associated with data-level availability, we intend to use the RADIC fault-tolerant architecture.
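As an illustration of the kind of measurement that could feed the Characterization phase, the sketch below times sequential writes with different block sizes to estimate effective bandwidth on a single node. The sizes and temporary-file setting are arbitrary, and the sketch does not reproduce the parallel I/O stack (e.g., MPI-IO over a cluster file system) that the study actually targets.

```python
# Minimal sketch of an I/O characterization measurement: time sequential writes with
# different block sizes and report effective bandwidth. Sizes and the single-node,
# temporary-file setting are hypothetical simplifications.
import os
import time
import tempfile

total_bytes = 64 * 1024 * 1024          # 64 MiB per run
block_sizes = [4 * 1024, 64 * 1024, 1024 * 1024]

for block in block_sizes:
    data = os.urandom(block)
    with tempfile.NamedTemporaryFile(delete=True) as f:
        start = time.perf_counter()
        for _ in range(total_bytes // block):
            f.write(data)
        f.flush()
        os.fsync(f.fileno())            # force data to disk before stopping the clock
        elapsed = time.perf_counter() - start
    print(f"block {block // 1024:5d} KiB: {total_bytes / elapsed / 1e6:7.1f} MB/s")
```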