31 results for: software metrics, software estimation, embedded software, function points, lines of code
Abstract:
Postsynaptic density-95/disks large/zonula occludens-1 (PDZ) domains are relatively small (80-120 residues) protein binding modules central in the organization of receptor clusters and in the association of cellular proteins. Their main function is to bind C-terminals of selected proteins that are recognized through specific amino acids in their carboxyl end. Binding is associated with a deformation of the PDZ native structure and is responsible for dynamical changes in regions not in direct contact with the target. We investigate how this deformation is related to the harmonic dynamics of the PDZ structure and show that one low-frequency collective normal mode, characterized by the concerted movements of different secondary structures, is involved in the binding process. Our results suggest that even minimal structural changes are responsible for communication between distant regions of the protein, in agreement with recent NMR experiments. Thus, PDZ domains are a very clear example of how collective normal modes are able to characterize the relation between function and dynamics of proteins, and to provide indications on the precursors of binding/unbinding events.
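The slow collective modes discussed here come from a harmonic (elastic-network) analysis of the native structure. As an illustration only, the sketch below builds a toy Gaussian network model on synthetic helical coordinates (not the authors' actual PDZ calculation) and extracts the slowest collective mode by diagonalizing a contact Kirchhoff matrix:

```python
import numpy as np

def gnm_modes(coords, cutoff=7.0):
    """Gaussian network model: build the Kirchhoff (connectivity) matrix
    from residue coordinates and diagonalize it. The smallest nonzero
    eigenvalue corresponds to the slowest collective mode."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    kirchhoff = -(d < cutoff).astype(float)   # -1 for residues in contact
    np.fill_diagonal(kirchhoff, 0.0)
    np.fill_diagonal(kirchhoff, -kirchhoff.sum(axis=1))  # diagonal = degree
    vals, vecs = np.linalg.eigh(kirchhoff)    # eigenvalues in ascending order
    return vals, vecs

# toy "structure": 30 points on a helix (hypothetical, not a real PDZ domain)
t = np.linspace(0, 4 * np.pi, 30)
coords = np.stack([3 * np.cos(t), 3 * np.sin(t), 1.5 * t / np.pi], axis=1)
vals, vecs = gnm_modes(coords)
# vals[0] ~ 0 is the trivial uniform mode; vecs[:, 1] is the slowest
# collective mode, whose large-amplitude residues flag dynamically
# coupled regions of the structure
slow_mode = vecs[:, 1]
```

The per-residue amplitudes of `slow_mode` are the quantity one would compare against binding-induced deformations.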
Abstract:
ExPASy (http://www.expasy.org) has a worldwide reputation as one of the main bioinformatics resources for proteomics. It has now evolved into an extensible and integrative portal that provides access to many scientific resources, databases and software tools in different areas of the life sciences. Scientists can now seamlessly access a wide range of resources in many different domains, such as proteomics, genomics, phylogeny/evolution, systems biology, population genetics and transcriptomics. The individual resources (databases, web-based and downloadable software tools) are hosted in a 'decentralized' way by different groups of the SIB Swiss Institute of Bioinformatics and partner institutions. Specifically, a single web portal provides a common entry point to a wide range of resources developed and operated by different SIB groups and external institutions. The portal features a search function across 'selected' resources. Additionally, the availability and usage of resources are monitored. The portal is aimed at both expert users and people who are not familiar with a specific domain in the life sciences. The new web interface provides, in particular, visual guidance for newcomers to ExPASy.
Abstract:
A crucial method for investigating patients with coronary artery disease (CAD) is the calculation of the left ventricular ejection fraction (LVEF). It is therefore imperative to precisely estimate the value of LVEF, which can be done with myocardial perfusion scintigraphy. The present study aimed to establish and compare the estimation performance of the quantitative parameters of two reconstruction methods: filtered backprojection (FBP) and ordered-subset expectation maximization (OSEM). METHODS: A beating-heart phantom with known values of end-diastolic volume, end-systolic volume, and LVEF was used. Quantitative gated SPECT/quantitative perfusion SPECT software was used to obtain these quantitative parameters in a semiautomatic mode. The Butterworth filter was used in FBP, with cutoff frequencies between 0.2 and 0.8 cycles per pixel combined with orders of 5, 10, 15, and 20. Sixty-three reconstructions were performed using 2, 4, 6, 8, 10, 12, and 16 OSEM subsets, combined with several iterations: 2, 4, 6, 8, 10, 12, 16, 32, and 64. RESULTS: With FBP, the values of the end-diastolic, end-systolic, and stroke volumes rise as the cutoff frequency increases, whereas the value of LVEF diminishes. The same pattern is seen with the OSEM reconstruction. However, OSEM gives a more precise estimation of the quantitative parameters, especially with the combinations 2 iterations × 10 subsets and 2 iterations × 12 subsets. CONCLUSION: The OSEM reconstruction presents better estimations of the quantitative parameters than does FBP. This study recommends the use of 2 iterations with 10 or 12 subsets for OSEM, and a cutoff frequency of 0.5 cycles per pixel with orders 5, 10, or 15 for FBP, as the best estimations for left ventricular volume and ejection fraction quantification in myocardial perfusion scintigraphy.
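OSEM generalizes the maximum-likelihood expectation-maximization (ML-EM) algorithm by cycling the multiplicative update over subsets of the projections; with a single subset it reduces to plain ML-EM. A minimal numpy sketch of that update, on a toy random system matrix rather than real SPECT geometry:

```python
import numpy as np

def mlem(system_matrix, measured, n_iter=10):
    """ML-EM reconstruction (OSEM with a single subset): multiplicative
    updates that drive the forward projection towards the measured counts."""
    x = np.ones(system_matrix.shape[1])            # flat initial image
    sens = system_matrix.sum(axis=0)               # sensitivity image
    for _ in range(n_iter):
        proj = system_matrix @ x                   # forward projection
        ratio = np.where(proj > 0, measured / proj, 0.0)
        x *= (system_matrix.T @ ratio) / np.maximum(sens, 1e-12)
    return x

# toy problem: 40 detector bins, 8 voxels (hypothetical, not SPECT data)
rng = np.random.default_rng(0)
A = rng.random((40, 8))
truth = rng.random(8) + 0.5
y = A @ truth                 # noiseless "measured" projections
est = mlem(A, y, n_iter=200)
```

With subsets, the same update would be applied per subset of rows of `A`, which is what makes OSEM converge in far fewer full iterations than ML-EM.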
Abstract:
BACKGROUND: There is growing evidence that informal payments for health care are fairly common in many low- and middle-income countries. Informal payments are reported to have a negative consequence on equity and quality of care; it has been suggested, however, that they may contribute to health worker motivation and retention. Given the significance of motivation and retention issues in human resources for health, a better understanding of the relationships between the two phenomena is needed. This study attempts to assess whether and in what ways informal payments occur in Kibaha, Tanzania. Moreover, it aims to assess how informal earnings might help boost health worker motivation and retention. METHODS: Nine focus groups were conducted in three health facilities of different levels in the health system. In total, 64 health workers participated in the focus group discussions (81% female, 19% male) and, where possible, focus groups were divided by cadre. All data were processed and analysed by means of the NVivo software package. RESULTS: This study confirmed the use of informal payments in the study area. Furthermore, the findings suggest a negative relationship between informal payments and job satisfaction and motivation. Participants mentioned that they felt enslaved by patients as a result of being bribed, and this resulted in a loss of self-esteem. Furthermore, fear of detection was a main demotivating factor. These factors seem to counterbalance the positive effect of financial incentives. Moreover, informal payments were not found to be related to the retention of health workers in the public health system. Other factors, such as job security, seemed to be more relevant for retention. CONCLUSION: This study suggests that the practice of informal payments contributes to the general demotivation of health workers and negatively affects access to health care services and the quality of the health system.
Policy action is needed that not only provides better financial incentives for individuals but also tackles an environment in which corruption is endemic.
Abstract:
AIMS: Although the coronary artery vessel wall can be imaged non-invasively using magnetic resonance imaging (MRI), the in vivo reproducibility of wall thickness measures has not been previously investigated. Using a refined magnetization preparation scheme, we sought to assess the reproducibility of three-dimensional (3D) free-breathing black-blood coronary MRI in vivo. METHODS AND RESULTS: MRI vessel wall scans parallel to the right coronary artery (RCA) were obtained in 18 healthy individuals (age range 25-43, six women), with no known history of coronary artery disease, using a 3D dual-inversion navigator-gated black-blood spiral imaging sequence. Vessel wall scans were repeated 1 month later in eight subjects. The visible vessel wall segment and the wall thickness were quantitatively assessed using a semi-automatic tool and the intra-observer, inter-observer, and inter-scan reproducibilities were determined. The average imaged length of the RCA vessel wall was 44.5 ± 7 mm. The average wall thickness was 1.6 ± 0.2 mm. There was a highly significant intra-observer (r=0.97), inter-observer (r=0.94), and inter-scan (r=0.90) correlation for wall thickness (all P<0.001). There was also a significant agreement for intra-observer, inter-observer, and inter-scan measurements on Bland-Altman analysis. The intra-class correlation coefficients for intra-observer (r=0.97), inter-observer (r=0.92), and inter-scan (r=0.86) analyses were also excellent. CONCLUSION: The use of black-blood free-breathing 3D MRI in conjunction with semi-automated analysis software allows for reproducible measurements of right coronary arterial vessel-wall thickness. This technique may be well-suited for non-invasive longitudinal studies of coronary atherosclerosis.
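The agreement statistics reported here (Bland-Altman analysis) can be computed directly from paired measurements: the mean of the differences gives the bias, and ±1.96 standard deviations give the 95% limits of agreement. The sketch below uses made-up wall-thickness pairs, not the study data:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman agreement between two sets of paired measurements:
    returns the mean bias and the 95% limits of agreement."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)   # sample SD of the differences
    return bias, (bias - half_width, bias + half_width)

# hypothetical paired wall-thickness readings in mm (two observers)
obs1 = [1.58, 1.62, 1.55, 1.71, 1.49, 1.66, 1.60, 1.63]
obs2 = [1.60, 1.59, 1.57, 1.69, 1.52, 1.64, 1.58, 1.65]
bias, (lo, hi) = bland_altman(obs1, obs2)
```

Good agreement means a bias near zero and limits of agreement narrow relative to the measured quantity.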
Abstract:
INTRODUCTION: Partial splenectomy in children is a good surgical option for hematological diseases and focal splenic tumors because it allows the preservation of the spleen's immunological function. Furthermore, it can be performed by laparoscopy in children, as it is a safe procedure offering the benefits of a minimally invasive approach. MATERIALS AND METHODS: The VR-render LE version 0.81 software enables the two-dimensional visualization of 3D images with magnification of anatomical details. We applied this system to five cases of non-parasitic splenic cysts before laparoscopic partial splenectomy. RESULTS: The images obtained with the VR rendering software permitted the preoperative reconstruction of the vascularization of the splenic hilum, allowing the surgeon safe vessel control during the laparoscopic procedures. All five partial splenectomies were carried out with no complications or major blood loss. CONCLUSIONS: Laparoscopic partial splenectomy should be a first-choice procedure because it is feasible, reproducible, and safe for children; furthermore, it preserves enough splenic tissue, thereby preventing post-splenectomy infections. Volume rendering provides high anatomical resolution and can be useful in guiding the surgical procedure.
Abstract:
Objective: To implement a carotid-sparing protocol using helical tomotherapy (HT) in T1N0 squamous cell laryngeal carcinoma. Materials/Methods: Between July and August 2010, 7 men with stage T1N0 laryngeal carcinoma were included in this study. Age ranged from 47 to 74 years. Staging included endoscopic examination, CT scan, and MRI when indicated. The planned irradiation dose was 70 Gy in 35 fractions over 7 weeks. A simple treatment planning algorithm for carotid sparing was used: maximum point dose to the carotids 35 Gy, to the spinal cord 30 Gy, and 100% of the PTV volume to be covered with 95% of the prescribed dose. The carotid volume of interest extended to 1 cm above and below the PTV. Doses to the carotid arteries, critical organs, and planned target volume (PTV) were compared with our standard laryngeal irradiation protocol. Daily megavoltage scans were obtained before each fraction. When necessary, the Planned Adaptive software (TomoTherapy Inc., Madison, WI) was used to evaluate the need for re-planning, which was never indicated. Dose data were extracted using the VelocityAI software (Atlanta, GA), and data normalization and dose-volume histogram (DVH) interpolation were performed using the Igor Pro software (Portland, OR). Results: A significant (p < 0.05) carotid dose sparing compared to our standard protocol was achieved, with an average maximum point dose of 38.3 Gy (standard deviation [SD] 4.05 Gy) and an average mean dose of 18.59 Gy (SD 0.83 Gy). In all patients, 95% of the carotid volume received less than 28.4 Gy (SD 0.98 Gy). The average maximum point dose to the spinal cord was 25.8 Gy (SD 3.24 Gy). The PTV was fully covered with more than 95% of the prescribed dose for all patients, with an average maximum point dose of 74.1 Gy and an absolute maximum dose in a single patient of 75.2 Gy. To date, the clinical outcomes have been excellent.
Three patients (42%) developed stage 1 mucositis that was conservatively managed, and all patients presented a mild to moderate dysphonia. All adverse effects resolved spontaneously in the month following the end of treatment. The early local control rate is 100% at 4-5 months of post-treatment follow-up. Conclusions: HT allows a clinically significant decrease of the carotid irradiation dose compared to standard irradiation protocols, with an acceptable spinal cord dose trade-off. Moreover, this technique allows the PTV to be homogeneously covered with a curative irradiation dose. Daily control imaging brings added security margins, especially when working with high dose gradients. Further investigations and follow-up are underway to better evaluate the late clinical outcomes, especially the local control rate, late laryngeal and vascular toxicity, and the expected potential impact on cerebrovascular events.
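Dose statements like "95% of the carotid volume received less than 28.4 Gy" are read off a cumulative dose-volume histogram (DVH). A minimal sketch with hypothetical dose samples (not the study's data):

```python
import numpy as np

def cumulative_dvh(dose, levels):
    """Cumulative DVH: fraction of the structure's volume (here, of its
    dose samples) receiving at least each dose level."""
    dose = np.asarray(dose, float)
    return np.array([(dose >= d).mean() for d in levels])

# hypothetical carotid dose samples (Gy) and evaluation levels
carotid_dose = np.array([12.0, 15.5, 18.0, 22.3, 27.9, 30.1, 33.4, 35.0])
levels = np.array([0.0, 20.0, 28.4, 35.0])
frac = cumulative_dvh(carotid_dose, levels)   # monotonically non-increasing
```

In a treatment planning system the same curve is computed over every voxel of the contoured structure, weighted by voxel volume.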
Abstract:
Aim: We investigate the population genetic structure of the Maghrebian bat, Myotis punicus, between the mainland and islands to assess the island colonization pattern and current gene flow between nearby islands and within the mainland. Location: North Africa and the Mediterranean islands of Corsica and Sardinia. Methods: We sequenced part of the control region (HVII) of 79 bats across 11 colonies. The phylogeographical pattern was assessed by analysing molecular diversity indices, examining differentiation among populations and estimating divergence time. In addition, we genotyped 182 bats across 10 colonies at seven microsatellite loci. We used analysis of molecular variance and a Bayesian approach to infer nuclear population structure. Finally, we estimated sex-specific dispersal between Corsica and Sardinia. Results: Mitochondrial analyses indicated that colonies between Corsica, Sardinia and North Africa are highly differentiated. Within islands there was no difference between colonies, while at the continental level Moroccan and Tunisian populations were highly differentiated. Analyses with seven microsatellite loci showed a similar pattern. The sole difference was the lack of nuclear differentiation between populations in North Africa, suggesting a male-biased dispersal over the continental area. The divergence time of Sardinian and Corsican populations was estimated to date back to the early and mid-Pleistocene. Main conclusions: Island colonization by the Maghrebian bats seems to have occurred in a stepping-stone manner and certainly pre-dated human colonization. Currently, open water seems to prevent the exchange of bats between the two islands, despite their ability to fly and the narrowness of the Strait of Bonifacio. Corsican and Sardinian populations are thus currently isolated from any continental gene pool and must therefore be considered as different evolutionarily significant units (ESU).
Abstract:
PURPOSE: To determine the characteristics specific to boys with disordered eating behaviors (DEB) and the general context in which these DEB occur. METHOD: Data were drawn from the SMASH02 database, a survey carried out in 2002 among post-mandatory school students in Switzerland aged 16-20 years. Only males (N=3890) were included, and they were classified into one of four groups based on their level of concern about weight/food and on their eating behaviors, as follows: group 1: one concern without behavior (N=862); group 2: more than one concern without behavior (N=361); group 3: at least one behavior (N=798); and a control group (N=1869), according to previously validated items. Groups were compared for personal, family, school, experience of violence, and health-compromising behavior variables at the bivariate level. All significant variables were included in a multinomial logistic regression using Stata 9 software. RESULTS: About one-half of the boys reported either a concern or an unhealthy eating behavior. Compared with the control group, boys from the three groups were more likely to be students and to report a history of sexual abuse, delinquency, depression, and feeling fat. In addition, boys from group 3 were more likely to report a history of dieting, early puberty, peer teasing, having experienced violence, frequent inebriation, and being overweight. CONCLUSION: DEB affect adolescent males more frequently than previously thought and seem to be embedded in a generally dysfunctional context in which violence is predominant. Adolescent males also need to be screened for DEB. Moreover, prevention programs should target the increasing social and media pressure regarding boys' ideal body shape and raise public consciousness about this phenomenon.
Abstract:
Motivation: Hormone pathway interactions are crucial in shaping plant development, such as synergism between the auxin and brassinosteroid pathways in cell elongation. Both hormone pathways have been characterized in detail, revealing several feedback loops. The complexity of this network, combined with a shortage of kinetic data, renders its quantitative analysis virtually impossible at present. Results: As a first step towards overcoming these obstacles, we analyzed the network using a Boolean logic approach to build models of auxin and brassinosteroid signaling, and their interaction. To compare these discrete dynamic models across conditions, we transformed them into qualitative continuous systems, which predict network component states more accurately and can accommodate kinetic data as they become available. To this end, we developed an extension for the SQUAD software, allowing semi-quantitative analysis of network states. Contrasting the developmental output depending on cell type-specific modulators enabled us to identify a most parsimonious model, which explains initially paradoxical mutant phenotypes and revealed a novel physiological feature.
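In a Boolean-logic model of this kind, each network component is ON or OFF and is updated by a logical rule built from its regulators; the attractors of the update map correspond to stable signaling states. The toy three-node network below is a hypothetical illustration of the formalism, not the published auxin/brassinosteroid model:

```python
from itertools import product

def step(state):
    """One synchronous update of a toy 3-node Boolean network.
    Rules are hypothetical stand-ins: two hormone inputs are held
    constant, and 'elongation' turns on if either input is on
    (simple OR logic; the real model's synergy is richer)."""
    auxin, br, elong = state
    return (
        auxin,           # auxin input held constant
        br,              # brassinosteroid input held constant
        auxin or br,     # elongation output
    )

def attractor(state, max_steps=10):
    """Iterate synchronous updates until a fixed point is reached."""
    for _ in range(max_steps):
        nxt = step(state)
        if nxt == state:
            return state
        state = nxt
    return state

# enumerate the attractor reached from every initial state
fixed_points = {attractor(s) for s in product([False, True], repeat=3)}
```

The qualitative-continuous transformation mentioned in the abstract replaces these ON/OFF states with values in [0, 1] governed by ordinary differential equations, so the same wiring can later absorb kinetic data.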
Abstract:
Palinspastic reconstructions provide an ideal framework for geological, geographical, oceanographic and climate studies. As historians of the Earth, "reconstructers" try to decipher its past. Since learning that continents move, geologists have been trying to retrace their evolution through the ages. If Wegener's original idea was revolutionary at the beginning of the last century, we have known since the early 1960s that continents do not "drift" aimlessly in the middle of the oceans but belong to a larger ensemble combining continental and oceanic crust: the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea still does not receive sufficient echo within the community of reconstructers. Nevertheless, we are deeply convinced that, by applying certain methods and principles, it is possible to escape the traditional "Wegenerian" approach and finally move towards true plate tectonics. The main aim of the present work is to set out, with all the necessary details, our tools and methods. Starting from the paleomagnetic and paleogeographic data classically used for reconstructions, we developed a new methodology that places the tectonic plates and their kinematics at the heart of the problem. Using continental assemblies (also called "key assemblies") as anchor points distributed over the whole span of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, moving from the past towards the present. Between two stages, the lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to the continents. Except during collisions, plates are moved as single rigid entities.
Through the ages, the only evolving elements are the plate boundaries. They are preserved over time and follow a consistent geodynamic evolution while always forming an interconnected network in space. This approach, called "dynamic plate boundaries", integrates multiple factors, including plate buoyancy, spreading rates at ridges, subsidence curves, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. The method thus offers good control over plate kinematics and provides strong constraints on the model. This multi-source approach requires efficient data organization and management. Before this study began, the mass of necessary data had become an obstacle that was difficult to surmount. GIS (geographic information systems) and geodatabases are software tools specifically devoted to the management, storage and analysis of spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we were able to convert this mass of scattered data into valuable geodynamic information, easily accessible for the creation of reconstructions. At the same time, with specially developed tools, we both facilitated the reconstruction work (automated tasks) and improved the model by greatly strengthening the kinematic control through the creation of plate velocity models. On the basis of 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. With this unique dataset we can now tackle major issues of modern geology, such as the study of sea-level variations and climate change.
We began by addressing another major (and not definitively resolved!) problem of modern tectonics: the mechanisms controlling plate motions. We observed that, throughout the Earth's history, the rotation poles of the plates (describing plate motions on the Earth's surface) tend to be distributed along a band running from the North Pacific through northern South America, the Central Atlantic, North Africa and Central Asia to Japan. Essentially, this distribution means that the plates tend to flee this median plane. Barring an unidentified methodological bias, we interpreted this phenomenon as reflecting the secular influence of the Moon on plate motion. The oceanic domain is the keystone of our model, and we took particular care to reconstruct it in great detail. In this model, the oceanic crust is preserved from one reconstruction to the next. The crustal material is symbolized by synthetic isochrons whose ages are known. We also reconstructed the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed oceanic dataset, we were able to develop unique 3-D bathymetric models offering far better precision than all previously existing ones.
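Reconstructions of this kind move rigid plates on the sphere by finite rotations about Euler poles. A minimal sketch of one such rotation using Rodrigues' formula (the pole location and rotation angle below are arbitrary examples, not values from this model):

```python
import numpy as np

def to_xyz(lon, lat):
    """Convert (lon, lat) in degrees to a unit vector on the sphere."""
    lon, lat = np.radians(lon), np.radians(lat)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def rotate_about_pole(point, pole, angle_deg):
    """Rotate a (lon, lat) point about an Euler pole by angle_deg,
    using Rodrigues' rotation formula on unit vectors."""
    p, k = to_xyz(*point), to_xyz(*pole)
    th = np.radians(angle_deg)
    v = (p * np.cos(th)
         + np.cross(k, p) * np.sin(th)
         + k * np.dot(k, p) * (1 - np.cos(th)))
    lat = np.degrees(np.arcsin(v[2]))
    lon = np.degrees(np.arctan2(v[1], v[0]))
    return lon, lat

# example: rotate a point on the equator 90 degrees about the north pole
lon, lat = rotate_about_pole((0.0, 0.0), (0.0, 90.0), 90.0)
```

A full reconstruction chains such rotations (one stage pole per plate per time interval), which is why the pole distributions discussed above can be analysed statistically.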
Abstract:
The present article examines the meaning and function of olfactory remnants, often repugnant, linked to demons in the context of late medieval witchcraft and demonology. This reflection is developed within the framework of a «make believe» logic sustained by the doctrinal, theological, narrative and judiciary constructions of the witches' Sabbath. Incorporated within the order of sensory perception, references to the fetid smell of demons - who are by nature devoid of odour because they are pure spirits - constitute further proofs bearing witness to demonic presence, and thus testifying to the ignominy of the crime of witchcraft and to the guiltiness of the accused. According to those who attacked demon worshippers, the devil truly revealed himself physically; human beings were able to touch, hear, see and smell him. Sensory faculties were therefore perceived as being instrumental in corroborating the existence and reality of the Sabbath and the presence of the devil in bodily form. These considerations bring us to examine the olfactory fields associated with the devil's odour: odour of corpses, hell, sin, deviance, but also of defilement, impurity, corruption and excrements. These fetid odours are embedded in a logic of moral, spiritual and religious inversion of positive odours, such as the «sweet fragrance» of the saints, the «pure odour» of Christ or the «soft perfume» of virtue.
Abstract:
OBJECTIVES: The aims of the study were to use cone beam computed tomography (CBCT) images of nasopalatine duct cysts (NPDC) and to calculate the diameter, surface area, and 3D volume using a custom-made software program. Furthermore, any associations of the dimensions of NPDC with age, gender, presence/absence of maxillary incisors/canines (MI/MC), endodontic treatment of MI/MC, presenting symptoms, and postoperative complications were evaluated. MATERIAL AND METHODS: The study comprised 40 patients with a histopathologically confirmed NPDC. On preoperative CBCT scans, curves delineating the cystic borders were drawn in all planes and the widest diameter (in mm), surface area (in mm²), and volume (in mm³) were calculated. RESULTS: The overall mean cyst diameter was 15 mm (range 7-47 mm), the mean cyst surface area 566 mm² (84-4,516 mm²), and the mean cyst volume 1,735 mm³ (65-25,350 mm³). For 22 randomly allocated cases, a second measurement resulted in a mean absolute aberration of ±4.2% for the volume, ±2.8% for the surface area, and ±4.9% for the diameter. A statistically significant association was found between the CBCT-determined cyst measurements and the need for preoperative endodontic treatment of MI/MC, and for postoperative complications. CONCLUSION: In the hands of a single experienced operator, the novel software exhibited high repeatability for measurements of cyst dimensions. Further studies are needed to assess the application of this tool for the dimensional analysis of different jaw cysts and lesions, including treatment planning. CLINICAL RELEVANCE: Accurate radiographic information on the bone volume lost (osteolysis) due to the expansion of a cystic lesion in three dimensions could help in personalized treatment planning.
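In the simplest voxel-based view, the volume of a segmented lesion is the voxel count times the voxel volume, and the widest diameter is the distance between the two most distant segmented points. The toy sketch below illustrates this on a synthetic cubic "cyst" (it is not the custom software used in the study, which works on drawn boundary curves):

```python
import numpy as np

def lesion_metrics(mask, voxel_mm=(0.3, 0.3, 0.3)):
    """Toy voxel-based measurements: volume (mm^3) from the voxel count,
    and maximum diameter (mm) as the largest pairwise distance between
    segmented voxels (brute force; fine for small masks)."""
    voxel_vol = float(np.prod(voxel_mm))
    volume = mask.sum() * voxel_vol
    pts = np.argwhere(mask) * np.array(voxel_mm)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return volume, d.max()

# synthetic segmentation: a 10x10x10-voxel cube at 0.3 mm isotropic voxels
mask = np.zeros((20, 20, 20), dtype=bool)
mask[5:15, 5:15, 5:15] = True
vol, diam = lesion_metrics(mask)   # vol = 27 mm^3; diam = body diagonal
```

Surface area is harder: it is usually estimated from a triangulated mesh of the segmentation boundary (e.g. marching cubes) rather than from raw voxel faces, which overestimate curved surfaces.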
Abstract:
The quality of sample inoculation is critical for achieving an optimal yield of discrete colonies in both monomicrobial and polymicrobial samples to perform identification and antibiotic susceptibility testing. Consequently, we compared the performance between the InoqulA (BD Kiestra), the WASP (Copan), and manual inoculation methods. Defined mono- and polymicrobial samples of 4 bacterial species and cloudy urine specimens were inoculated on chromogenic agar by the InoqulA, the WASP, and manual methods. Images taken with ImagA (BD Kiestra) were analyzed with the VisionLab version 3.43 image analysis software to assess the quality of growth and to prevent subjective interpretation of the data. A 3- to 10-fold higher yield of discrete colonies was observed following automated inoculation with both the InoqulA and WASP systems than that with manual inoculation. The difference in performance between automated and manual inoculation was mainly observed at concentrations of >10⁶ bacteria/ml. Inoculation with the InoqulA system allowed us to obtain significantly more discrete colonies than the WASP system at concentrations of >10⁷ bacteria/ml. However, the level of difference observed was bacterial species dependent. Discrete colonies of bacteria present in 100- to 1,000-fold lower concentrations than the most concentrated populations in defined polymicrobial samples were not reproducibly recovered, even with the automated systems. The analysis of cloudy urine specimens showed that InoqulA inoculation provided a statistically significantly higher number of discrete colonies than that with WASP and manual inoculation. Consequently, the automated InoqulA inoculation greatly decreased the requirement for bacterial subculture and thus resulted in a significant reduction in the time to results, laboratory workload, and laboratory costs.
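Counting discrete colonies on an agar image is essentially connected-component labeling of the thresholded image. A minimal sketch on a synthetic binary image (the actual VisionLab analysis is, of course, more sophisticated than this toy 4-connected flood fill):

```python
import numpy as np
from collections import deque

def count_colonies(img, threshold=0.5):
    """Count discrete 'colonies' as 4-connected components of pixels
    above the intensity threshold (breadth-first flood fill)."""
    mask = img > threshold
    seen = np.zeros_like(mask, dtype=bool)
    n = 0
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        n += 1                       # new, unvisited component
        q = deque([(i, j)])
        seen[i, j] = True
        while q:
            a, b = q.popleft()
            for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                x, y = a + da, b + db
                if (0 <= x < mask.shape[0] and 0 <= y < mask.shape[1]
                        and mask[x, y] and not seen[x, y]):
                    seen[x, y] = True
                    q.append((x, y))
    return n

# synthetic plate image with three separated "colonies"
img = np.zeros((30, 30))
img[2:5, 2:5] = 1.0
img[10:14, 20:24] = 1.0
img[25, 25] = 1.0
```

Real colony counting additionally needs illumination correction and the separation of touching colonies (e.g. by watershed), which is where a higher yield of well-separated colonies directly pays off.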