57 results for blended workflow
Abstract:
Recognized scholars and practitioners examine modern forms and perspectives of human resource development. Questions on performance management, feedback systems, coaching, mentoring, e-learning and blended learning, alternative career models, and many other aspects of modern human resource development are answered in a well-founded and practice-oriented manner.
Abstract:
The generic approach of the Spine Tango documentation system, which uses web-based technologies, is a necessity for reaching a maximum number of participants. This, in turn, reduces the potential for customising the Tango according to the individual needs of each user. However, a number of possibilities still exist for tailoring the data collection processes to the user's own hospital workflow. One can choose between a purely paper-based set-up (with in-house scanning, data punching or mailing of forms to the data centre at the University of Bern) and completely paper-free online data entry. Many users work in a hybrid mode, with online entry of surgical data and paper-based recording of the patients' perspectives using the Core Outcome Measures Index (COMI) questionnaires. Preoperatively, patients can complete their questionnaires in the outpatient clinic when the decision about surgery is taken, or simply at the time of hospitalisation. Postoperatively, patient data can be collected by questionnaire completion in the outpatient clinic, by handing over the forms at discharge for later mailing back to the hospital, by sending out questionnaires by post with a stamped addressed envelope for their return or, in exceptional circumstances, by telephone interview. Eurospine encourages documentation of patient-based information before the hospitalisation period and surgeon-based information both before and during hospitalisation; both patient and surgeon data should be acquired for at least one follow-up, at a minimum of three to six months after surgery. In addition, all complications that occur after discharge, and their consequences, should be recorded.
Abstract:
We present an optimized multilocus sequence typing (MLST) scheme with universal primer sets for amplifying and sequencing the seven target genes of Campylobacter jejuni and Campylobacter coli. Typing was expanded by sequence determination of the genes flaA and flaB using optimized primer sets. This approach is compatible with the MLST and flaA schemes used in the PubMLST database and results in an additional typing method using the flaB gene sequence. An identification module based on the 16S rRNA and rpoB genes was included, as well as the genetic determination of macrolide and quinolone resistances based on mutations in the 23S rRNA and gyrA genes. Experimental procedures were simplified by multiplex PCR of the 13 target genes. This comprehensive approach was evaluated with C. jejuni and C. coli isolates collected in Switzerland. MLST of 329 strains resulted in 72 sequence types (STs) among the 186 C. jejuni strains and 39 STs for the 143 C. coli isolates. Fourteen (19%) of the C. jejuni and 20 (51%) of the C. coli STs had not been found previously. In total, 35% of the C. coli strains collected in Switzerland contained mutations conferring antibiotic resistance only to quinolone, 15% contained mutations conferring resistance only to macrolides, and 6% contained mutations conferring resistance to both classes of antibiotics. In C. jejuni, these values were 31% and 0% for quinolone and macrolide resistance, respectively. The rpoB sequence allowed phylogenetic differentiation between C. coli and C. jejuni, which was not possible by 16S rRNA gene analysis. An online Integrated Database Network System (SmartGene, Zug, Switzerland)-based platform for MLST data analysis specific to Campylobacter was implemented. This Web-based platform allowed automated allele and ST designation, as well as epidemiological analysis of data, thus streamlining and facilitating the analysis workflow. Data networking facilitates the exchange of information between collaborating centers. The described approach simplifies and improves the genotyping of Campylobacter, allowing cost- and time-efficient routine monitoring.
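To illustrate the allele and sequence type (ST) designation step that the Web-based platform automates, the following Python sketch maps locus sequences to allele numbers and the resulting allele profile to an ST. The lookup tables, the two loci shown, and the designate function are hypothetical placeholders for illustration only; they are not the SmartGene or PubMLST implementation, which covers all seven MLST loci plus flaA/flaB.

    # Minimal sketch of automated allele and ST designation from locus
    # sequences; all data below are hypothetical placeholders.

    ALLELES = {                      # locus -> {sequence: allele number}
        "aspA": {"ATGC": 1, "ATGA": 2},
        "glnA": {"GGCC": 1},
        # ... the remaining MLST loci would be listed here
    }

    ST_TABLE = {                     # allele profile (locus-sorted) -> ST
        (1, 1): 21,                  # placeholder profile and ST
    }

    def designate(locus_seqs):
        """Assign allele numbers per locus, then look up the ST of the profile."""
        profile = []
        for locus in sorted(locus_seqs):
            allele = ALLELES.get(locus, {}).get(locus_seqs[locus])
            if allele is None:
                return None, None    # novel allele: would need curation/submission
            profile.append(allele)
        return tuple(profile), ST_TABLE.get(tuple(profile))  # None = novel ST

    print(designate({"aspA": "ATGC", "glnA": "GGCC"}))       # -> ((1, 1), 21)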
Abstract:
Quantitative data obtained by means of design-based stereology can add valuable information to studies performed on a diversity of organs, in particular when correlated with functional/physiological and biochemical data. Design-based stereology is based on a sound statistical background and can be used to generate accurate data that are in line with the principles of good laboratory practice. In addition, by adjusting the study design, an appropriate precision can be achieved to detect relevant differences between groups. For the success of the stereological assessment, detailed planning is necessary. In this review we focus on common pitfalls encountered during stereological assessment. An exemplary workflow is included and, based on authentic examples, we illustrate a number of sampling principles that can be implemented to obtain properly sampled tissue blocks for various purposes.
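One sampling principle widely used in design-based stereology to obtain unbiased tissue samples is systematic uniform random sampling. The short Python sketch below shows the idea with hypothetical slab numbers; it illustrates the general principle rather than the specific workflow of the review.

    import random

    def surs(items, sampling_fraction):
        """Systematic uniform random sampling: take every k-th item after a
        uniformly random start, keeping roughly `sampling_fraction` of them."""
        k = max(1, round(1 / sampling_fraction))
        start = random.randrange(k)          # random start in [0, k)
        return items[start::k]

    slabs = list(range(1, 31))               # hypothetical serial tissue slabs
    print(surs(slabs, 1 / 5))                # ~1 in 5 slabs, unbiased selection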
Abstract:
HYPOTHESIS A previously developed image-guided robot system can safely drill a tunnel from the lateral mastoid surface, through the facial recess, to the middle ear, as a viable alternative to conventional mastoidectomy for cochlear electrode insertion. BACKGROUND Direct cochlear access (DCA) provides a minimally invasive tunnel from the lateral surface of the mastoid through the facial recess to the middle ear for cochlear electrode insertion. A safe and effective tunnel drilled through the narrow facial recess requires a highly accurate image-guided surgical system. Previous attempts have relied on patient-specific templates and robotic systems to guide drilling tools. In this study, we report on improvements made to an image-guided surgical robot system developed specifically for this purpose and the resulting accuracy achieved in vitro. MATERIALS AND METHODS The proposed image-guided robotic DCA procedure was carried out bilaterally on 4 whole head cadaver specimens. Specimens were implanted with titanium fiducial markers and imaged with cone-beam CT. A preoperative plan was created using a custom software package wherein relevant anatomical structures of the facial recess were segmented, and a drill trajectory targeting the round window was defined. Patient-to-image registration was performed with the custom robot system to reference the preoperative plan, and the DCA tunnel was drilled in 3 stages with progressively longer drill bits. The position of the drilled tunnel was defined as a line fitted to a point cloud of the segmented tunnel using principal component analysis (PCA function in MATLAB). The accuracy of the DCA was then assessed by coregistering preoperative and postoperative image data and measuring the deviation of the drilled tunnel from the plan. The final step of electrode insertion was also performed through the DCA tunnel after manual removal of the promontory through the external auditory canal. RESULTS Drilling error was defined as the lateral deviation of the tool in the plane perpendicular to the drill axis (excluding depth error). Errors of 0.08 ± 0.05 mm and 0.15 ± 0.08 mm were measured on the lateral mastoid surface and at the target on the round window, respectively (n = 8). Full electrode insertion was possible for 7 cases. In 1 case, the electrode was partially inserted with 1 contact pair external to the cochlea. CONCLUSION The purpose-built robot system was able to perform a safe and reliable DCA for cochlear implantation. The workflow implemented in this study mimics the envisioned clinical procedure, showing the feasibility of future clinical implementation.
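The line-fitting and error measurement described above can be sketched in Python/NumPy as follows. The abstract reports using MATLAB's PCA function; the SVD used here is an equivalent way to obtain the first principal direction, and the point cloud and planned trajectory below are hypothetical values for illustration.

    import numpy as np

    def fit_axis(points):
        """Fit a line to a 3-D point cloud: centroid plus first principal
        direction (unit vector), obtained here via SVD."""
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)
        return centroid, vt[0]

    def lateral_deviation(planned_point, planned_dir, actual_point):
        """Deviation measured in the plane perpendicular to the drill axis,
        i.e. excluding the depth component."""
        d = planned_dir / np.linalg.norm(planned_dir)
        v = actual_point - planned_point
        return np.linalg.norm(v - np.dot(v, d) * d)

    # Hypothetical segmented-tunnel voxel centres (mm) and planned trajectory
    tunnel = np.array([[0.1, 0.0, z] for z in np.linspace(0.0, 30.0, 50)])
    centroid, axis = fit_axis(tunnel)
    print(lateral_deviation(np.zeros(3), np.array([0.0, 0.0, 1.0]), centroid))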
Abstract:
HYPOTHESIS Facial nerve monitoring can be used synchronously with a high-precision robotic tool as a functional warning to prevent a collision of the drill bit with the facial nerve during direct cochlear access (DCA). BACKGROUND Minimally invasive direct cochlear access (DCA) aims to eliminate the need for a mastoidectomy by drilling a small tunnel through the facial recess to the cochlea with the aid of stereotactic tool guidance. Because the procedure is performed in a blind manner, structures such as the facial nerve are at risk. Neuromonitoring is a commonly used tool to help surgeons identify the facial nerve (FN) during routine surgical procedures in the mastoid. Recently, neuromonitoring technology was integrated into a commercially available drill system enabling real-time monitoring of the FN. The objective of this study was to determine if this drilling system could be used to warn of an impending collision with the FN during robot-assisted DCA. MATERIALS AND METHODS The sheep was chosen as a suitable model for this study because of its similarity to the human ear anatomy. The same surgical workflow applicable to human patients was performed in the animal model. Bone screws, serving as reference fiducials, were placed in the skull near the ear canal. The sheep head was imaged using a computed tomographic scanner, and segmentation of the FN, mastoid, and other relevant structures, as well as planning of drilling trajectories, was carried out using a dedicated software tool. During the actual procedure, a surgical drill system was connected to a nerve monitor and guided by a custom-built robot system. As the planned trajectories were drilled, stimulation and EMG response signals were recorded. A postoperative analysis was performed after each surgery to determine the actual drilled positions. RESULTS Using the calibrated pose synchronized with the EMG signals, the precise relationship between distance to the FN and EMG at 3 different stimulation intensities could be determined for 11 different tunnels drilled in 3 different subjects. CONCLUSION From the results, it was determined that the current implementation of the neuromonitoring system lacks the sensitivity and repeatability necessary to be used as a warning device in robotic DCA. We hypothesize that this is primarily because of the stimulation pattern achieved using a noninsulated drill as a stimulating probe. Further work is necessary to determine whether specific changes to the design can improve the sensitivity and specificity.
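The distance-versus-EMG relationship described above can be reconstructed offline by pairing each calibrated drill-tip position with its synchronized EMG sample and the minimum distance to the segmented facial nerve. The Python sketch below illustrates that pairing with hypothetical coordinates and amplitudes; it is not the study's analysis code.

    import numpy as np

    def distance_to_nerve(tip_positions, nerve_points):
        """Minimum Euclidean distance (mm) from each drill-tip position to the
        segmented facial-nerve point cloud."""
        diffs = tip_positions[:, None, :] - nerve_points[None, :, :]
        return np.sqrt((diffs ** 2).sum(axis=2)).min(axis=1)

    # Hypothetical synchronized samples: tip poses, FN surface points, EMG amplitudes
    tips = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 10.0, 5)])
    nerve = np.array([[2.0, 0.0, 8.0], [2.0, 0.5, 9.0]])
    emg_uV = np.array([5.0, 6.0, 9.0, 40.0, 120.0])

    for dist, amp in zip(distance_to_nerve(tips, nerve), emg_uV):
        print(f"distance to FN: {dist:5.2f} mm   EMG: {amp:6.1f} uV")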
Abstract:
Numerical simulation experiments give insight into the evolving energy partitioning during high-strain torsion experiments of calcite. Our numerical experiments are designed to derive a generic macroscopic grain-size-sensitive flow law capable of describing the full evolution from the transient regime to steady state. The transient regime is crucial for understanding the importance of microstructural processes that may lead to strain localization phenomena in deforming materials. This is particularly important in geological and geodynamic applications, where the phenomenon of strain localization happens outside the time frame that can be observed under controlled laboratory conditions. Our method is based on an extension of the paleowattmeter approach to the transient regime. We add an empirical hardening law using the Ramberg-Osgood approximation and assess the experiments with an evolution test function of stored over dissipated energy (lambda factor). Parameter studies of strain hardening, the dislocation creep parameter, strain rates, temperature, and the lambda factor, as well as mesh sensitivity, are presented to explore the sensitivity of the newly derived transient/steady-state flow law. Our analysis can be seen as one of the first steps in a hybrid computational-laboratory-field modeling workflow. The analysis could be improved through independent verification by thermographic analysis in physical laboratory experiments to assess lambda factor evolution under laboratory conditions.
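For reference, a common textbook form of the Ramberg-Osgood hardening approximation and the stored-over-dissipated energy ratio (lambda factor) named above can be written, in LaTeX, as follows; the symbols are generic and the paper's exact parameterization may differ.

    \begin{align}
      \varepsilon &= \frac{\sigma}{E}
        + \alpha\,\frac{\sigma}{E}\left(\frac{\sigma}{\sigma_{y}}\right)^{n-1}
        && \text{(Ramberg--Osgood stress--strain approximation)} \\
      \lambda &= \frac{W_{\mathrm{stored}}}{W_{\mathrm{dissipated}}}
        && \text{(evolution test function: stored over dissipated energy)}
    \end{align}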
Abstract:
Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
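The streaming idea, processing NGS reads as they arrive rather than after the full transfer completes, can be sketched as below. This is a generic Python illustration and not the elastream package's interface; the per-read analysis step is a placeholder for any task in which reads are processed independently of one another.

    import sys

    def fastq_records(stream):
        """Yield (header, sequence, quality) tuples from a FASTQ text stream."""
        while True:
            header = stream.readline()
            if not header:
                return
            seq = stream.readline().rstrip()
            stream.readline()                      # '+' separator line
            qual = stream.readline().rstrip()
            yield header.rstrip(), seq, qual

    def analyse(record):
        """Placeholder per-read task (here: count bases); real examples are
        filtering, trimming, per-read alignment, k-mer counting, etc."""
        _, seq, _ = record
        return len(seq)

    if __name__ == "__main__":
        # Reads are consumed while still being transferred, e.g.:
        #   curl https://example.org/sample.fastq | python stream_analyse.py
        total_bases = sum(analyse(rec) for rec in fastq_records(sys.stdin))
        print(f"processed {total_bases} bases while streaming")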