992 results for Six Sigma Technique


Relevance:

30.00%

Publisher:

Abstract:

The mismatching of alveolar ventilation and perfusion (VA/Q) is the major determinant of impaired gas exchange. The gold standard for measuring VA/Q distributions is based on measurements of the elimination and retention of infused inert gases. Conventional multiple inert gas elimination technique (MIGET) uses gas chromatography (GC) to measure the inert gas partial pressures, which requires tonometry of blood samples with a gas that can then be injected into the chromatograph. The method is laborious and requires meticulous care. A new technique based on micropore membrane inlet mass spectrometry (MMIMS) facilitates the handling of blood and gas samples and provides nearly real-time analysis. In this study we compared MIGET by GC and MMIMS in 10 piglets: 1) 3 with healthy lungs; 2) 4 with oleic acid injury; and 3) 3 with isolated left lower lobe ventilation. The different protocols ensured a large range of normal and abnormal VA/Q distributions. Eight inert gases (SF6, krypton, ethane, cyclopropane, desflurane, enflurane, diethyl ether, and acetone) were infused; six of these gases were measured with MMIMS, and six were measured with GC. We found close agreement of retention and excretion of the gases and the constructed VA/Q distributions between GC and MMIMS, and predicted PaO2 from both methods compared well with measured PaO2. VA/Q by GC produced more widely dispersed modes than MMIMS, explained in part by differences in the algorithms used to calculate VA/Q distributions. In conclusion, MMIMS enables faster measurement of VA/Q, is less demanding than GC, and produces comparable results.
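As background for how retention and excretion constrain the VA/Q distribution, here is a minimal Python sketch of the standard inert-gas relations (compartmental retention λ/(λ + VA/Q), with whole-lung retention and excretion as perfusion- and ventilation-weighted sums). The partition coefficients and the three-compartment lung are illustrative values, not data from this study, and the sketch is not the MIGET/MMIMS reconstruction algorithm itself.

```python
import numpy as np

# Illustrative blood:gas partition coefficients spanning low- to high-solubility
# gases (rough orders of magnitude only, not measured values from this study).
LAMBDA = {"SF6": 0.005, "ethane": 0.1, "cyclopropane": 0.5,
          "enflurane": 2.0, "diethyl ether": 12.0, "acetone": 300.0}

def retention_excretion(va, q, lam):
    """Whole-lung retention (Pa/Pv) and excretion (PE/Pv) for one inert gas,
    given compartmental ventilations va and perfusions q (same units)."""
    va, q = np.asarray(va, float), np.asarray(q, float)
    r_comp = lam / (lam + va / q)        # end-capillary/mixed-venous ratio per compartment
    R = np.sum(q * r_comp) / q.sum()     # perfusion-weighted retention
    E = np.sum(va * r_comp) / va.sum()   # ventilation-weighted excretion
    return R, E

# Three-compartment toy lung: low, normal and high VA/Q regions.
va = [0.2, 4.0, 2.0]   # L/min
q  = [1.0, 4.0, 0.5]   # L/min
for gas, lam in LAMBDA.items():
    R, E = retention_excretion(va, q, lam)
    print(f"{gas:>13}: R = {R:.3f}, E = {E:.3f}")
```

Soluble gases (high λ) are retained almost completely, while insoluble gases are largely eliminated; it is the gas-to-gas pattern of R and E across a range of λ that constrains the recovered VA/Q distribution.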

Relevance:

30.00%

Publisher:

Abstract:

In October 2011, enzymatic lysis of Dupuytren's cords (Xiapex®, Auxilium Pharmaceuticals, Pfizer) was introduced in Switzerland. Here we present our first university experience and underline the major role of ultrasound during the injection. Between December 2011 and February 2013, 52 injections were performed to treat 43 Dupuytren's cords in 33 patients. The mean age of the patients was 64.4 ± 8.5 years. Complications were documented for each patient. Before, directly after, and at a minimum of 6 months post-injection, the contracture of the treated joint was measured with a goniometer. The DASH score was evaluated after a minimum of 6 months, and the patients were asked to rate the outcome of the treatment subjectively (very good, good, mild, poor) and to state whether they would repeat it if necessary. Four skin defects, one lymphangitis and one CRPS accounted for a complication rate of 18%. There was no infection and no tendon rupture in the series. The mean MCP joint contracture was 36.8 ± 27.4° before injection, 3.5 ± 7.8° immediately after (gain in mobility compared with the pre-injection situation 33.3°, P<0.001) and 8.4 ± 13.9° at the long-term clinical control (gain 28.4°, P<0.001). The mean PIP joint contracture was 36.5 ± 29.1° before injection, 5.9 ± 6.7° immediately after (gain 30.6°, P<0.001) and 15.1 ± 13.8° at the long-term clinical control (gain 21.4°, P<0.001). The DASH score decreased from 24 ± 14 to 7 ± 9 (P<0.001). Eighty-one per cent of the patients were satisfied or very satisfied with the treatment, and all but two would repeat it if necessary. Ultrasound makes it possible to target the collagenase injection and thereby reduce complications. The short-term results of this non-invasive therapy are very promising; however, comparison with conventional procedures is difficult because long-term results are lacking.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The purpose of this paper is to describe the transdiaphragmatic approach to the heart for open CPR in patients who arrest at laparotomy and to present a first case series of patients who have undergone this procedure. METHODS: All patients who had undergone intraperitoneal transdiaphragmatic open CPR between January 1, 2002 and December 31, 2012 were retrieved from the operation registry at Bern University Hospital, Switzerland. Transdiaphragmatic access to the heart is initiated with a 10-cm-long anterocaudal incision in the central tendon of the diaphragm, at approximately the 2 o'clock position. Internal cardiac compression through the diaphragmatic incision can be performed from either side of the patient: from the right side, cardiac massage is performed with the right hand, and vice versa. RESULTS: A total of six patients were identified who suffered cardiac arrest during laparotomy with open CPR performed through the transdiaphragmatic approach. Four suffered cardiac arrest during orthotopic liver transplantation, and two trauma patients arrested during damage control laparotomy. In three patients, cardiac activity was never re-established. The other three regained a perfusing heart rhythm, two of whom survived to the ICU and one of whom ultimately survived to discharge. CONCLUSIONS: In patients suffering cardiac arrest during laparotomy, the transdiaphragmatic approach allows rapid, technically easy, and almost atraumatic access to the heart, with excellent CPR performance. After this potentially life-saving procedure, pulmonary and surgical-site complications are expected to occur far less often than with the conventional emergency department left-sided thoracotomy.

Relevance:

30.00%

Publisher:

Abstract:

The dataset contains the revised age models and foraminiferal records obtained for the Last Interglacial period in six marine sediment cores:
- the Southern Ocean core MD02-2488 (age model, sea surface temperatures, benthic d18O and d13C for the period 136-108 ka),
- the North Atlantic core MD95-2042 (age model, planktic d18O, benthic d18O and d13C for the period 135-110 ka),
- the North Atlantic core ODP 980 (age model, planktic d18O, sea surface temperatures, seawater d18O, benthic d18O and d13C, ice-rafted detritus for the period 135-110 ka),
- the North Atlantic core CH69-K09 (age model, planktic d18O, sea surface temperatures, seawater d18O, benthic d18O and d13C, ice-rafted detritus for the period 135-110 ka),
- the Norwegian Sea core MD95-2010 (age model, percentage of Neogloboquadrina pachyderma sinistral, sea surface temperatures, benthic d18O, ice-rafted detritus for the period 134-110 ka),
- the Labrador Sea core EW9302-JPC2 (age model, percentage of Neogloboquadrina pachyderma sinistral, sea surface temperatures, benthic d18O for the period 134-110 ka).

Relevance:

30.00%

Publisher:

Abstract:

Detrital K-feldspars and muscovites from Ocean Drilling Program Leg 116 cores with depositional ages from 0 to 18 Ma have been dated by the 40Ar/39Ar technique. Four to thirteen individual K-feldspars have been dated from each of seven stratigraphic levels, and the ages at each level span a very large range, up to 1660 Ma. At each level investigated, at least one K-feldspar yielded a minimum age that is, within uncertainty, identical to the age of deposition. One to twelve single muscovite crystals from each of six levels have also been studied. The range of muscovite ages is smaller than that of the K-feldspars and, with one exception, spans only 20 Ma. As with the K-feldspars, each level investigated contains muscovites with mineral ages essentially identical to depositional ages. These results indicate that a significant portion of the material in the Bengal Fan is first-cycle detritus derived from the Himalayas. Therefore, the large proportion of sediment deposited in the distal fan in the early to mid Miocene can be ascribed to a major pulse of uplift and erosion in the collision zone. Moreover, these data indicate that during the entire Neogene some portion of the Himalayan orogen was experiencing rapid erosion (≤ uplift). The lack of granulite-facies rocks in the eastern Himalayas and Tibetan Plateau suggests that very rapid uplift must have been distributed in brief pulses in different places in the mountain belt. We suggest that the great majority of the crystals with young apparent ages have been derived from the southern slope of the Himalayas, predominantly from near the Main Central Thrust zone. These data provide further evidence against tectonic models in which the Himalayas and Tibetan Plateau are uplifted either uniformly during the past 40 m.y. or mostly within the last 2 to 5 m.y.
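For orientation, the apparent ages quoted above come from the standard 40Ar/39Ar age equation, t = ln(1 + J · 40Ar*/39ArK) / λ. The sketch below applies it with the conventional total 40K decay constant; the J value and isotope ratios are invented for illustration and are not Leg 116 measurements.

```python
import math

LAMBDA_40K = 5.543e-10   # total decay constant of 40K, 1/yr (Steiger & Jaeger 1977)

def ar_ar_age(ar40_star_over_ar39k, J):
    """Apparent 40Ar/39Ar age in Ma from the radiogenic 40Ar*/39Ar_K ratio and
    the irradiation parameter J (determined from a co-irradiated standard)."""
    t_years = (1.0 / LAMBDA_40K) * math.log(1.0 + J * ar40_star_over_ar39k)
    return t_years / 1e6

# Illustrative single-crystal measurements (J and the ratios are made up):
J = 0.0075
for ratio in (3.0, 40.0, 180.0):
    print(f"40Ar*/39Ar_K = {ratio:6.1f}  ->  {ar_ar_age(ratio, J):7.1f} Ma")
```

A crystal whose apparent age matches the depositional age in this way is the signature of first-cycle detritus cooled shortly before erosion and transport.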

Relevance:

30.00%

Publisher:

Abstract:

This article focuses on the evaluation of a biometric technique based on the performance of an identifying gesture while holding, in one hand, a telephone with an embedded accelerometer. The acceleration signals obtained when users perform gestures are analyzed with a mathematical method based on global sequence alignment. In this article, eight different scores are proposed and evaluated in order to quantify the differences between gestures, obtaining an optimal EER of 3.42% when analyzing a random set of 40 users from a database of 80 users with real forgery attempts. Moreover, a temporal study of the technique is presented, leading to the need to update the template so as to adapt to the way users modify how they perform their identifying gesture over time. Six updating schemes have been assessed on a database of 22 users repeating their identifying gesture in 20 sessions over 4 months, concluding that the more often the template is updated, the better and more stable the performance of the technique.
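A minimal sketch of global sequence alignment applied to two one-dimensional acceleration traces is shown below; the similarity and gap scores are illustrative and are not the eight scores evaluated in the article.

```python
import numpy as np

def global_alignment_score(x, y, gap=1.0):
    """Needleman-Wunsch-style global alignment of two 1-D acceleration traces;
    similarity of two samples is their negative absolute difference
    (illustrative scoring only, not the article's exact scheme)."""
    n, m = len(x), len(y)
    D = np.zeros((n + 1, m + 1))
    D[:, 0] = -gap * np.arange(n + 1)
    D[0, :] = -gap * np.arange(m + 1)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sim = -abs(x[i - 1] - y[j - 1])
            D[i, j] = max(D[i - 1, j - 1] + sim,   # align the two samples
                          D[i - 1, j] - gap,       # gap in y
                          D[i, j - 1] - gap)       # gap in x
    return D[n, m]

# Toy usage: compare an enrolment template with a genuine repeat and an impostor.
template = np.sin(np.linspace(0, 2 * np.pi, 50))
genuine  = np.sin(np.linspace(0, 2 * np.pi, 55)) + 0.05 * np.random.randn(55)
impostor = np.cos(np.linspace(0, 2 * np.pi, 50))
print(global_alignment_score(template, genuine))   # higher (less negative) score
print(global_alignment_score(template, impostor))  # lower score
```

Thresholding such a score separates genuine repetitions from forgeries; sweeping the threshold over a labelled database is what yields the reported EER.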

Relevance:

30.00%

Publisher:

Abstract:

The penalty corner is one of the most important goal plays in field hockey, and the drag-flick is used less by women than by men in penalty corners. The aim of this study was to describe training-induced changes in the drag-flick technique of female field hockey players. Four female players participated in the study. A VICON optoelectronic system (Oxford Metrics, Oxford, UK) with six cameras sampling at 250 Hz measured the kinematic parameters of the drag-flick before and after training, with fifteen shots captured for each subject. A Wilcoxon test assessed the differences between pre-training and post-training parameters. Two players received specific training twice a week for 8 weeks; the other two players did not train. The proposed drills improved the position of the stick at the beginning of the shot (p<0.05), the total distance of the shot (p<0.05) and the rotation radius at ball release (p<0.01). It was also noted that all players had lost speed in the preceding run. Further studies should include a larger sample in order to provide more information on field hockey performance.
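The pre/post comparison described above can be illustrated with a Wilcoxon signed-rank test in scipy; the kinematic values below are simulated placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical pre- and post-training values of one kinematic parameter
# (e.g. rotation radius at ball release, in metres) for 15 captured shots.
rng = np.random.default_rng(0)
pre  = rng.normal(0.80, 0.05, 15)
post = pre + rng.normal(0.06, 0.03, 15)   # simulated training effect

stat, p = wilcoxon(pre, post)             # paired, non-parametric comparison
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```

A non-parametric paired test of this kind is the natural choice here because only fifteen shots per player are available and normality cannot be assumed.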

Relevance:

30.00%

Publisher:

Abstract:

We have cloned the gene for a putative chloroplast RNA polymerase sigma factor from the unicellular rhodophyte Cyanidium caldarium. This gene contains an open reading frame encoding a protein of 609 amino acids with domains highly homologous to all four conserved regions found in bacterial and cyanobacterial sigma 70-type subunits. When Southern blots of genomic DNA were hybridized to the "rpoD box" oligonucleotide probe, up to six hybridizing bands were observed. Transcripts of the sigma factor gene were undetectable in RNA from dark-grown cells but were abundant in the poly(A)+ fraction of RNA from illuminated cells. The sigma factor gene was expressed in Escherichia coli, and antibodies against the expressed sigma factor fusion protein cross-reacted with a 55-kDa protein in partially purified chloroplast RNA polymerase. Antibodies directed against a cyanobacterial RNA polymerase sigma factor also cross-reacted with a 55-kDa protein in the same enzyme preparation. Immunoprecipitation experiments showed that this enzyme preparation contains proteins with the same molecular weights as the alpha, beta, beta', and beta" subunits of chloroplast RNA polymerase in higher plants. This study identifies a gene for a plastid RNA polymerase sigma factor and indicates that there may be a family of nuclear-encoded sigma factors that recognize promoters in subsets of plastid genes and regulate differential gene expression at the transcriptional level.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a reverse-transcriptase PCR-based protocol suitable for efficient expression analysis of multigene families is presented. The method combines restriction fragment length polymorphism (RFLP) technology with a gene family-specific version of mRNA differential display and hence is called "RFLP-coupled domain-directed differential display." With this method, expression of all members of a multigene family at many different developmental stages, in diverse tissues, and even in different organisms can be displayed on one gel. Moreover, bands of interest, representing gene family members, are directly accessible to sequence analysis without the need for subcloning. The method thus enables a detailed, high-resolution expression analysis of known gene family members as well as the identification and characterization of new ones. Here the technique was used to analyze differential expression of MADS-box genes in male and female inflorescences of maize (Zea mays ssp. mays). Six different MADS-box genes were identified, which were either specifically expressed in the female sex or preferentially expressed in male or female inflorescences. Other possible applications of the method are discussed.

Relevance:

30.00%

Publisher:

Abstract:

The high chlorine content of polychlorinated biphenyls (PCBs) makes the compound highly toxic and hinders its biodegradation. PCB contamination in Brazil was confirmed by a study carried out in the Santos-São Vicente Bay (São Paulo), which revealed the need for an action plan for the control and removal of PCBs in Brazil. This research therefore set out to test four hypotheses: (1) solid-phase microextraction (SPME) is an effective methodology for assessing polychlorinated biphenyls in reactor samples; (2) the fermentative-methanogenic condition harbours a community that resists and removes PCBs; (3) the denitrifying condition harbours a community that resists and removes PCBs; and (4) PCB removal, as well as the microbial composition, differs under each metabolic condition. To this end, batch reactors were set up separately with anaerobic biomass from a UASB reactor used to treat poultry wastewater and with biomass from activated sludge systems treating domestic sewage. The reactors, operated under mesophilic conditions, were fed with synthetic medium and co-substrates: ethanol (457 mg.L-1) and sodium formate (680 mg.L-1) for the anaerobic reactors, and only ethanol (598 mg.L-1) for the anoxic reactors, plus a Sigma PCB standard (congeners PCB 10, 28, 52, 153, 138 and 180) at different concentrations, depending on the aim of each assay. SPME extraction combined with gas chromatography and electron-capture detection proved adequate for determining the six PCB congeners. The method showed a wide linear range and selectivity against several interferents, as well as robustness, usefulness and reliability in the specific identification and quantification of the six PCB congeners. Hypothesis 1 was accepted; that is, the SPME methodology made it possible to quantify the PCBs in the batch reactors. Although methanogenic inhibition in the presence of PCBs was confirmed, with an IC50 of 0.03 mg.L
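As an illustration of the linear-range quantification step, the sketch below fits a straight-line calibration for a single congener and back-calculates an unknown; the concentrations and peak areas are hypothetical, not the validation data of this study.

```python
import numpy as np

# Hypothetical SPME/GC-ECD calibration for one PCB congener (e.g. PCB 28):
# spiked concentrations (ug/L) versus detector peak areas (arbitrary units).
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])
area = np.array([1.1e4, 2.2e4, 4.3e4, 1.08e5, 2.15e5, 4.40e5])

slope, intercept = np.polyfit(conc, area, 1)          # linear calibration fit
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

unknown_area = 7.9e4                                  # peak area from a reactor sample
estimate = (unknown_area - intercept) / slope
print(f"R^2 = {r2:.4f}; estimated concentration = {estimate:.2f} ug/L")
```

In a validated method, one such calibration per congener, together with checks of selectivity and recovery, is what allows each of the six congeners to be quantified in the batch reactors.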

Relevance:

30.00%

Publisher:

Abstract:

During the last decade the use of randomised gene libraries has had an enormous impact in the field of protein engineering. Such libraries comprise many variations of a single gene in which codon replacements are used to substitute key residues of the encoded protein. The expression of such libraries generates a library of randomised proteins which can subsequently be screened for desired or novel activities. Randomisation in this fashion has predominantly been achieved by the inclusion of the codons NNN or NNG/C or T, in which N represents any of the four bases A, C, G or T. The use of these codons, however, necessitates the cloning of redundant codons at each position of randomisation, in addition to those required to encode the twenty possible amino acid substitutions. As degenerate codons must be included at each position of randomisation, this results in a progressive loss of randomisation efficiency as the number of randomised positions is increased. The ratio of genes to proteins in these libraries rises exponentially with each position of randomisation, creating large gene libraries which generate protein libraries of limited diversity upon expression. In addition to these problems of library size, the cloning of redundant codons also results in the generation of protein libraries in which substituted amino acids are unevenly represented. As several of the randomised codons may encode the same amino acid (for example serine, which is encoded six times under NNN), an inherent bias may be introduced into the resulting protein library during the randomisation procedure. The work outlined here describes the development of a novel randomisation technique aimed at eliminating codon redundancy from randomised gene libraries, thus addressing the problems of library size and bias associated with the cloning of redundant codons.
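The codon-redundancy problem can be made concrete with a short sketch that counts, for a degenerate scheme, how many codons encode each amino acid and how fast the gene library outgrows the protein library; NNK is included only as a commonly used reduced scheme and is not necessarily the scheme discussed in this work.

```python
from itertools import product

BASES = "TCAG"
# Standard genetic code as a 64-character string in TCAG codon order.
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {a + b + c: AA[i] for i, (a, b, c) in enumerate(product(BASES, repeat=3))}

def codons(scheme):
    """All codons allowed by a degenerate scheme such as 'NNN' or 'NNK'."""
    iupac = {"N": "TCAG", "K": "GT", "S": "GC"}
    return ["".join(c) for c in product(*(iupac.get(x, x) for x in scheme))]

for scheme in ("NNN", "NNK"):
    cods = codons(scheme)
    aas = [CODE[c] for c in cods]
    print(scheme, "codons:", len(cods),
          "| serine codons:", aas.count("S"),
          "| stop codons:", aas.count("*"))

# Library size and protein diversity drift apart with each randomised position.
for n in (1, 3, 6):
    print(f"{n} NNN positions: {64**n} gene variants for at most {20**n} proteins")
```

Running this shows serine encoded six times under NNN (the bias mentioned above) and a gene-to-protein ratio that grows exponentially with the number of randomised positions, which is exactly the motivation for a non-redundant randomisation scheme.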

Relevance:

30.00%

Publisher:

Abstract:

Background: The aim was to investigate the effect on the measured amplitude of accommodation and its repeatability of using the minus lens technique with the target at distance or near. Methods: Forty-three students (average age: 21.17 ± 1.50 years, 35 female) had their amplitude of accommodation measured with minus lenses on top of their distance correction in a trial frame, with the target at far (6.0 m) or near (0.4 m). The minus lens power was added gradually in steps of 0.25 D. Measurements were taken on two occasions at each distance, separated by an interval of at least 24 hours. Results: The measured amplitude at six metres was significantly lower than that with the target at 40 cm, by 1.56 ± 1.17 D (p < 0.001), and this difference varied between individuals (r = 0.716, intraclass correlation coefficient = 0.439). With either target distance, repeated measurement was highly correlated (r > 0.9), but the agreement was better at 6.0 m (±0.74 D) than at 40 cm (±0.92 D). Conclusion: Measurements of the amplitude of accommodation with the minus lens technique using targets at far or near are not comparable, and the difference between the target distances may provide clinically relevant information. © 2013 Optometrists Association Australia.
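For clarity, the sketch below shows the simplified textbook calculation behind the minus lens technique (amplitude = target vergence + magnitude of the added minus power, ignoring vertex-distance effects); the lens values are examples, not data from this study.

```python
def minus_lens_amplitude(minus_lens_to_blur_d, target_distance_m):
    """Accommodative demand at first sustained blur: the dioptric demand of the
    target plus the magnitude of the minus power added over the distance
    correction. A simplified textbook calculation, not the study's protocol."""
    target_vergence = 1.0 / target_distance_m   # dioptric demand of the target
    return abs(minus_lens_to_blur_d) + target_vergence

# Example: blur reached with -7.00 D at 6 m versus -6.00 D at 40 cm.
print(f"{minus_lens_amplitude(-7.00, 6.0):.2f} D at 6 m")    # ~7.17 D
print(f"{minus_lens_amplitude(-6.00, 0.4):.2f} D at 40 cm")  # 8.50 D
```

The differing baseline vergence of the two targets (about 0.17 D at 6 m versus 2.50 D at 40 cm) is one reason the two versions of the technique cannot be used interchangeably.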

Relevance:

30.00%

Publisher:

Abstract:

The accurate identification of T-cell epitopes remains a principal goal of bioinformatics within immunology. As the immunogenicity of peptide epitopes is dependent on their binding to major histocompatibility complex (MHC) molecules, the prediction of binding affinity is a prerequisite to the reliable prediction of epitopes. The iterative self-consistent (ISC) partial-least-squares (PLS)-based additive method is a recently developed bioinformatic approach for predicting class II peptide-MHC binding affinity. The ISC-PLS method overcomes many of the conceptual difficulties inherent in the prediction of class II peptide-MHC affinity, such as the binding of a mixed population of peptide lengths due to the open-ended class II binding site. The method has applications in both the accurate prediction of class II epitopes and the manipulation of affinity for heteroclitic and competitor peptides. The method is applied here to six class II mouse alleles (I-Ab, I-Ad, I-Ak, I-As, I-Ed, and I-Ek) and included peptides up to 25 amino acids in length. A series of regression equations highlighting the quantitative contributions of individual amino acids at each peptide position was established. The initial model for each allele exhibited only moderate predictivity. Once the set of selected peptide subsequences had converged, the final models exhibited a satisfactory predictive power. Convergence was reached between the 4th and 17th iterations, and the leave-one-out cross-validation statistical terms (q2, SEP, and NC) ranged between 0.732 and 0.925, 0.418 and 0.816, and 1 and 6, respectively. The non-cross-validated statistical terms (r2 and SEE) ranged between 0.98 and 0.995 and 0.089 and 0.180, respectively. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online (http://www.jenner.ac.uk/MHCPred).
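A minimal sketch of a PLS additive model with leave-one-out cross-validation is given below, using scikit-learn; the peptides, affinities and fixed number of PLS components are hypothetical placeholders, and the iterative self-consistent subsequence-selection step of the ISC-PLS method is not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

AAS = "ACDEFGHIKLMNPQRSTVWY"

def additive_encode(peptide, max_len=25):
    """One indicator per (position, amino acid): the additive assumption that
    each residue contributes independently to the binding affinity."""
    x = np.zeros(max_len * 20)
    for pos, aa in enumerate(peptide[:max_len]):
        x[pos * 20 + AAS.index(aa)] = 1.0
    return x

# Hypothetical training data: peptides with made-up pIC50-style affinities.
peptides = ["ASNENMETM", "GILGFVFTL", "LLFGYPVYV", "KTWGQYWQV",
            "SLYNTVATL", "YMDGTMSQV", "FLPSDFFPS", "RMFPNAPYL"]
y = np.array([7.1, 6.4, 7.8, 5.9, 6.7, 7.2, 5.5, 6.9])
X = np.vstack([additive_encode(p) for p in peptides])

pls = PLSRegression(n_components=2)

# Leave-one-out cross-validation to estimate q2 (PRESS-based).
press, ss_tot = 0.0, np.sum((y - y.mean()) ** 2)
for train, test in LeaveOneOut().split(X):
    pls.fit(X[train], y[train])
    press += float((pls.predict(X[test]).ravel()[0] - y[test][0]) ** 2)
q2 = 1 - press / ss_tot
print(f"q2 = {q2:.3f}")  # with so few toy peptides this value is not meaningful
```

The coefficients of such a model play the role of the position-specific amino acid contributions described above; in the real method, the number of components and the aligned subsequences are chosen iteratively until the selection converges.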

Relevance:

30.00%

Publisher:

Abstract:

With the introduction of new input devices, such as multi-touch surface displays, the Nintendo WiiMote, the Microsoft Kinect, and the Leap Motion sensor, among others, the field of Human-Computer Interaction (HCI) finds itself at an important crossroads that requires solving new challenges. Given the amount of three-dimensional (3D) data available today, 3D navigation plays an important role in 3D User Interfaces (3DUI). This dissertation deals with multi-touch, 3D navigation, and how users can explore 3D virtual worlds using a multi-touch, non-stereo, desktop display. The contributions of this dissertation include a feature-extraction algorithm for multi-touch displays (FETOUCH), a multi-touch and gyroscope interaction technique (GyroTouch), a theoretical model for multi-touch interaction using high-level Petri Nets (PeNTa), an algorithm to resolve ambiguities in the multi-touch gesture classification process (Yield), a proposed technique for navigational experiments (FaNS), a proposed gesture (Hold-and-Roll), and an experiment prototype for 3D navigation (3DNav). The verification experiment for 3DNav was conducted with 30 human subjects of both genders. The experiment used the 3DNav prototype to present a pseudo-universe in which each user was required to find five objects using the multi-touch display and five objects using a game controller (GamePad). For the multi-touch display, 3DNav used a commercial library called GestureWorks in conjunction with Yield to resolve the ambiguity posed by the multiplicity of gestures reported by the initial classification. The experiment compared both devices: the task completion time with multi-touch was slightly shorter, but the difference was not statistically significant. The design of the experiment also included an equation that determined each subject's level of video game console expertise, which was used to break users down into two groups: casual users and experienced users. The study found that experienced gamers performed significantly faster with the GamePad than casual users. When looking at the groups separately, casual gamers performed significantly better using the multi-touch display than the GamePad. Additional results are found in this dissertation.
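The two statistical comparisons described (a device effect within subjects and an expertise-group effect between subjects) can be illustrated as follows; the completion times, the expertise score and the use of t-tests are assumptions for illustration, not the dissertation's actual data or analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 30

# Simulated task-completion times (seconds) per subject for each device.
multitouch = rng.normal(95, 15, n)
gamepad    = rng.normal(100, 15, n)

# Hypothetical expertise score (higher = more console experience) used to split groups.
expertise = rng.uniform(0, 10, n)
experienced = expertise >= 5

# Within-subject device comparison (paired test).
t_dev, p_dev = stats.ttest_rel(multitouch, gamepad)
print(f"device effect: t = {t_dev:.2f}, p = {p_dev:.3f}")

# Between-group comparison of GamePad times: experienced versus casual users.
t_grp, p_grp = stats.ttest_ind(gamepad[experienced], gamepad[~experienced])
print(f"experienced vs casual (GamePad): t = {t_grp:.2f}, p = {p_grp:.3f}")
```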
