985 results for Partial oxalate method
Abstract:
Research project developed during a stay at the University of Calgary, Canada, between December 2007 and February 2008. The project consisted of analysing data from a study in the field of music psychology, specifically on how music influences attention through a person's emotional and energetic states. Video was used during the sessions, yielding visual and auditory data to complement the quantitative data obtained from the attention tests administered. The analysis was carried out using qualitative methods and techniques learned during the stay. The work also deepened the understanding of the qualitative paradigm as a valid paradigm that genuinely complements the quantitative one. Particular attention was paid to conversation analysis from an interpretative standpoint, as well as to the analysis of body and facial language from video observation, formulating descriptors and sub-descriptors of the behaviour related to the hypothesis. Some descriptors had been formulated before the analysis, based on other studies and on the researcher's background; others emerged during the analysis. The behavioural descriptors and sub-descriptors relate to the mood and energy states of the different participants. The analysis was conducted as a case study, examining each person exhaustively in order to find intrapersonal and interpersonal reaction patterns. The observed patterns will be contrasted with the quantitative information, triangulating the data to find possible mutual support or contradictions. Preliminary results indicate a relationship between the type of music and behaviour: music with negative emotional content is associated with the person closing up, whereas energetic music activates participants (as observed behaviourally) and makes them smile when it is positive.
Abstract:
We present a novel hybrid (or multiphysics) algorithm, which couples pore-scale and Darcy descriptions of two-phase flow in porous media. The flow at the pore scale is described by the Navier–Stokes equations, and the Volume of Fluid (VOF) method is used to model the evolution of the fluid–fluid interface. An extension of the Multiscale Finite Volume (MsFV) method is employed to construct the Darcy-scale problem. First, a set of local interpolators for pressure and velocity is constructed by solving the Navier–Stokes equations; then, a coarse mass-conservation problem is constructed by averaging the pore-scale velocity over the cells of a coarse grid, which act as control volumes; finally, a conservative pore-scale velocity field is reconstructed and used to advect the fluid–fluid interface. The method relies on the localization assumptions used to compute the interpolators (which are quite straightforward extensions of the standard MsFV) and on the postulate that the coarse-scale fluxes are proportional to the coarse-pressure differences. By numerical simulations of two-phase problems, we demonstrate that these assumptions provide hybrid solutions that are in good agreement with reference pore-scale solutions and are able to model the transition from stable to unstable flow regimes. Our hybrid method can naturally take advantage of several adaptive strategies and allows considering pore-scale fluxes only in some regions, while Darcy fluxes are used in the rest of the domain. Moreover, since the method relies on the assumption that the relationship between coarse-scale fluxes and pressure differences is local, it can be used as a numerical tool to investigate the limits of validity of Darcy's law and to understand the link between pore-scale quantities and their corresponding Darcy-scale variables.
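A minimal Python sketch of the coarse-scale averaging step this abstract describes: pore-scale velocities are averaged over coarse control volumes, and the postulated proportionality between coarse fluxes and coarse-pressure differences is used to back out effective transmissibilities. The 1-D layout, function names and toy numbers are illustrative assumptions, not the authors' implementation.

import numpy as np

def coarse_fluxes_from_pore_scale(u_fine, n_coarse):
    """Average a 1-D pore-scale face-velocity field over coarse control
    volumes and return the flux through each coarse-cell interface.
    u_fine   : velocities at fine-grid faces (length n_fine + 1)
    n_coarse : number of coarse control volumes (must divide n_fine)"""
    n_fine = len(u_fine) - 1
    stride = n_fine // n_coarse
    # the flux across a coarse interface is the fine velocity at the matching face
    return u_fine[::stride]                    # length n_coarse + 1

def coarse_transmissibilities(q_coarse, p_coarse):
    """Postulate q_ij = T_ij (p_i - p_j) between neighbouring coarse cells
    and back out effective transmissibilities from the averaged fluxes."""
    dp = p_coarse[:-1] - p_coarse[1:]          # coarse pressure differences
    q_internal = q_coarse[1:-1]                # internal coarse interfaces only
    return np.where(np.abs(dp) > 1e-12, q_internal / dp, 0.0)

# toy usage: linear pressure drop over 4 coarse cells, uniform pore-scale flux
p = np.linspace(1.0, 0.0, 4)
u = np.full(13, 2.5)                           # 12 fine cells -> 13 faces
q = coarse_fluxes_from_pore_scale(u, 4)
print(coarse_transmissibilities(q, p))         # ~7.5 at every internal interface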
Partial cricotracheal resection for pediatric subglottic stenosis: long-term outcome in 57 patients.
Abstract:
OBJECTIVE: We sought to assess the long-term outcome of 57 pediatric patients who underwent partial cricotracheal resection for subglottic stenosis. METHODS: Eighty-one pediatric partial cricotracheal resections were performed in our tertiary care institution between 1978 and 2004. Fifty-seven patients had a minimal follow-up time of 1 year and were included in this study. Evaluation was based on the last laryngotracheal endoscopy, the responses to a questionnaire, and a retrospective review of the patients' data. The following parameters were analyzed: decannulation rates, breathing, voice quality, and deglutition. RESULTS: A single-stage partial cricotracheal resection was performed in 38 patients, and a double-stage procedure was performed in 19 patients. Sixteen patients underwent an extended partial cricotracheal resection (ie, partial cricotracheal resection combined with another open procedure). At a median follow-up time of 5.1 years, the decannulation rates after a single- or double-stage procedure were 97.4% and 95%, respectively. Two patients remained tracheotomy dependent. One patient had moderate exertional dyspnea, and all other patients had no exertional dyspnea. Voice quality improved after surgical intervention by 1 ± 1.34 grades of dysphonia (P < .0001) according to the adapted GRBAS grading system (Grade, Roughness, Breathiness, Asthenia, and Strain). CONCLUSIONS: Partial cricotracheal resection provides good results for grades III and IV subglottic stenosis as primary or salvage operations. The procedure has no deleterious effects on laryngeal growth and function. The quality of voice improves significantly after surgical intervention but largely depends on the preoperative condition.
Abstract:
We study preconditioning techniques for discontinuous Galerkin discretizations of isotropic linear elasticity problems in primal (displacement) formulation. We propose subspace correction methods based on a splitting of the vector valued piecewise linear discontinuous finite element space, that are optimal with respect to the mesh size and the Lamé parameters. The pure displacement, the mixed and the traction free problems are discussed in detail. We present a convergence analysis of the proposed preconditioners and include numerical examples that validate the theory and assess the performance of the preconditioners.
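A minimal sketch of an additive subspace-correction preconditioner for a generic symmetric positive definite system, assuming a user-supplied splitting into index blocks; the toy matrix, the block choice and the use of SciPy's sparse LU are assumptions and do not reproduce the specific splitting of the discontinuous finite element space analysed here.

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import LinearOperator, cg, splu

def additive_subspace_preconditioner(A, blocks):
    """Build B = sum_i R_i^T A_i^{-1} R_i for a list of index blocks."""
    solvers = [splu(A[b, :][:, b].tocsc()) for b in blocks]   # local factorizations

    def apply(r):
        z = np.zeros_like(r)
        for b, lu in zip(blocks, solvers):
            z[b] += lu.solve(r[b])        # local solve, prolongated back
        return z

    return LinearOperator(A.shape, matvec=apply, dtype=A.dtype)

# toy usage: 1-D Laplacian split into two overlapping blocks, solved with PCG
n = 200
A = diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
blocks = [np.arange(0, 110), np.arange(90, n)]
M = additive_subspace_preconditioner(A, blocks)
x, info = cg(A, np.ones(n), M=M)
print(info)    # 0 means the preconditioned iteration converged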
Abstract:
Catheter-related bloodstream infection (CR-BSI) diagnosis usually involves catheter withdrawal. An alternative method for CR-BSI diagnosis is the differential time to positivity (DTP) between peripheral and catheter-hub blood cultures. This study aims to validate the DTP method in short-term catheters. The results show a low prevalence of CR-BSI in the sample (8.4%). The DTP method is a valid alternative for CR-BSI diagnosis in those cases with monomicrobial cultures (80% sensitivity, 99% specificity, 92% positive predictive value, and 98% negative predictive value), and a cut-off point of 17.7 hours for positivity of the hub blood culture may assist in CR-BSI diagnosis.
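A minimal sketch of how the differential time to positivity could be computed and applied; the widely used 2-hour DTP criterion and the function names are assumptions, while the 17.7-hour hub-culture cut-off is the one reported above.

def dtp_hours(peripheral_ttp_h, hub_ttp_h):
    """Differential time to positivity: peripheral minus catheter-hub
    time to positivity, in hours (both cultures must be positive)."""
    return peripheral_ttp_h - hub_ttp_h

def suggests_crbsi(peripheral_ttp_h, hub_ttp_h,
                   dtp_cutoff_h=2.0,      # commonly cited DTP criterion (assumption)
                   hub_cutoff_h=17.7):    # hub-culture cut-off reported in the abstract
    """Flag a possible CR-BSI when the hub culture turns positive sufficiently
    earlier than the peripheral one, or within the reported hub cut-off."""
    return (dtp_hours(peripheral_ttp_h, hub_ttp_h) >= dtp_cutoff_h
            or hub_ttp_h <= hub_cutoff_h)

print(suggests_crbsi(peripheral_ttp_h=22.0, hub_ttp_h=14.5))   # True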
Preretinal partial pressure of oxygen gradients before and after experimental pars plana vitrectomy.
Abstract:
PURPOSE: To evaluate preretinal partial pressure of oxygen (PO2) gradients before and after experimental pars plana vitrectomy. METHODS: Arteriolar, venous, and intervascular preretinal PO2 gradients were recorded in 7 minipigs during slow withdrawal of oxygen-sensitive microelectrodes (10-μm tip diameter) from the vitreoretinal interface to 2 mm into the vitreous cavity. Recordings were repeated after pars plana vitrectomy and balanced salt solution (BSS) intraocular perfusion. RESULTS: Arteriolar, venous, and intervascular preretinal PO2 at the vitreoretinal interface were 62.3 ± 13.8, 22.5 ± 3.3, and 17.0 ± 7.5 mmHg, respectively, before vitrectomy; 97.7 ± 19.9, 40.0 ± 21.9, and 56.3 ± 28.4 mmHg, respectively, immediately after vitrectomy; and 59.0 ± 27.4, 25.2 ± 3.0, and 21.5 ± 4.5 mmHg, respectively, 2½ hours after interruption of BSS perfusion. PO2 2 mm from the vitreoretinal interface was 28.4 ± 3.6 mmHg before vitrectomy; 151.8 ± 4.5 mmHg immediately after vitrectomy; and 34.8 ± 4.1 mmHg 2½ hours after interruption of BSS perfusion. PO2 gradients were still present after vitrectomy, with the same patterns as before vitrectomy. CONCLUSION: Preretinal PO2 gradients are not eliminated after pars plana vitrectomy. During BSS perfusion, vitreous cavity PO2 is very high. Interruption of BSS perfusion evokes progressive equilibration of vitreous cavity PO2 with concomitant progressive return of preretinal PO2 gradients to their previtrectomy patterns. This indicates that preretinal diffusion of oxygen is not altered after vitrectomy. The beneficial effect of vitrectomy in ischemic retinal diseases or macular edema may be related to other mechanisms, such as increased oxygen convection currents or removal of growth factors and cytokines secreted in the vitreous.
Abstract:
The diagnosis of Strongyloides stercoralis infections is routinely made by microscopic observation of larvae in stool samples, a low-sensitivity method, or by other, more effective methods, such as the Baermann or agar culture plate methods. In this paper we propose a practical modification of the Baermann method. One hundred and six stool samples from alcoholic patients were analyzed using the direct smear test, the agar culture plate method, the standard Baermann method, and its proposed modification. For this modification, the funnel used in the original version of the method is replaced by a test tube with a rubber stopper, perforated to allow insertion of a pipette tip. The tube with the fecal suspension is inverted over another tube containing 6 ml of saline solution and incubated at 37°C for at least 2 h. The saline solution from the second tube is centrifuged and the pellet is examined microscopically. Larvae of S. stercoralis were detected in six samples (5.7%) by both versions of the Baermann method. Five samples were positive using the agar culture plate method, and larvae were observed in only two samples by direct microscopic observation of fecal smears. Cysts of Endolimax nana and Entamoeba histolytica/dispar were also detected with the modified Baermann method. The data obtained with the modified Baermann method suggest that it concentrates larvae of S. stercoralis as efficiently as the original method.
Abstract:
A modified adsorption-elution method for the concentration of seeded rotavirus from water samples was used to determine various factors affecting virus recovery. An enzyme-linked immunosorbent assay was used to detect the rotavirus antigen after concentration. Of the various eluents compared, 0.05 M glycine, pH 11.5, gave the highest rotavirus antigen recovery using negatively charged membrane filtration, whereas 2.9% tryptose phosphate broth containing 6% glycine, pH 9.0, was found to give the greatest elution efficiency when a positively charged membrane was used. Reconcentration of water samples with a SpeedVac concentrator showed significantly higher rotavirus recovery than polyethylene glycol precipitation through both negatively and positively charged filters (p-value <0.001). In addition, SpeedVac concentration using negatively charged filtration resulted in greater rotavirus recovery than that using positively charged filtration (p-value = 0.004). Thirty-eight environmental water samples were collected from river water, domestic sewage, canals receiving raw sewage drains, and tap water collected in containers for domestic use, all from congested areas of Bangkok. In addition, several samples of commercial drinking water were analyzed. All samples were concentrated and examined for rotavirus antigen. Coliforms and fecal coliforms (0->1,800 MPN/100 ml) were observed, but rotavirus was not detected in any sample. This study suggests that the SpeedVac reconcentration method gives the most efficient rotavirus recovery from water samples.
Abstract:
Introduction: Osteoporosis (OP) is a systemic skeletal disease characterized by a low bone mineral density (BMD) and a micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can now be evaluated in daily practice with the Trabecular Bone Score (TBS) measure. TBS is a novel grey-level texture measurement reflecting bone micro-architecture, based on the use of experimental variograms of 2D projection images. TBS is very simple to obtain by reanalyzing a lumbar DXA scan. TBS has been shown to have diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine in daily practice the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. Method: The OsteoLaus cohort (1400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. This study is derived from the COLAUS cohort, which started in Lausanne in 2003. The main goal of COLAUS is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported. Results: We included 631 women: mean age 67.4±6.7 y, BMI 26.1±4.6, mean lumbar spine BMD 0.943±0.168 (T-score -1.4 SD), TBS 1.271±0.103. As expected, correlation between BMD and site-matched TBS is low (r2=0.16). Prevalence of VFx grade 2/3, major OP Fx and all OP Fx is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) are 1.8 (1.2-2.5), 1.6 (1.2-2.1), 1.3 (1.1-1.6) for BMD for the different categories of fractures and 2.0 (1.4-3.0), 1.9 (1.4-2.5), 1.4 (1.1-1.7) for TBS, respectively. Only 32 to 37% of women with OP Fx have a BMD < -2.5 SD or a TBS < 1.200. If we combine a BMD < -2.5 SD or a TBS < 1.200, 54 to 60% of women with an osteoporotic Fx are identified. Conclusion: As in the already published studies, these preliminary results confirm the partial independence between BMD and TBS. More importantly, combining TBS with BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we are able to obtain complementary information about fracture (VFA), density (BMD), and micro- and macro-architecture (TBS & HAS) from a simple, inexpensive device with low ionizing radiation: DXA. Such complementary information is very useful for the patient in daily practice and will likely have an impact on cost-effectiveness analyses.
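A minimal sketch of the combined screening rule discussed above, flagging a woman when either the lumbar-spine BMD T-score is below -2.5 SD or TBS is below 1.200; the thresholds are the ones quoted in the abstract, while the function name and example values are illustrative.

def flagged_by_bmd_or_tbs(bmd_t_score, tbs, bmd_cutoff=-2.5, tbs_cutoff=1.200):
    """Return True when either the densitometric or the micro-architectural
    criterion quoted in the abstract identifies a woman as high risk."""
    return bmd_t_score < bmd_cutoff or tbs < tbs_cutoff

# a woman with non-osteoporotic BMD but degraded micro-architecture is still flagged
print(flagged_by_bmd_or_tbs(bmd_t_score=-1.8, tbs=1.15))   # True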
Abstract:
In the subject of fingerprints, the rise of computer tools has made it possible to create powerful automated search algorithms. These algorithms allow, inter alia, a fingermark to be compared to a fingerprint database and therefore a link to be established between the mark and a known source. With the growth of the capacities of these systems and of data storage, as well as increasing collaboration between police services at the international level, the size of these databases increases. The current challenge for the field of fingerprint identification consists of the growth of these databases, which makes it possible to find impressions that are very similar but come from distinct fingers. However, and simultaneously, these data and systems allow a description of the variability between different impressions from the same finger and between impressions from different fingers. This statistical description of the within- and between-finger variabilities, computed on the basis of minutiae and their relative positions, can then be utilized in a statistical approach to interpretation. The computation of a likelihood ratio, employing simultaneously the comparison between the mark and the print of the case, the within-variability of the suspect's finger and the between-variability of the mark with respect to a database, can then be based on representative data. Thus, these data allow an evaluation which may be more detailed than that obtained by the application of rules established long before the advent of these large databases or by the specialist's experience alone. The goal of the present thesis is to evaluate likelihood ratios, computed based on the scores of an automated fingerprint identification system when the source of the tested and compared marks is known. These ratios must support the hypothesis which is known to be true. Moreover, they should support this hypothesis more and more strongly with the addition of information in the form of additional minutiae. For the modeling of within- and between-variability, the necessary data were defined, and acquired for one finger of a first donor, and two fingers of a second donor. The database used for between-variability includes approximately 600000 inked prints. The minimal number of observations necessary for a robust estimation was determined for the two distributions used. Factors which influence these distributions were also analyzed: the number of minutiae included in the configuration and the configuration as such for both distributions, as well as the finger number and the general pattern for between-variability, and the orientation of the minutiae for within-variability. In the present study, the only factor for which no influence has been shown is the orientation of the minutiae. The results show that the likelihood ratios resulting from the use of the scores of an AFIS can be used for evaluation. Relatively low rates of likelihood ratios supporting the hypothesis known to be false have been obtained. The maximum rate of likelihood ratios supporting the hypothesis that the two impressions were left by the same finger when the impressions actually came from different fingers is 5.2%, for a configuration of 6 minutiae. When a 7th and then an 8th minutia are added, this rate lowers to 3.2%, then to 0.8%. In parallel, for these same configurations, the likelihood ratios obtained are on average of the order of 100, 1000, and 10000 for 6, 7 and 8 minutiae when the two impressions come from the same finger.
These likelihood ratios can therefore be an important aid for decision making. Both positive evolutions linked to the addition of minutiae (a drop in the rate of likelihood ratios that could lead to an erroneous decision, and an increase in the value of the likelihood ratio) were observed systematically within the framework of the study. Approximations based on 3 scores for within-variability and on 10 scores for between-variability were found and showed satisfactory results.
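A minimal sketch of the score-based likelihood-ratio computation described in the abstract above, estimating the within-finger and between-finger score densities with Gaussian kernel density estimates; the synthetic scores, bandwidth defaults and use of SciPy are assumptions, not the thesis' actual models.

import numpy as np
from scipy.stats import gaussian_kde

def score_based_lr(case_score, within_scores, between_scores):
    """Likelihood ratio: density of the case score under the within-finger
    score distribution divided by its density under the between-finger one."""
    f_within = gaussian_kde(within_scores)
    f_between = gaussian_kde(between_scores)
    return f_within(case_score)[0] / f_between(case_score)[0]

# toy usage with synthetic, overlapping score distributions (illustrative only)
rng = np.random.default_rng(0)
within = rng.normal(70, 15, 500)      # scores from same-finger comparisons
between = rng.normal(30, 10, 5000)    # scores against a background database
print(score_based_lr(55.0, within, between))   # LR roughly of order 10 here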
Abstract:
Whole-body (WB) planar imaging has long been one of the staple methods of dosimetry, and its quantification has been formalized by the MIRD Committee in Pamphlet No. 16. One of the issues not specifically addressed in the formalism occurs when the count rates reaching the detector are sufficiently high to result in camera count saturation. Camera dead-time effects have been extensively studied, but all of the developed correction methods assume static acquisitions. However, during WB planar (sweep) imaging, a variable amount of imaged activity exists in the detector's field of view as a function of time, and therefore the camera saturation is time dependent. A new time-dependent algorithm was developed to correct for dead-time effects during WB planar acquisitions that accounts for relative motion between the detector heads and the imaged object. Static camera dead-time parameters were acquired by imaging decaying activity in a phantom and obtaining a saturation curve. Using these parameters, an iterative algorithm akin to Newton's method was developed, which takes into account the variable count rate seen by the detector as a function of time. The algorithm was tested on simulated data as well as on a whole-body scan of high-activity Samarium-153 in an ellipsoid phantom. A complete set of parameters from unsaturated phantom data necessary for count-rate-to-activity conversion was also obtained, including build-up and attenuation coefficients, in order to convert corrected count rate values to activity. The algorithm proved successful in accounting for motion- and time-dependent saturation effects in both the simulated and measured data and converged to any desired degree of precision. The clearance half-life calculated from the ellipsoid phantom data was 45.1 h after dead-time correction and 51.4 h with no correction; the physical decay half-life of Samarium-153 is 46.3 h. Accurate WB planar dosimetry of high activities relies on successfully compensating for camera saturation in a way that takes into account the variable activity in the field of view, i.e. time-dependent dead-time effects. The algorithm presented here accomplishes this task.
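A minimal sketch of a Newton-type dead-time correction applied bin by bin along a sweep, assuming a paralyzable dead-time model m = n·exp(-n·τ); the model choice, parameter values and function names are assumptions and do not reproduce the published algorithm.

import numpy as np

def true_rate_paralyzable(measured_rate, tau, n_iter=50):
    """Recover the true count rate n from the measured rate m under the
    paralyzable model m = n * exp(-n * tau), using Newton's method."""
    n = measured_rate                      # starting guess: no saturation
    for _ in range(n_iter):
        f = n * np.exp(-n * tau) - measured_rate
        df = (1.0 - n * tau) * np.exp(-n * tau)
        n = n - f / df
    return n

def correct_sweep(measured_rates, tau):
    """Apply the correction bin by bin along the sweep, so the saturation
    correction follows the activity actually in the field of view."""
    return np.array([true_rate_paralyzable(m, tau) for m in measured_rates])

# toy usage: tau = 5 microseconds; measured rates (cps) stay below the model's
# saturation limit 1/(e*tau) so a physical solution exists
rates = np.array([2.0e4, 5.0e4, 6.5e4])
print(correct_sweep(rates, tau=5e-6))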
Abstract:
A new technique for the fixation of Biomphalaria glabrata for histologic studies is described. It consists of making several holes in the shell from the outside before placing the entire snail into the fixative. It is a very practical and quick procedure that showed excellent results when compared to the usual techniques.
Abstract:
The aim of this study was to investigate the correlation of the mycobacteria growth indicator tube (MGIT) and E-test methods with the proportion method for Mycobacterium tuberculosis. Forty clinical isolates were tested. MGIT and E-test results with the first-line antituberculous drugs correlated with the proportion method. Our results suggest that the MGIT and E-test methods can be used routinely instead of the proportion method.
Abstract:
In this paper the iterative MSFV method is extended to include the sequential implicit simulation of time-dependent problems involving the solution of a system of pressure-saturation equations. To control numerical errors in the simulation results, an error estimate based on the residual of the MSFV approximate pressure field is introduced. In the initial time steps of the simulation, iterations are employed until a specified accuracy in pressure is achieved. This initial solution is then used to improve the localization assumption at later time steps. Additional iterations in the pressure solution are employed only when the pressure residual becomes larger than a specified threshold value. The efficiency of the strategy and the error-control criteria are investigated numerically. This paper also shows that it is possible to derive an a priori estimate and control, based on the allowed pressure-equation residual, to guarantee the desired accuracy in the saturation calculation.
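A minimal sketch of the residual-based error control described above: extra pressure iterations are performed only while the relative residual exceeds a threshold, and the resulting field is reused as the initial guess with a tighter tolerance later. The toy system and the damped-Jacobi sweep standing in for an MSFV iteration are assumptions.

import numpy as np
from scipy.sparse import diags

def residual_controlled_solve(A, b, p0, smoother, tol, max_iter=50):
    """Refine an approximate pressure field only while the relative residual
    of the pressure equation exceeds the prescribed threshold."""
    p = p0.copy()
    b_norm = np.linalg.norm(b)
    for _ in range(max_iter):
        r = b - A @ p
        if np.linalg.norm(r) / b_norm <= tol:
            break                       # accurate enough: stop iterating
        p = smoother(A, b, p)           # one extra sweep (stand-in for an MSFV iteration)
    return p

def jacobi_sweep(A, b, p):
    """A single damped-Jacobi sweep used here as a placeholder smoother."""
    return p + 0.8 * (b - A @ p) / A.diagonal()

# toy usage: diagonally dominant system; loose threshold first, tight one later
n = 100
A = diags([-1, 4, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
p = residual_controlled_solve(A, b, np.zeros(n), jacobi_sweep, tol=1e-2)
p = residual_controlled_solve(A, b, p, jacobi_sweep, tol=1e-4)   # reuse as initial guess
print(np.linalg.norm(b - A @ p) / np.linalg.norm(b))             # below 1e-4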
Abstract:
The two main alternative methods used to identify key sectors within the input-output approach, the Classical Multiplier Method (CMM) and the Hypothetical Extraction Method (HEM), are formally and empirically compared in this paper. Our findings indicate that the main distinction between the two approaches stems from the role of the internal effects. These internal effects are quantified under the CMM, while under the HEM only external impacts are considered. In our comparison we find, however, that CMM backward measures are more influenced by within-block effects than the forward indices proposed under this approach. The conclusions of this comparison allow us to develop a hybrid proposal that combines the two existing approaches. This hybrid model has the advantage of making it possible to distinguish and disaggregate external effects from those that are purely internal. This proposal is also of interest in terms of policy implications. Indeed, the hybrid approach may provide useful information for the design of "second best" stimulus policies that aim at a more balanced perspective between overall economy-wide impacts and their sectoral distribution.
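A minimal sketch contrasting the two measures on a toy three-sector economy: CMM backward linkages computed as column sums of the Leontief inverse, and a complete-extraction HEM variant measuring the output lost when one sector's intermediate transactions are removed. The coefficient matrix, final-demand vector and the particular extraction variant are assumptions, not the paper's data or hybrid proposal.

import numpy as np

def backward_linkages(A):
    """Classical Multiplier Method: column sums of the Leontief inverse
    (I - A)^-1, i.e. total output required per unit of final demand in each sector."""
    L = np.linalg.inv(np.eye(A.shape[0]) - A)
    return L.sum(axis=0)

def hypothetical_extraction_loss(A, f, sector):
    """Hypothetical Extraction Method (complete extraction): drop in total
    output when one sector's intermediate row and column are set to zero."""
    x_full = np.linalg.inv(np.eye(A.shape[0]) - A) @ f
    A_ext = A.copy()
    A_ext[sector, :] = 0.0
    A_ext[:, sector] = 0.0
    x_ext = np.linalg.inv(np.eye(A.shape[0]) - A_ext) @ f
    return x_full.sum() - x_ext.sum()

# toy three-sector technical-coefficient matrix and final-demand vector
A = np.array([[0.10, 0.30, 0.05],
              [0.20, 0.10, 0.25],
              [0.05, 0.15, 0.10]])
f = np.array([100.0, 150.0, 80.0])
print(backward_linkages(A))                                        # CMM ranking
print([hypothetical_extraction_loss(A, f, s) for s in range(3)])   # HEM ranking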