921 results for "Standard method"
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Radiofrequency neurotomy is a recognized treatment for cervical zygapophysial joint pain. In several studies, the method has provided complete pain relief in 60-70% of the patients for approximately 9 months. The validated technique has the disadvantage of procedural times of 2-4 hours because several lesions are performed to take into account the variable nerve course. We tested the hypothesis that ultrasound localization of the nerves would enable us to reduce the number of lesions performed, while reaching the benchmark of at least 80% pain relief in 80% of patients with a median duration of 35 weeks, as achieved by a previous investigation using the standard method.
Abstract:
Objective Interruptions are known to have a negative impact on activity performance. Understanding how an interruption contributes to human error is limited because there is not a standard method for analyzing and classifying interruptions. Qualitative data are typically analyzed by either a deductive or an inductive method. Both methods have limitations. In this paper a hybrid method was developed that integrates deductive and inductive methods for the categorization of activities and interruptions recorded during an ethnographic study of physicians and registered nurses in a Level One Trauma Center. Understanding the effects of interruptions is important for designing and evaluating informatics tools in particular and for improving healthcare quality and patient safety in general. Method The hybrid method was developed using a deductive a priori classification framework with the provision of adding new categories discovered inductively in the data. The inductive process utilized line-by-line coding and constant comparison as stated in Grounded Theory. Results The categories of activities and interruptions were organized into a three-tiered hierarchy of activity. Validity and reliability of the categories were tested by categorizing a medical error case external to the study. No new categories of interruptions were identified during analysis of the medical error case. Conclusions Findings from this study provide evidence that the hybrid model of categorization is more complete than either a deductive or an inductive method alone. The hybrid method developed in this study provides the methodical support for understanding, analyzing, and managing interruptions and workflow.
Abstract:
Five test runs were performed to assess possible bias when performing the loss on ignition (LOI) method to estimate organic matter and carbonate content of lake sediments. An accurate and stable weight loss was achieved after 2 h of burning pure CaCO3 at 950 °C, whereas LOI of pure graphite at 530 °C showed a direct relation to sample size and exposure time, with only 40-70% of the possible weight loss reached after 2 h of exposure and smaller samples losing weight faster than larger ones. Experiments with a standardised lake sediment revealed a strong initial weight loss at 550 °C, but samples continued to lose weight at a slow rate at exposure of up to 64 h, which was likely the effect of loss of volatile salts, structural water of clay minerals or metal oxides, or of inorganic carbon after the initial burning of organic matter. A further test-run revealed that at 550 °C samples in the centre of the furnace lost more weight than marginal samples. At 950 °C this pattern was still apparent but the differences became negligible. Again, LOI was dependent on sample size. An analytical LOI quality control experiment including ten different laboratories was carried out using each laboratory's own LOI procedure as well as a standardised LOI procedure to analyse three different sediments. The range of LOI values between laboratories measured at 550 °C was generally larger when each laboratory used its own method than when using the standard method. This was similar for 950 °C, although the range of values tended to be smaller. The within-laboratory range of LOI measurements for a given sediment was generally small. Comparisons of the results of the individual and the standardised method suggest that there is a laboratory-specific pattern in the results, probably due to differences in laboratory equipment and/or handling that could not be eliminated by standardising the LOI procedure. 
Factors such as sample size, exposure time, position of samples in the furnace, and the measuring laboratory affected LOI results, with LOI at 550 °C being more susceptible to these factors than LOI at 950 °C. We therefore recommend that analysts be consistent in the LOI method used with respect to ignition temperatures, exposure times, and sample size, and that they include information on these three parameters when referring to the method.
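The weight-loss arithmetic behind the LOI method can be sketched in a few lines (a minimal illustration of the standard two-step formulas; function names, variable names, and the example weights are illustrative, not taken from the study):

```python
# Sketch of the standard loss-on-ignition (LOI) calculations used to
# estimate organic matter (550 °C step) and carbonate content (950 °C step).
# Weights are in grams after drying at the stated temperature.

def loi_550(dw_105, dw_550):
    """Percent weight loss at 550 °C, taken as a proxy for organic matter."""
    return 100.0 * (dw_105 - dw_550) / dw_105

def loi_950(dw_105, dw_550, dw_950):
    """Percent weight loss between 550 °C and 950 °C (CO2 driven off carbonates)."""
    return 100.0 * (dw_550 - dw_950) / dw_105

def carbonate_content(dw_105, dw_550, dw_950):
    """Approximate CaCO3 content: CO2 lost at 950 °C scaled by the
    molar-mass ratio CaCO3/CO2 (100/44)."""
    return loi_950(dw_105, dw_550, dw_950) * (100.0 / 44.0)

# Example: 1.000 g dry sample, 0.850 g after 550 °C, 0.806 g after 950 °C
print(round(loi_550(1.000, 0.850), 1))          # 15.0 (% organic matter)
print(round(loi_950(1.000, 0.850, 0.806), 1))   # 4.4 (% CO2 loss)
```

Because each step is a ratio of weight differences, the sample-size and exposure-time effects reported above enter directly through the measured post-ignition weights.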
Abstract:
The Phase I clinical trial is considered the "first in human" study in medical research to examine the toxicity of a new agent. It determines the maximum tolerable dose (MTD) of a new agent, i.e., the highest dose at which toxicity is still acceptable. Several phase I clinical trial designs have been proposed in the past 30 years. The well-known standard method, the so-called 3+3 design, is widely accepted by clinicians since it is the easiest to implement and does not require statistical calculation. The continual reassessment method (CRM), a design that uses Bayesian methods, has been rising in popularity over the last two decades. Several variants of the CRM design have also been suggested in the statistical literature. Rolling six is a newer method introduced in pediatric oncology in 2008, which claims to shorten trial duration compared to the 3+3 design. The goal of the present research was to simulate clinical trials and compare these phase I clinical trial designs. The patient population was created by the discrete event simulation (DES) method. The characteristics of the patients were generated from several distributions with parameters derived from a review of historical phase I clinical trial data. Patients were then selected and enrolled in clinical trials, each of which used the 3+3 design, the rolling six, or the CRM design. Five scenarios of dose-toxicity relationship were used to compare the performance of the phase I clinical trial designs. One thousand trials were simulated per design per dose-toxicity scenario. The results showed the rolling six design was not superior to the 3+3 design in terms of trial duration: the time to trial completion was comparable between the two. However, both shortened the duration compared to the two CRM designs. Both CRMs were superior to the 3+3 design and the rolling six in accuracy of MTD estimation.
The 3+3 design and rolling six tended to assign more patients to undesired lower dose levels. The toxicities were slightly greater in the CRMs.
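The 3+3 escalation rule compared above can be sketched as a short simulation (a simplified sketch under assumed per-dose toxicity probabilities; it does not reproduce the study's DES patient model or its five dose-toxicity scenarios):

```python
import random

def simulate_3plus3(tox_probs, seed=None):
    """Simplified 3+3 dose escalation: enroll cohorts of 3; escalate on
    0/3 dose-limiting toxicities (DLTs); expand to 6 patients on 1/3;
    stop on >=2 DLTs and declare the previous dose the MTD. Returns the
    MTD index, or -1 if even the lowest dose is too toxic."""
    rng = random.Random(seed)
    dose = 0
    while dose < len(tox_probs):
        dlt = sum(rng.random() < tox_probs[dose] for _ in range(3))
        if dlt == 1:  # ambiguous cohort: expand to 6 patients
            dlt += sum(rng.random() < tox_probs[dose] for _ in range(3))
        if dlt >= 2:              # too toxic: MTD is the dose below
            return dose - 1
        dose += 1                 # 0/3 or 1/6 DLTs: escalate
    return len(tox_probs) - 1     # all dose levels cleared

# Toxicity certain at dose level 2: the rule settles on level 1 as the MTD
print(simulate_3plus3([0.0, 0.0, 1.0], seed=1))  # 1
```

Running many such simulated trials per scenario and recording the returned MTD and patient counts is the kind of comparison the study performs, with the CRM and rolling six requiring their own (more involved) decision rules.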
Abstract:
Cross-contamination between cell lines is a longstanding and frequent cause of scientific misrepresentation. Estimates from national testing services indicate that up to 36% of cell lines are of a different origin or species to that claimed. To test a standard method of cell line authentication, 253 human cell lines from banks and research institutes worldwide were analyzed by short tandem repeat profiling. The short tandem repeat profile is a simple numerical code that is reproducible between laboratories, is inexpensive, and can provide an international reference standard for every cell line. If DNA profiling of cell lines is accepted and demanded internationally, scientific misrepresentation because of cross-contamination can be largely eliminated.
Abstract:
There are a large number of image processing applications with different performance requirements and available resources. Recent advances in image compression focus on reducing image size and processing time, but offer no real-time solution for trading off time and quality in the resulting image, as needed, for example, when transmitting the image contents of web pages. In this paper we propose a method for encoding still images, based on the JPEG standard, that allows the compression/decompression time cost and image quality to be adjusted to the needs of each application and to the bandwidth conditions of the network. The real-time control is based on a collection of adjustable parameters relating both to aspects of the implementation and to the hardware on which the algorithm is processed. The proposed encoding system is evaluated in terms of compression ratio, processing delay, and quality of the compressed image, compared with the standard method.
Abstract:
We apply the truncated Wigner method to the process of three-body recombination in ultracold Bose gases. We find that within the validity regime of the Wigner truncation for two-body scattering, three-body recombination can be treated using a set of coupled stochastic differential equations that include diffusion terms, and can be simulated using known numerical methods. As an example we investigate the behavior of a simple homogeneous Bose gas, finding a very slight increase of the loss rate compared to that obtained by using the standard method.
Abstract:
Absolute abundances (concentrations) of dinoflagellate cysts are often determined through the addition of Lycopodium clavatum marker-grains as a spike to a sample before palynological processing. An inter-laboratory calibration exercise was set up in order to test the comparability of results obtained in different laboratories, each using its own preparation method. Each of the 23 laboratories received the same amount of homogenized splits of four Quaternary sediment samples. The samples originated from different localities and consisted of a variety of lithologies. Dinoflagellate cysts were extracted and counted, and relative and absolute abundances were calculated. The relative abundances proved to be fairly reproducible, notwithstanding a need for taxonomic calibration. By contrast, excessive loss of Lycopodium spores during sample preparation resulted in non-reproducibility of absolute abundances. Use of oxidation, KOH, warm acids, acetolysis, mesh sizes larger than 15 µm and long ultrasonication (> 1 min) must be avoided to determine reproducible absolute abundances. The results of this work therefore indicate that the dinoflagellate cyst worker should choose between using the proposed standard method, which circumvents these critical steps; adding Lycopodium tablets at the end of the preparation; or using an alternative method.
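The marker-grain arithmetic behind these absolute abundances is the standard spike calculation (a minimal sketch; the function name and example counts are illustrative, not from the exercise):

```python
def cysts_per_gram(cysts_counted, lyco_counted, lyco_added, dry_weight_g):
    """Absolute dinoflagellate-cyst abundance from a Lycopodium spike:
    concentration = cysts counted * (marker grains added / marker grains
    counted) / sample dry weight. Any loss of Lycopodium spores during
    processing shrinks lyco_counted and so inflates this estimate, which
    is why marker loss makes absolute abundances non-reproducible."""
    return cysts_counted * lyco_added / lyco_counted / dry_weight_g

# 200 cysts and 500 of 10,000 added Lycopodium spores counted in a 2 g sample
print(cysts_per_gram(200, 500, 10_000, 2.0))  # 2000.0 cysts/g
```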
Abstract:
Several protocols for isolation of mycobacteria from water exist, but there is no established standard method. This study compared methods of processing potable water samples for the isolation of Mycobacterium avium and Mycobacterium intracellulare using spiked sterilized water and tap water decontaminated using 0.005% cetylpyridinium chloride (CPC). Samples were concentrated by centrifugation or filtration and inoculated onto Middlebrook 7H10 and 7H11 plates and Lowenstein-Jensen slants and into mycobacterial growth indicator tubes with or without polymyxin, azlocillin, nalidixic acid, trimethoprim, and amphotericin B. The solid media were incubated at 32°C, at 35°C, and at 35°C with CO2 and read weekly. The results suggest that filtration of water for the isolation of mycobacteria is a more sensitive method for concentration than centrifugation. The addition of sodium thiosulfate may not be necessary and may reduce the yield. Middlebrook 7H10 and 7H11 were equally sensitive culture media. CPC decontamination, while effective for reducing growth of contaminants, also significantly reduces mycobacterial numbers. There was no difference at 3 weeks between the different incubation temperatures.
Abstract:
Purpose. The objective of this study was to explore the discriminative capacity of non-contact corneal esthesiometry (NCCE) compared with the neuropathy disability score (NDS), a validated, standard method of diagnosing clinically significant diabetic neuropathy. Methods. Eighty-one participants with type 2 diabetes, no history of ocular disease, trauma, or surgery, and no history of systemic disease that may affect the cornea were enrolled. Participants were ineligible if there was a history of neuropathy due to a non-diabetic cause or a current diabetic foot ulcer or infection. The corneal sensitivity threshold was measured on the eye on the same side as the dominant hand, at a distance of 10 mm from the center of the cornea, using a stimulus duration of 0.9 s. The NDS was measured, producing a score ranging from 0 to 10. To determine the optimal cutoff point of corneal sensitivity that identified the presence of neuropathy (diagnosed by NDS), the Youden index and "closest-to-(0,1)" criteria were used. Results. The receiver-operator characteristic curve for NCCE for the presence of neuropathy (NDS ≥3) had an area under the curve of 0.73 (p = 0.001) and, for the presence of moderate neuropathy (NDS ≥6), an area of 0.71 (p = 0.003). Using the Youden index for NDS ≥3, the sensitivity of NCCE was 70% and the specificity 75%, and a corneal sensitivity threshold of 0.66 mbar or higher indicated the presence of neuropathy. When NDS ≥6 (indicating risk of foot ulceration) was applied, the sensitivity was 52% with a specificity of 85%. Conclusions. NCCE is a sensitive test for the diagnosis of minimal and more advanced diabetic neuropathy and may serve as a useful surrogate marker for diabetic and perhaps other neuropathies.
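The two cutoff-selection criteria named in the Methods follow directly from a list of (threshold, sensitivity, specificity) points (the formulas below are the standard definitions; the ROC points are hypothetical, chosen only to illustrate the calculation):

```python
import math

def youden_cutoff(points):
    """points: list of (cutoff, sensitivity, specificity) tuples.
    Youden index J = sensitivity + specificity - 1; return the cutoff
    that maximizes J."""
    return max(points, key=lambda p: p[1] + p[2] - 1)[0]

def closest_to_01_cutoff(points):
    """Return the cutoff whose ROC point (1 - specificity, sensitivity)
    lies closest to the ideal corner (0, 1)."""
    return min(points, key=lambda p: math.hypot(1 - p[2], 1 - p[1]))[0]

# Hypothetical ROC points: (threshold in mbar, sensitivity, specificity)
roc = [(0.40, 0.95, 0.40), (0.66, 0.70, 0.75), (0.90, 0.45, 0.95)]
print(youden_cutoff(roc))         # 0.66
print(closest_to_01_cutoff(roc))  # 0.66
```

In this toy example both criteria pick the same threshold; on real data they can disagree, which is why studies often report both.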
Abstract:
The gold standard method for detecting chlamydial infection in domestic and wild animals is PCR, but the technique is not suited to testing animals in the field when a rapid diagnosis is frequently required. The objective of this study was to compare the results of a commercially available enzyme immunoassay test for Chlamydia against a quantitative Chlamydia pecorum-specific PCR performed on swabs collected from the conjunctival sac, nasal cavity and urogenital sinuses of naturally infected koalas (Phascolarctos cinereus). The level of agreement for positive results between the two assays was low (43.2%). The immunoassay detection cut-off was determined as approximately 400 C. pecorum copies, indicating that the test was sufficiently sensitive to be used for the rapid diagnosis of active chlamydial infections.
Abstract:
The method of lines is a standard method for advancing the solution of partial differential equations (PDEs) in time. In one sense, the method applies equally well to space-fractional PDEs as it does to integer-order PDEs. However, there is a significant challenge when solving space-fractional PDEs in this way, owing to the non-local nature of the fractional derivatives. Each equation in the resulting semi-discrete system involves contributions from every spatial node in the domain. This has important consequences for the efficiency of the numerical solver, especially when the system is large. First, the Jacobian matrix of the system is dense, and hence methods that avoid the need to form and factorise this matrix are preferred. Second, since the cost of evaluating the discrete equations is high, it is essential to minimise the number of evaluations required to advance the solution in time. In this paper, we show that an effective preconditioner is essential for improving the efficiency of the method of lines for solving a quite general two-sided, nonlinear space-fractional diffusion equation. A key contribution is to show how to construct suitable banded approximations to the system Jacobian for preconditioning purposes that permit high orders and large stepsizes to be used in the temporal integration, without requiring dense matrices to be formed. The results of numerical experiments are presented that demonstrate the effectiveness of this approach.
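The banded-preconditioning idea can be illustrated on a small dense system (a toy sketch, not the paper's construction: a diagonally dominant matrix with algebraically decaying off-diagonals stands in for the dense space-fractional Jacobian, and only a few central diagonals are kept as the preconditioner):

```python
import numpy as np

def banded_approximation(a, bandwidth):
    """Zero every entry of `a` more than `bandwidth` diagonals from the
    main diagonal, giving a cheap banded stand-in for a dense Jacobian."""
    n = a.shape[0]
    i, j = np.indices((n, n))
    return np.where(np.abs(i - j) <= bandwidth, a, 0.0)

# Dense test matrix loosely mimicking a fractional-diffusion Jacobian:
# strong diagonal, slowly (algebraically) decaying off-diagonal entries.
n = 50
i, j = np.indices((n, n))
a = -1.0 / (1.0 + np.abs(i - j)) ** 1.5
np.fill_diagonal(a, 4.0)

m = banded_approximation(a, bandwidth=3)

# Preconditioning by the banded part should cluster the spectrum and
# reduce the condition number, so a Krylov method (e.g. GMRES) needs
# fewer of the expensive dense matrix-vector products per step.
precond = np.linalg.solve(m, a)
print(np.linalg.cond(precond) < np.linalg.cond(a))
```

In practice the banded system would be factorised once per step (cheaply, since it is banded) and applied inside a matrix-free Krylov solver, rather than solved densely as in this sketch.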
Abstract:
Household air pollution (HAP), arising mainly from the combustion of solid and other polluting fuels, is responsible for a very substantial public health burden, most recently estimated as causing 3.5 million premature deaths in 2010. These patterns of household fuel use also have important negative impacts on safety, prospects for poverty reduction, and the environment, including climate change. Building on previous air quality guidelines, the WHO is developing new guidelines focused on household fuel combustion, covering cooking, heating and lighting; although global, their key focus is low- and middle-income countries, reflecting the distribution of the disease burden. As discussed in this paper, the guidelines, currently in development, will include reviews of a wide range of evidence, including fuel use in homes, emissions from stoves and lighting, household air pollution and exposure levels experienced by populations, health risks, impacts of interventions on HAP and exposure, and key factors influencing sustainable and equitable adoption of improved stoves and cleaner fuels. GRADE, the standard method used for guidelines evidence review, may not be well suited to the variety and nature of evidence required for this project, and a modified approach is being developed and tested. Work on the guidelines is being carried out in close collaboration with the UN Foundation Global Alliance for Clean Cookstoves, allowing alignment with specific tools, including recently developed international voluntary standards for stoves, and the development of country action plans. Following publication, the WHO plans to work closely with a number of countries to learn from implementation efforts, in order to further strengthen support and guidance.
A case study of the situation and policy actions to date in Bhutan provides an illustration of the challenges and opportunities involved, and of the timely importance of the new guidelines and associated research, evaluation and policy development agendas.