Abstract:
Soil C decomposition is sensitive to changes in temperature, and even small increases in temperature may prompt large releases of C from soils. But much of what we know about soil C responses to global change is based on short-term incubation data and model output that implicitly assumes soil C pools are composed of organic matter fractions with uniform temperature sensitivities. In contrast, kinetic theory based on chemical reactions suggests that older, more-resistant C fractions may be more temperature sensitive. Recent research on the subject is inconclusive, indicating that the temperature sensitivity of labile soil organic matter (OM) decomposition could be greater than, less than, or equivalent to that of resistant soil OM. We incubated soils at constant temperature to deplete them of labile soil OM and then successively assessed the CO2-C efflux response to warming. We found that the decomposition response to experimental warming early in the incubation (when more labile C remained) was smaller than the response later, once labile C had been depleted. These results suggest that the temperature sensitivity of resistant soil OM pools is greater than that of labile soil OM and that global change-driven soil C losses may be greater than previously estimated.
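The abstract describes temperature sensitivity only qualitatively; in this literature the Q10 coefficient is a common metric, and the Python sketch below (with wholly hypothetical efflux values, not data from the study) illustrates how a larger warming response late in an incubation translates into a higher Q10.

# Minimal sketch: Q10 temperature sensitivity of soil CO2-C efflux.
# The Q10 metric and the efflux values below are illustrative assumptions,
# not results from the study.

def q10(rate_warm, rate_cool, t_warm, t_cool):
    """Q10 = (R_warm / R_cool) ** (10 / (T_warm - T_cool))."""
    return (rate_warm / rate_cool) ** (10.0 / (t_warm - t_cool))

# Hypothetical efflux (ug CO2-C per g soil per day) at 15 C and 25 C,
# early in the incubation (labile C present) and late (labile C depleted).
early = q10(rate_warm=12.0, rate_cool=6.5, t_warm=25.0, t_cool=15.0)
late = q10(rate_warm=3.0, rate_cool=1.4, t_warm=25.0, t_cool=15.0)

print(f"early-incubation Q10: {early:.2f}")   # smaller response to warming
print(f"late-incubation Q10:  {late:.2f}")    # larger response: resistant OM more sensitive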
Abstract:
Identifying crash “hotspots”, “blackspots”, “sites with promise”, or “high risk” locations is standard practice in departments of transportation throughout the US. The literature is replete with the development and discussion of statistical methods for hotspot identification (HSID). Theoretical derivations and empirical studies have been used to weigh the benefits of various HSID methods; however, few studies have used controlled experiments to systematically assess them. Using experimentally derived simulated data, which are argued to be superior to empirical data for this purpose, three hotspot identification methods observed in practice are evaluated: simple ranking, confidence interval, and Empirical Bayes. With simulated data, sites with promise are known a priori, in contrast to empirical data where high risk sites are not known with certainty. To conduct the evaluation, properties of observed crash data are used to generate simulated crash frequency distributions at hypothetical sites. A variety of factors are manipulated to simulate a host of ‘real world’ conditions. Various levels of confidence are explored, and false positives (identifying a safe site as high risk) and false negatives (identifying a high risk site as safe) are compared across methods. Finally, the effect of crash history duration on the three HSID approaches is assessed. The results illustrate that the Empirical Bayes technique significantly outperforms the ranking and confidence interval techniques (with certain caveats). As found by others, false positives and false negatives are inversely related. Three years of crash history appears, in general, to be an appropriate duration.
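As a rough illustration of the Empirical Bayes screening idea evaluated in the paper, the sketch below forms an EB-adjusted crash estimate as a weighted average of a site's observed count and a model prediction; the safety performance function form, overdispersion value, and site counts are assumptions for illustration, not the study's calibrated inputs.

# Minimal sketch of Empirical Bayes (EB) hotspot screening.
# The negative-binomial weighting, the safety performance function (SPF)
# predictions, and the numbers below are illustrative assumptions.

def eb_expected(observed, predicted, overdispersion, years):
    """EB estimate: weighted average of observed and SPF-predicted crashes."""
    weight = 1.0 / (1.0 + overdispersion * predicted * years)
    return weight * (predicted * years) + (1.0 - weight) * observed

# Hypothetical sites: (observed crashes over 3 years, SPF prediction per year)
sites = {"A": (14, 2.1), "B": (9, 3.0), "C": (4, 1.2)}

ranked = sorted(
    ((eb_expected(obs, pred, overdispersion=0.8, years=3), name)
     for name, (obs, pred) in sites.items()),
    reverse=True,
)
for eb, name in ranked:
    print(f"site {name}: EB-expected crashes over 3 years = {eb:.1f}")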
Abstract:
Diabetic peripheral neuropathy (DPN) is one of the most debilitating complications of diabetes and a major cause of foot ulceration and lower limb amputation. Early diagnosis and management are key to reducing morbidity and mortality. Current techniques for clinical assessment of DPN are relatively insensitive for detecting early disease or involve invasive procedures such as skin biopsies. There is a need for less painful, non-invasive and safe evaluation methods. Eye care professionals already play an important role in the management of diabetic retinopathy; however, recent studies have indicated that the eye may also be an important site for the diagnosis and monitoring of neuropathy. Corneal nerve morphology has been shown to be a promising marker of diabetic neuropathy occurring elsewhere in the body, and emerging evidence tentatively suggests that retinal anatomical markers and a range of functional visual indicators could similarly provide useful information regarding neural damage in diabetes, although this line of research is, as yet, less well established. This review outlines the growing body of evidence supporting a potential diagnostic role for retinal structural and visual functional markers in the diagnosis and monitoring of peripheral neuropathy in diabetes.
Abstract:
Modelling of water flow and associated deformation in unsaturated reactive soils (shrinking/swelling soils) is important in many applications. This paper presents a method to capture soil swelling deformation during water infiltration using Particle Image Velocimetry (PIV). The model soil material used is a commercially available bentonite. A swelling chamber was set up to determine the water content profile and the extent of soil swelling. The test was run for 61 days, during which the soil swelled, on average across its width, by about 26% of the height of the soil column. PIV analysis was able to quantify the swelling over the entire observed face of the soil box. The swelling was most pronounced in the top layers, where strains in most cases exceeded 100%.
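The abstract does not detail the PIV post-processing; the following sketch shows one plausible way vertical swelling strains and the column-average swelling could be derived from PIV-tracked displacements, using entirely hypothetical numbers rather than the study's measurements.

# Minimal sketch: vertical swelling strain from PIV displacement data.
# Patch positions and displacements are hypothetical; the study's actual
# post-processing is not described in the abstract.

import numpy as np

# Initial vertical positions (mm) of tracked patch centres up the column,
# and their vertical displacements (mm) after some elapsed time.
y0 = np.array([10.0, 30.0, 50.0, 70.0, 90.0])
dy = np.array([0.5, 2.0, 6.0, 14.0, 28.0])   # larger movement near the top

# Engineering strain of each layer between adjacent patches:
layer_height = np.diff(y0)
layer_strain = np.diff(dy) / layer_height     # d(displacement)/d(height)

for i, strain in enumerate(layer_strain):
    print(f"layer {i}: vertical strain = {strain * 100:.0f}%")

# Column-average swelling as a fraction of the initial column height:
print(f"average swelling: {dy[-1] / y0[-1] * 100:.0f}% of column height")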
Abstract:
Shaft-mounted gearboxes are widely used in industry. The torque arm that resists the reaction torque on the gearbox housing, if properly positioned, creates a reactive force that lifts the gearbox and unloads the bearings of the output shaft. The shortcoming of these torque arms is that, if the gearbox is reversed, the reactive force on the torque arm changes direction and, added to the weight of the gearbox, overloads the bearings and shortens their operating life. This paper describes a new patented torque arm design that develops a controlled lifting force and counteracts the weight of the gearbox regardless of the direction of output shaft rotation. Several mathematical models of the conventional and new torque arms were developed and verified experimentally on a specially built test rig that enables modelling of the radial compliance of the gearbox bearings and of the elastic elements of the torque arms. Comparison showed good agreement between theoretical and experimental results.
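A rough static-equilibrium sketch of the effect described above follows; the geometry, weight, and torque values are assumed for illustration only and are not taken from the paper or its patented design.

# Minimal sketch: static bearing load on a shaft-mounted gearbox with a
# conventional torque arm. All values are assumed for illustration.

def bearing_load(weight_n, output_torque_nm, arm_length_m, reversed_drive):
    """Net radial load on the output-shaft bearings.

    A properly positioned torque arm reacts the output torque with a force
    F = T / L. In forward drive this force lifts the gearbox; when the
    drive is reversed it points the other way and adds to the weight.
    """
    reaction = output_torque_nm / arm_length_m
    return weight_n + reaction if reversed_drive else weight_n - reaction

weight = 1200.0        # N, gearbox weight (assumed)
torque = 450.0         # N*m, reaction torque on the housing (assumed)
arm = 0.5              # m, torque-arm lever distance (assumed)

print("forward: ", bearing_load(weight, torque, arm, reversed_drive=False), "N")
print("reversed:", bearing_load(weight, torque, arm, reversed_drive=True), "N")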
Abstract:
Purpose: The aim of this study was to determine current approaches adopted by optometrists to the recording of corneal staining following fluorescein instillation. Methods: An anonymous ‘record-keeping task’ was sent to all 756 practitioners who are members of the Queensland Division of Optometrists Association Australia. This task comprised a form on which appeared a colour photograph depicting contact lens solution-induced corneal staining. Next to the photograph was an empty box, in which practitioners were asked to record their observations. Practitioners were also asked to indicate the level of severity of the condition at which treatment would be instigated. Results: Completed task forms were returned by 228 optometrists, representing a 30 per cent response rate. Ninety-two per cent of respondents offered a diagnosis. The most commonly used descriptive terms were ‘superficial punctate keratitis’ (36 per cent of respondents) and ‘punctate staining’ (29 per cent). The level of severity and location of corneal staining were noted by 69 and 68 per cent of respondents, respectively. A numerical grade was assigned by 44 per cent of respondents. Only three per cent nominated the grading scale used. The standard deviation of assigned grades was ±0.6. The condition was sketched by 35 per cent of respondents and two per cent stated that they would take a photograph of the eye. Ten per cent noted the eye in which the condition was being observed. Opinions of the level of severity at which treatment for corneal staining should be instigated varied considerably between practitioners, ranging from ‘any sign of corneal staining’ to ‘grade 4 staining’. Conclusion: Although most practitioners made a sensible note of the condition and properly recorded the location of corneal staining, serious deficiencies were evident regarding other aspects of record-keeping. Ongoing programs of professional optometric education should reinforce good practice in relation to clinical record-keeping.
Abstract:
Two-stroke outboard boat engines using total loss lubrication deposit a significant proportion of their lubricant and fuel directly into the water. The purpose of this work is to document the velocity and concentration field characteristics of a submerged swirling water jet emanating from a propeller, in order to provide information on its fundamental behaviour. The properties of the jet were examined far enough downstream to be relevant to the eventual modelling of the mixing problem. Measurements of the velocity and concentration fields were performed in a turbulent jet generated by a model boat propeller (0.02 m diameter) operating at 1500 rpm and 3000 rpm in a weak co-flow of 0.04 m/s. The measurements were carried out in the Zone of Established Flow, up to 50 propeller diameters downstream of the propeller, which was placed in a glass-walled flume 0.4 m wide with a free surface depth of 0.15 m. The jet and scalar plume development were compared to those of a classical free round jet. Results pertaining to radial distribution, self-similarity, standard deviation growth, maximum value decay and integral fluxes of velocity and concentration are presented and fitted with empirical correlations. Finally, propeller-induced mixing and the pollutant source concentration from a two-stroke engine are estimated.
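For context on the classical free round jet used as the comparison case, the sketch below evaluates the standard inverse-distance decay of centreline velocity and concentration; the decay constants and exit conditions are typical textbook values and the weak co-flow is neglected, so this is not the set of empirical correlations fitted in the study.

# Minimal sketch: classical free round jet centreline decay, used here only
# as the comparison baseline. Constants are typical textbook values, not the
# correlations fitted in this study; the weak co-flow is ignored.

def centreline_velocity(u_exit, d, x, decay_const=6.0):
    """U_c(x)/U_exit ~ decay_const * d / x in the zone of established flow."""
    return u_exit * decay_const * d / x

def centreline_concentration(c_source, d, x, decay_const=5.0):
    """C_c(x)/C_source ~ decay_const * d / x for the passive scalar."""
    return c_source * decay_const * d / x

d = 0.02          # m, propeller (jet) diameter
u_exit = 1.5      # m/s, assumed effective jet exit velocity
c_source = 1.0    # normalised source concentration

for diameters in (10, 20, 50):
    x = diameters * d
    print(f"x/d = {diameters:2d}: U_c = {centreline_velocity(u_exit, d, x):.3f} m/s, "
          f"C_c = {centreline_concentration(c_source, d, x):.3f}")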
Comparison of standard image segmentation methods for segmentation of brain tumors from 2D MR images
Abstract:
In the analysis of medical images for computer-aided diagnosis and therapy, segmentation is often required as a preliminary step. Medical image segmentation is a challenging task due to the complex nature of the images. The brain has a particularly complicated structure, and its precise segmentation is very important for detecting tumors, edema, and necrotic tissue in order to prescribe appropriate therapy. Magnetic Resonance Imaging is an important diagnostic imaging technique utilized for early detection of abnormal changes in tissues and organs. It possesses good contrast resolution for different tissues and is therefore preferred over Computerized Tomography for brain study, which is why the majority of research in medical image segmentation concerns MR images. At the core of this research, a set of MR images was segmented using standard image segmentation techniques to isolate a brain tumor from the other regions of the brain. The resulting images from the different segmentation techniques were then compared with one another and analyzed by professional radiologists to determine which technique is the most accurate. Experimental results show that Otsu’s thresholding is the most suitable method for segmenting a brain tumor from a Magnetic Resonance Image.
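A minimal sketch of Otsu's thresholding applied to a 2D MR slice is shown below (using scikit-image); the file name, the absence of preprocessing, and the largest-component heuristic are illustrative simplifications rather than the study's actual pipeline.

# Minimal sketch: Otsu's thresholding on a 2D MR slice with scikit-image.
# The input file and the lack of pre/post-processing are illustrative
# simplifications, not the study's pipeline.

import numpy as np
from skimage import io, filters, measure

slice_2d = io.imread("brain_mr_slice.png", as_gray=True)   # hypothetical input

threshold = filters.threshold_otsu(slice_2d)   # histogram-based global threshold
mask = slice_2d > threshold                    # bright (e.g. enhancing) region

# Keep the largest connected bright component as a crude tumor candidate.
labels = measure.label(mask)
if labels.max() > 0:
    sizes = np.bincount(labels.ravel())
    sizes[0] = 0                               # ignore background
    tumor_mask = labels == sizes.argmax()
    print(f"Otsu threshold: {threshold:.3f}, candidate area: {tumor_mask.sum()} px")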
Abstract:
In silico experimental modeling of cancer involves combining findings from the biological literature with computer-based models of biological systems in order to conduct investigations of hypotheses entirely in the computer laboratory. In this paper, we discuss the use of in silico modeling as a precursor to traditional clinical and laboratory research, allowing researchers to refine their experimental programs with the aim of reducing costs and increasing research efficiency. We explain the methodology of in silico experimental trials before providing an example of in silico modeling from the biomathematical literature, with a view to promoting more widespread use and understanding of this research strategy.
Abstract:
Experiments were undertaken to study the effect of initial conditions on the expansion ratio of two grains processed in a laboratory-scale, single-speed, single-screw extruder at Naresuan University, Thailand. Jasmine rice and mung bean were used as the materials. The grains were adjusted to three different initial moisture contents and classified into three groups according to particle size; the mesh sizes used were 12 and 14. Expansion ratio was measured at a constant barrel temperature of 190°C. Response surface methodology was used to determine the optimum combination of moisture content and particle size for the materials concerned.
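The abstract does not report the fitted model; the sketch below shows one common definition of the expansion ratio (extrudate diameter divided by die diameter) and how a second-order response surface could be fitted and searched for an optimum. All run data, the die diameter, and the particle-size coding are hypothetical.

# Minimal sketch: expansion ratio and a second-order response surface fit.
# The expansion-ratio definition used, the die diameter, the particle-size
# coding (1 = fine, 3 = coarse), and all run data are illustrative assumptions.

import numpy as np

def expansion_ratio(extrudate_diameter_mm, die_diameter_mm):
    return extrudate_diameter_mm / die_diameter_mm

# Hypothetical runs: (moisture %, particle-size group, extrudate diameter mm)
runs = [
    (14, 1, 8.6), (14, 2, 9.0), (14, 3, 9.3),
    (18, 1, 10.1), (18, 2, 10.8), (18, 3, 10.4),
    (22, 1, 8.9), (22, 2, 9.4), (22, 3, 8.8),
]
die = 3.0  # mm, assumed die diameter

m = np.array([r[0] for r in runs], dtype=float)
p = np.array([r[1] for r in runs], dtype=float)
y = np.array([expansion_ratio(r[2], die) for r in runs])

# Second-order response surface: y = b0 + b1*m + b2*p + b3*m*p + b4*m^2 + b5*p^2
X = np.column_stack([np.ones_like(m), m, p, m * p, m**2, p**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coeffs, 4))

# Locate the grid point with maximum predicted expansion ratio.
grid = [(mm, pp) for mm in np.linspace(14, 22, 17) for pp in np.linspace(1, 3, 9)]
preds = [float(np.dot([1, mm, pp, mm * pp, mm**2, pp**2], coeffs)) for mm, pp in grid]
best = grid[int(np.argmax(preds))]
print(f"predicted optimum: moisture ~ {best[0]:.1f}%, size group ~ {best[1]:.1f}")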