49 results for Patient-generated outcome measures
in Cambridge University Engineering Department Publications Database
Abstract:
BACKGROUND: After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost-effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. OUTCOME MEASURES: The quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of risk controls generated. RESULTS: Use of the GO-ARC technique was associated with improvement on all measures. CONCLUSIONS: While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls.
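The four outcome measures lend themselves to a simple scoring harness. Below is a hypothetical sketch of how a set of generated risk-control options might be scored on quantity, quality, variety, and novelty; the tier labels, weights, and example options are illustrative assumptions, not the study's actual instrument.

```python
# Hypothetical scoring of risk-control options on the four outcome measures.
# Tier labels and weights are assumed for illustration only.

# Three tiers of risk controls, strongest to weakest (assumed scheme).
TIER_WEIGHTS = {"design/engineering": 3, "procedural": 2, "administrative": 1}

def score_options(options):
    """Score a list of (description, tier, is_novel) tuples."""
    quantity = len(options)
    quality = (sum(TIER_WEIGHTS[tier] for _, tier, _ in options) / quantity
               if quantity else 0.0)                 # mean tier weight
    variety = len({tier for _, tier, _ in options})  # distinct tiers used
    novelty = sum(1 for _, _, novel in options if novel)
    return {"quantity": quantity, "quality": quality,
            "variety": variety, "novelty": novelty}

before = [("add a warning label", "administrative", False),
          ("retrain staff", "administrative", False)]
after = [("interlock on the device", "design/engineering", True),
         ("pre-use checklist", "procedural", False),
         ("add a warning label", "administrative", False)]

print(score_options(before))  # low quality: administrative controls only
print(score_options(after))   # higher quality, variety, and novelty
```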
Abstract:
An understanding of within-host dynamics of pathogen interactions with eukaryotic cells can shape the development of effective preventive measures and drug regimes. Such investigations have been hampered by the difficulty of identifying and observing directly, within live tissues, the multiple key variables that underlie infection processes. Fluorescence microscopy data on intracellular distributions of Salmonella enterica serovar Typhimurium (S. Typhimurium) show that, while the number of infected cells increases with time, the distribution of bacteria between cells is stationary (though highly skewed). Here, we report a simple model framework for the intensity of intracellular infection that links the quasi-stationary distribution of bacteria to bacterial and cellular demography. This enables us to reject the hypothesis that the skewed distribution is generated by intrinsic cellular heterogeneities, and to derive specific predictions on the within-cell dynamics of Salmonella division and host-cell lysis. For within-cell pathogens in general, we show that within-cell dynamics have implications across pathogen dynamics, evolution, and control, and we develop novel generic guidelines for the design of antibacterial combination therapies and the management of antibiotic resistance.
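A minimal simulation sketch illustrates the kind of framework described: bacteria divide within cells, heavily loaded cells are more likely to lyse, and released bacteria seed new infections, so the infected-cell count grows while the per-cell load distribution settles to a skewed quasi-stationary shape. All probabilities below are assumed for illustration and are not the paper's fitted parameters.

```python
# Toy within-cell birth/lysis simulation (illustrative, not the paper's model).
import random
from collections import Counter

random.seed(1)
DIVISION_P = 0.2   # per-bacterium division probability per step (assumed)
LYSIS_P = 0.02     # per-bacterium contribution to lysis risk per step (assumed)
STEPS = 40

cells = [1] * 50   # bacterial load of each infected cell

for _ in range(STEPS):
    next_cells = []
    for load in cells:
        # Within-cell division.
        load += sum(random.random() < DIVISION_P for _ in range(load))
        # Lysis risk grows with load; each released bacterium infects a new cell.
        if random.random() < 1.0 - (1.0 - LYSIS_P) ** load:
            next_cells.extend([1] * load)
        else:
            next_cells.append(load)
    cells = next_cells

# The infected-cell count grows with time, while the per-cell load
# distribution settles to a skewed quasi-stationary shape.
print("infected cells:", len(cells))
print("load histogram (lowest loads):", sorted(Counter(cells).items())[:15])
```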
Abstract:
We present a method of rapidly producing computer-generated holograms that exhibit geometric occlusion in the reconstructed image. Conceptually, a bundle of rays is shot from every hologram sample into the object volume. We use z-buffering to find the nearest intersecting object point for every ray and add its complex field contribution to the corresponding hologram sample. Each hologram sample belongs to an independent operation, allowing us to exploit the parallel computing capability of modern programmable graphics processing units (GPUs). Unlike algorithms that use points or planar segments as the basis for constructing the hologram, our algorithm's complexity is dependent on fixed system parameters, such as the number of ray-casting operations, and can therefore handle complicated models more efficiently. The finite number of hologram pixels is, in effect, a windowing function, and from analyzing the Wigner distribution function of the windowed free-space transfer function we find an upper limit on the cone angle of the ray bundle. Experimentally, we found that an angular sampling distance of 0.01° for a 2.66° cone angle produces acceptable reconstruction quality. © 2009 Optical Society of America.
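The per-sample ray-casting idea can be sketched directly: for each hologram sample, shoot a bundle of rays into the object volume, keep the nearest intersecting object point per ray (the z-buffer step), and accumulate its complex field contribution. The toy scene, wavelength, pixel pitch, hit criterion, and the coarsened angular step below are assumptions to keep the demo small; in the method described, each sample's loop would run as an independent GPU thread.

```python
# Toy per-sample ray-casting hologram computation (illustrative sketch).
import numpy as np

K = 2 * np.pi / 633e-9            # assumed illumination wavenumber (1/m)
PITCH = 8e-6                      # assumed hologram pixel pitch (m)
N = 32                            # N x N hologram samples (kept small)
CONE = np.deg2rad(2.66)           # cone angle from the abstract
DTHETA = np.deg2rad(0.5)          # coarsened step (the paper reports 0.01 deg)

# Toy scene: opaque points (x, y, z, amplitude); smaller z is nearer.
points = [(0.0, 0.0, 0.05, 1.0),
          (1e-4, 0.0, 0.08, 1.0)]

hologram = np.zeros((N, N), dtype=complex)
thetas = np.arange(-CONE, CONE + DTHETA / 2, DTHETA)

for iy in range(N):
    for ix in range(N):
        sx, sy = (ix - N / 2) * PITCH, (iy - N / 2) * PITCH
        for tx in thetas:
            for ty in thetas:
                best = None                       # z-buffer for this ray
                for px, py, pz, amp in points:
                    # Ray footprint at the point's depth; a hit means the
                    # point falls inside this ray's angular bin (assumed).
                    ex, ey = sx + pz * np.tan(tx), sy + pz * np.tan(ty)
                    if np.hypot(px - ex, py - ey) < pz * np.tan(DTHETA) / 2:
                        d = np.sqrt((px - sx) ** 2 + (py - sy) ** 2 + pz ** 2)
                        if best is None or d < best[0]:
                            best = (d, amp)       # keep the nearest hit only
                if best is not None:
                    d, amp = best                 # accumulate its complex field
                    hologram[iy, ix] += amp * np.exp(1j * K * d) / d

print("samples receiving a contribution:", np.count_nonzero(hologram))
```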
Abstract:
Computer-generated holography is an extremely demanding and complex task when it comes to providing realistic reconstructions with full parallax, occlusion, and shadowing. We present an algorithm designed for data-parallel computing on modern graphics processing units to alleviate the computational burden. We apply Gaussian interpolation to create a continuous surface representation from discrete input object points. The algorithm maintains a potential occluder list for each individual hologram-plane sample to keep the number of visibility tests to a minimum. We experimented with two approximations that simplify and accelerate occlusion computation. It is observed that letting several neighboring hologram-plane samples share visibility information on object points leads to significantly faster computation without causing noticeable artifacts in the reconstructed images. Computing a reduced sample set via nonuniform sampling is also found to be an effective acceleration technique. © 2009 Optical Society of America.
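The shared-visibility approximation can be illustrated as follows: hologram samples are grouped into tiles, the visibility of each object point is tested once at the tile centre, and the result is reused by every sample in the tile. The scene, tile size, occluder radius, and the crude visibility test below are assumptions for illustration, not the paper's occluder-list algorithm.

```python
# Toy shared-visibility occlusion for hologram computation (illustrative).
import numpy as np

N, TILE = 64, 8            # hologram size and tile size (assumed)
PITCH = 8e-6               # assumed pixel pitch (m)
RADIUS = 2e-4              # assumed effective radius of a surface patch (m)
K = 2 * np.pi / 633e-9     # assumed wavenumber (1/m)

# Object points (x, y, z, amplitude); smaller z is nearer the hologram.
pts = [(0.0, 0.0, 0.05, 1.0),
       (3e-4, 0.0, 0.08, 1.0)]   # the second point sits behind the first

def visible(px, py, pz, cx, cy):
    """Crude stand-in for an occluder-list test: the point is occluded if a
    nearer patch intersects the segment from (cx, cy, 0) to the point."""
    for qx, qy, qz, _ in pts:
        if qz < pz:
            t = qz / pz                                   # depth ratio along segment
            ix, iy = cx + (px - cx) * t, cy + (py - cy) * t
            if np.hypot(qx - ix, qy - iy) < RADIUS:
                return False
    return True

holo = np.zeros((N, N), dtype=complex)
tiles_seeing = [0] * len(pts)
for ty0 in range(0, N, TILE):
    for tx0 in range(0, N, TILE):
        cx = (tx0 + TILE / 2 - N / 2) * PITCH
        cy = (ty0 + TILE / 2 - N / 2) * PITCH
        # Visibility is evaluated once per tile and shared by all its samples.
        vis = [visible(px, py, pz, cx, cy) for px, py, pz, _ in pts]
        for j, v in enumerate(vis):
            tiles_seeing[j] += v
        for iy in range(ty0, ty0 + TILE):
            for ix in range(tx0, tx0 + TILE):
                sx, sy = (ix - N / 2) * PITCH, (iy - N / 2) * PITCH
                for (px, py, pz, amp), v in zip(pts, vis):
                    if v:
                        d = np.sqrt((px - sx) ** 2 + (py - sy) ** 2 + pz ** 2)
                        holo[iy, ix] += amp * np.exp(1j * K * d) / d

print("tiles (of", (N // TILE) ** 2, ") that see each point:", tiles_seeing)
```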
Abstract:
Road damage due to heavy vehicles is thought to depend on the extent to which lorries in normal traffic apply peak forces to the same locations along the road. A validated vehicle model is used to simulate 37 leaf-sprung articulated vehicles with parametric variations typical of vehicles in one weight class in the highway vehicle fleet. The spatial distribution of tyre forces generated by each vehicle is compared with the distribution generated by a reference vehicle, and the conditions are established for which repeated heavy loading occurs at specific points along the road. It is estimated that approximately two-thirds of vehicles in this class (a large proportion of all heavy vehicles) may contribute to a repeated pattern of road loading. It is concluded that dynamic tyre forces are a significant factor influencing road damage, compared to other factors such as tyre configuration and axle spacing.
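The notion of repeated loading at fixed road locations (spatial repeatability) can be illustrated with a toy fleet: several quarter-car models with varied parameters traverse the same road profile, and each vehicle's tyre-force history is correlated, point by point along the road, with that of a reference vehicle. All parameter values below are assumptions for illustration, not the paper's validated vehicle models.

```python
# Toy spatial-repeatability check for a fleet of quarter-car models.
import numpy as np

rng = np.random.default_rng(0)
dx, n = 0.1, 2000                           # 0.1 m spacing, 200 m of road
road = np.cumsum(rng.normal(0.0, 1e-3, n))  # crude random-walk road profile (m)

def tyre_force(ms, ks, cs, kt, mu=500.0, v=20.0):
    """Tyre-force history of a linear quarter-car driven over `road` at v m/s.
    All parameters are assumed values, not a validated vehicle model."""
    dt = dx / v
    zs = zsd = zu = zud = 0.0               # sprung/unsprung displacement, velocity
    f = np.empty(n)
    for i in range(n):
        fs = ks * (zu - zs) + cs * (zud - zsd)  # suspension force on sprung mass
        ft = kt * (road[i] - zu)                # tyre spring force (static load omitted)
        zsd += fs / ms * dt                     # semi-implicit Euler update
        zs += zsd * dt
        zud += (ft - fs) / mu * dt
        zu += zud * dt
        f[i] = ft
    return f

ref = tyre_force(ms=9000, ks=4e5, cs=2e4, kt=2e6)  # reference vehicle
for trial in range(5):
    ms = 9000 * rng.uniform(0.8, 1.2)       # sprung-mass variation across the fleet
    f = tyre_force(ms, 4e5, 2e4, 2e6)
    r = np.corrcoef(ref, f)[0, 1]           # point-by-point spatial correlation
    print(f"vehicle {trial}: correlation of force pattern with reference = {r:.2f}")
```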