972 results for minimal occlusive volume technique
Abstract:
This paper uses dynamic impulse response analysis to investigate the interrelationships among stock price volatility, trading volume, and the leverage effect. Dynamic impulse response analysis is a technique for analyzing the multi-step-ahead characteristics of a nonparametric estimate of the one-step conditional density of a strictly stationary process. The technique generalizes Sims-style impulse response analysis for linear models to nonlinear processes. In this paper, we refine the technique and apply it to a long panel of daily observations on the price and trading volume of four stocks actively traded on the NYSE: Boeing, Coca-Cola, IBM, and MMM.
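As a stylised illustration of this impulse-response idea, the Python sketch below estimates the one-step conditional law by kernel-weighted resampling of observed transitions and takes the multi-step response as the difference between Monte Carlo forecast profiles started from a baseline and from a shocked state. The bandwidth, horizon, and nonlinear AR(1)-style toy series are illustrative assumptions, not the paper's estimator or data.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_step(x_now, x_t, x_next, bandwidth, rng):
    """Draw x_{t+1} | x_t = x_now from a kernel-smoothed one-step conditional density."""
    w = np.exp(-0.5 * ((x_t - x_now) / bandwidth) ** 2)
    j = rng.choice(len(x_next), p=w / w.sum())
    return x_next[j] + bandwidth * rng.standard_normal()   # smoothed bootstrap draw

def impulse_response(series, x0, shock, horizon=20, n_paths=500, bandwidth=0.3):
    x_t, x_next = series[:-1], series[1:]
    profiles = np.zeros((2, horizon))
    for k, start in enumerate((x0, x0 + shock)):            # baseline, then shocked state
        for _ in range(n_paths):
            x = start
            for h in range(horizon):
                x = simulate_step(x, x_t, x_next, bandwidth, rng)
                profiles[k, h] += x / n_paths                # Monte Carlo mean forecast profile
    return profiles[1] - profiles[0]                         # shocked minus baseline

# Toy data: a nonlinear AR(1)-type series standing in for a daily volatility proxy
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.9 * np.tanh(y[t - 1]) + 0.3 * rng.standard_normal()
print(impulse_response(y, x0=0.0, shock=1.0)[:5])
```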
Abstract:
Measuring the entorhinal cortex (ERC) is challenging because its lateral border is difficult to discriminate from the perirhinal cortex. In a sample of 39 nondemented older adults who completed volumetric image scans and verbal memory measures, we examined reliability and validity concerns for three ERC protocols with different lateral boundary guidelines (i.e., Goncharova, Dickerson, Stoub, & deToledo-Morrell, 2001; Honeycutt et al., 1998; Insausti et al., 1998). We used three novice raters to assess inter-rater reliability on a subset of scans (216 total ERCs), with the entire dataset measured by one rater with strong intra-rater reliability on each technique (234 total ERCs). We found moderate to strong inter-rater reliability for the two techniques with consistent ERC lateral boundary endpoints (Goncharova, Honeycutt), and negligible to moderate reliability for the technique requiring consideration of collateral sulcal depth (Insausti). Left ERC and story memory associations were moderate and positive for the two techniques designed to exclude the perirhinal cortex (Insausti, Goncharova), with the Insausti technique continuing to explain 10% of memory score variance after additionally controlling for depression symptom severity. Right ERC-story memory associations were nonexistent after excluding an outlier. Researchers are encouraged to consider the challenges of rater training for ERC techniques and how lateral boundary endpoints may impact structure-function associations.
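For readers unfamiliar with the reliability statistic involved, the short Python sketch below computes ICC(2,1) (two-way random effects, absolute agreement, single rater), a common choice for inter-rater reliability of volumetric tracings; the rater-by-ERC matrix is random illustrative data, not the study's measurements.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1) for a (n_targets, k_raters) matrix of measurements."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)                         # per-target means
    col_means = ratings.mean(axis=0)                         # per-rater means
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)     # between-target mean square
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)     # between-rater mean square
    sse = ((ratings - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))                          # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative example: 36 ERC volumes (mm^3) traced by 3 raters with small systematic offsets
rng = np.random.default_rng(3)
true_vol = rng.normal(1800, 250, size=(36, 1))
ratings = true_vol + rng.normal([0.0, 40.0, -30.0], 60.0, size=(36, 3))
print(round(icc_2_1(ratings), 3))
```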
Abstract:
Telecentric optical computed tomography (optical-CT) is a state-of-the-art method for visualizing and quantifying 3-dimensional dose distributions in radiochromic dosimeters. In this work a prototype telecentric system (DFOS, the Duke Fresnel Optical-CT Scanner) is evaluated which incorporates two substantial design changes: the use of Fresnel lenses (reducing lens costs from $10-30K to $1-3K) and the use of a 'solid tank' (which reduces noise, and reduces the volume of refractively matched fluid from 1 L to 10 cc). The efficacy of DFOS was evaluated by direct comparison against commissioned scanners in our lab. Measured dose distributions from all systems were compared against the predicted dose distributions from a commissioned treatment planning system (TPS). Three treatment plans were investigated: a simple four-field box treatment, a multiple small field delivery, and a complex IMRT treatment. Dosimeters were imaged within 2 h post irradiation, using consistent scanning techniques (360 projections acquired at 1 degree intervals, reconstruction at 2 mm). DFOS efficacy was evaluated through inspection of dose line-profiles, and 2D and 3D dose and gamma maps. DFOS/TPS gamma pass rates with 3%/3 mm dose-difference/distance-to-agreement criteria ranged from 89.3% to 92.2%, compared with 95.6% to 99.0% obtained with the commissioned system. The 3D gamma pass rate between the commissioned system and DFOS was 98.2%. Typical noise rates in DFOS reconstructions were up to 3%, compared to under 2% for the commissioned system. In conclusion, while the introduction of a solid tank proved advantageous with regards to cost and convenience, further work is required to improve the image quality and dose reconstruction accuracy of the new DFOS optical-CT system.
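The gamma pass rates quoted above combine a dose-difference and a distance-to-agreement test. As a minimal, hedged sketch of that comparison, the Python snippet below evaluates a 1-D global gamma index with 3%/3 mm criteria on illustrative Gaussian profiles; the profiles, names, and global-normalisation choice are assumptions, not the paper's data or exact implementation.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dose_crit=0.03, dta_crit=3.0):
    """Gamma value at each evaluated point (global normalisation to the reference maximum)."""
    d_max = d_ref.max()
    gammas = np.empty_like(d_eval)
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - xe) / dta_crit) ** 2               # squared distance term (mm)
        dose2 = ((d_ref - de) / (dose_crit * d_max)) ** 2    # squared dose-difference term
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# Illustrative profiles: a planned Gaussian dose and a slightly shifted, rescaled measurement
x = np.linspace(-50, 50, 201)                                # position in mm
tps = 100 * np.exp(-x**2 / (2 * 15**2))                      # planned dose
meas = 0.98 * 100 * np.exp(-(x - 1.0)**2 / (2 * 15**2))      # measured dose
g = gamma_1d(x, tps, x, meas)
print(f"gamma pass rate: {100 * (g <= 1).mean():.1f}%")
```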
Abstract:
CFD modelling of 'real-life' processes often requires solutions in complex three-dimensional geometries, which can result in meshes in which parts are badly distorted. Cell-centred finite volume methods, typical of most commercial CFD tools, are computationally efficient, but can lead to convergence problems on meshes that feature highly non-orthogonal cells. The vertex-based finite volume method handles distorted meshes with relative ease, but is computationally expensive. A combined vertex-based - cell-centred (VB-CC) technique, detailed in this paper, allows solutions on distorted meshes that defeat purely cell-centred methods, while still allowing cell-centred physical models to be employed in the solution of other transported quantities. The VB-CC method is validated against benchmark solutions for thermally driven flow and turbulent flow. An early application of this hybrid technique is to three-dimensional flow over an aircraft wing, although it is planned to use it in a wide variety of processing applications in the future.
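For context on the cell-centred discretisation mentioned above, the sketch below assembles a 1-D cell-centred finite volume system for steady diffusion of a scalar with a uniform source on a uniform, orthogonal mesh; it only illustrates the flux-balance bookkeeping, not the distorted-mesh or hybrid vertex-based coupling discussed in the paper, and all values are assumed.

```python
import numpy as np

n, length, k, S = 20, 1.0, 1.0, 5.0          # cells, domain length, conductivity, source
dx = length / n
T_left, T_right = 0.0, 1.0                   # Dirichlet boundary values

A = np.zeros((n, n))
b = np.full(n, S * dx)                       # source integrated over each cell volume
for i in range(n):
    aw = k / dx if i > 0 else 2 * k / dx     # west-face coefficient (half-cell spacing at boundary)
    ae = k / dx if i < n - 1 else 2 * k / dx # east-face coefficient
    A[i, i] = aw + ae
    if i > 0:
        A[i, i - 1] = -aw
    else:
        b[i] += aw * T_left                  # boundary flux folded into the source vector
    if i < n - 1:
        A[i, i + 1] = -ae
    else:
        b[i] += ae * T_right
T = np.linalg.solve(A, b)                    # cell-centre values of the transported scalar
print(T[:5])
```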
Abstract:
An unstructured cell-centred finite volume method for modelling viscoelastic flow is presented. The method is applied to the flow through a planar channel and the 4:1 planar contraction for creeping flow of an Oldroyd-B fluid. Results are presented for a range of Weissenberg numbers. In the case of the planar channel, results are compared with analytical solutions. For the 4:1 planar contraction benchmark problem, the convection terms in the constitutive equations are approximated using both first- and second-order differencing schemes to compare the techniques, and the effect of mesh refinement on the solution is investigated. This is the first time that a fully unstructured, cell-centred finite volume technique has been used to model the Oldroyd-B fluid for the test cases presented in this paper.
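As a pointer to the analytical solution mentioned for the planar channel, the snippet below evaluates the fully developed Oldroyd-B channel-flow profiles (parabolic velocity, with polymer shear and normal stresses following from the local shear rate); the fluid parameters and channel geometry are assumed for illustration only.

```python
import numpy as np

eta_s, eta_p, lam = 1.0 / 9.0, 8.0 / 9.0, 0.1   # solvent viscosity, polymer viscosity, relaxation time
h, dpdx = 1.0, -2.0                             # channel half-width, applied pressure gradient
y = np.linspace(-h, h, 101)

eta0 = eta_s + eta_p
u = -dpdx * (h**2 - y**2) / (2 * eta0)          # parabolic velocity profile
gamma_dot = np.gradient(u, y)                   # local shear rate du/dy
tau_xy = eta_p * gamma_dot                      # polymer shear stress
tau_xx = 2 * lam * eta_p * gamma_dot**2         # polymer streamwise normal stress (first normal stress difference)
print(u.max(), tau_xx.max())
```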
Abstract:
An aerodynamic sound source extraction from a general flow field is applied to a number of model problems and to a problem of engineering interest. The extraction technique is based on a variable decomposition, which results in an acoustic correction method, of each of the flow variables into a dominant flow component and a perturbation component. The dominant flow component is obtained with a general-purpose Computational Fluid Dynamics (CFD) code which uses a cell-centred finite volume method to solve the Reynolds-averaged Navier–Stokes equations. The perturbations are calculated from a set of acoustic perturbation equations with source terms extracted from unsteady CFD solutions at each time step via the use of a staggered dispersion-relation-preserving (DRP) finite-difference scheme. Numerical experiments include (1) propagation of a 1-D acoustic pulse without mean flow, (2) propagation of a 2-D acoustic pulse with/without mean flow, (3) reflection of an acoustic pulse from a flat plate with mean flow, and (4) flow-induced noise generated by an unsteady laminar flow past a 2-D cavity. The computational results demonstrate the accuracy of the source extraction technique for the model problems and illustrate its feasibility for more complex aeroacoustic problems.
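As a minimal stand-in for test case (1) (1-D acoustic pulse, no mean flow), the sketch below marches the linearised 1-D Euler equations with standard fourth-order central differences and RK4; this generic scheme is used only for illustration and is not the staggered DRP discretisation of the paper, and the grid, pulse, and time step are assumed.

```python
import numpy as np

nx, L, c = 401, 100.0, 1.0                       # grid points, domain length, sound speed
x = np.linspace(-L / 2, L / 2, nx)
dx = x[1] - x[0]
dt = 0.4 * dx / c

def ddx(f):
    """Fourth-order central difference with periodic wrap-around."""
    return (-np.roll(f, -2) + 8 * np.roll(f, -1) - 8 * np.roll(f, 1) + np.roll(f, 2)) / (12 * dx)

def rhs(state):
    u, p = state                                 # linearised equations: u_t = -p_x, p_t = -u_x
    return np.array([-c * ddx(p), -c * ddx(u)])

state = np.array([np.zeros(nx), np.exp(-np.log(2) * (x / 3.0) ** 2)])   # Gaussian pressure pulse
for _ in range(200):                             # classical RK4 time stepping
    k1 = rhs(state)
    k2 = rhs(state + 0.5 * dt * k1)
    k3 = rhs(state + 0.5 * dt * k2)
    k4 = rhs(state + dt * k3)
    state = state + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
print(state[1].max())                            # the pulse splits into two half-amplitude waves
```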
Abstract:
Quartz crystal impedance analysis has been developed as a technique to assess whether room-temperature ionic liquids are Newtonian fluids and as a small-volume method for determining the values of their viscosity-density product, ρη. Changes in the impedance spectrum of a 5-MHz fundamental frequency quartz crystal induced by a water-miscible room-temperature ionic liquid, 1-butyl-3-methylimidazolium trifluoromethylsulfonate ([C(4)mim][OTf]), were measured. From coupled frequency shift and bandwidth changes as the concentration was varied from 0 to 100% ionic liquid, it was determined that this liquid provided a Newtonian response. A second, water-immiscible ionic liquid, 1-butyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide [C(4)mim][NTf2], with concentration varied using methanol, was tested and also found to provide a Newtonian response. In both cases, the values of the square root of the viscosity-density product deduced from the small-volume quartz crystal technique were consistent with those measured using a viscometer and density meter. The third harmonic of the crystal was found to provide the closest agreement between the two measurement methods; the pure ionic liquids had the largest difference, of approximately 10%. In addition, 18 pure ionic liquids were tested, and for 11 of these, good-quality frequency shift and bandwidth data were obtained; all of these had a Newtonian response. The frequency shift of the third harmonic was found to vary linearly with the square root of the viscosity-density product of the pure ionic liquids up to a value of √(ρη) ≈ 18 kg m^(-2) s^(-1/2), but with a slope 10% smaller than that predicted by the Kanazawa and Gordon equation. It is envisaged that the quartz crystal technique could be used in a high-throughput microfluidic system for characterizing ionic liquids.
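For reference, the Kanazawa and Gordon relation mentioned above links the frequency shift of a quartz crystal loaded by a Newtonian liquid to √(ρη). The short Python sketch below evaluates it with standard AT-cut quartz constants; the √n harmonic scaling and the example liquid value are assumptions for illustration, not the paper's calibration.

```python
import math

RHO_Q = 2650.0       # quartz density, kg m^-3
MU_Q = 2.947e10      # AT-cut quartz shear modulus, Pa
F0 = 5e6             # fundamental resonant frequency, Hz (5 MHz crystal)

def kanazawa_gordon_shift(rho_eta, harmonic=1):
    """Frequency shift (Hz) for a Newtonian liquid with viscosity-density product rho_eta (kg^2 m^-4 s^-1)."""
    return -math.sqrt(harmonic) * F0 ** 1.5 * math.sqrt(rho_eta / (math.pi * RHO_Q * MU_Q))

# Example: a liquid near the upper end of the range above, sqrt(rho*eta) ~ 18 kg m^-2 s^-1/2
rho_eta = 18.0 ** 2
print(kanazawa_gordon_shift(rho_eta, harmonic=3))   # predicted shift at the third harmonic
```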
Abstract:
The effect of the volume shape factor on crystal size distribution (CSD) is usually ignored to simplify the analysis of the population balance equation. In the present work, the CSD of fragments generated in a mechanically stirred crystallizer as a result of the attrition mechanism is reported for the case where the volume shape factor follows a normal distribution. The physical model of Gahn and Mersmann, which relates the attrition resistance of a crystalline substance to its mechanical properties, has been employed. The simulation of the fragment size distribution was performed using the Monte Carlo (MC) technique. The results are compared with those reported by Gahn and Mersmann.
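To make the role of a distributed shape factor concrete, the hedged Monte Carlo sketch below draws a normally distributed volume shape factor and converts sampled fragment volumes to characteristic sizes via V = k_v·L³ before histogramming the CSD; the fragment-volume law used here is a placeholder power law and the numerical values are assumptions, not the Gahn and Mersmann attrition expressions.

```python
import numpy as np

rng = np.random.default_rng(7)
n_frag = 100_000

# Volume shape factor k_v ~ Normal(mean, sd), truncated to stay positive
k_v = np.clip(rng.normal(loc=0.5, scale=0.05, size=n_frag), 0.1, None)

# Placeholder fragment volumes from a power law between V_min and V_max (inverse-CDF sampling)
v_min, v_max, alpha = 1e-18, 1e-14, 2.0
u = rng.uniform(size=n_frag)
volumes = (v_min**(1 - alpha) + u * (v_max**(1 - alpha) - v_min**(1 - alpha))) ** (1 / (1 - alpha))

# Characteristic size from V = k_v * L^3, then bin the crystal size distribution
sizes = (volumes / k_v) ** (1.0 / 3.0)
counts, edges = np.histogram(sizes * 1e6, bins=50)   # sizes in micrometres
print(counts[:10])
```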
Abstract:
Although it is well known that sandstone porosity and permeability are controlled by a range of parameters such as grain size and sorting, amount, type, and location of diagenetic cements, extent and type of compaction, and the generation of intergranular and intragranular secondary porosity, it is less well constrained how these controlling parameters link up in rock volumes (within and between beds) and how they spatially interact to determine porosity and permeability. To address these unknowns, this study examined Triassic fluvial sandstone outcrops from the UK using field logging, probe permeametry at 200 points, and sampling at 100 points on a gridded rock surface. These field observations were supplemented by laser particle-size analysis, thin-section point-count analysis of primary and diagenetic mineralogy, quantitative XRD mineral analysis, and SEM/EDAX analysis of all 100 samples. These data were analyzed using global regression, variography, kriging, conditional simulation, and geographically weighted regression to examine the spatial relationships between porosity and permeability and their potential controls. The results of bivariate analysis (global regression) of the entire outcrop dataset indicate only a weak correlation between both permeability and porosity and their diagenetic and depositional controls, and provide very limited information on the role of primary textural properties such as grain size and sorting. Subdividing the dataset by bedding unit revealed details of more local controls on porosity and permeability. An alternative geostatistical approach combined with a local modelling technique (geographically weighted regression; GWR) was subsequently used to examine the spatial variability of porosity and permeability and their controls. The use of GWR does not require prior knowledge of divisions between bedding units, but the results from GWR broadly concur with the results of regression analysis by bedding unit and provide much greater clarity on how porosity and permeability and their controls vary laterally and vertically. The close relationship between depositional lithofacies in each bed, diagenesis, and permeability and porosity demonstrates that each influences the others, and in turn how understanding of reservoir properties is enhanced by integration of paleoenvironmental reconstruction, stratigraphy, mineralogy, and geostatistics.
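As a compact illustration of the local modelling step, the sketch below implements geographically weighted regression as a weighted least-squares fit repeated at each sample location with Gaussian distance weights; the predictors, bandwidth, and toy grid data are assumptions standing in for the porosity-permeability dataset.

```python
import numpy as np

def gwr(coords, X, y, bandwidth):
    """Return local coefficient estimates (intercept first), one row per location."""
    n, p = X.shape
    Xd = np.hstack([np.ones((n, 1)), X])              # add intercept column
    betas = np.empty((n, p + 1))
    for i in range(n):
        d = np.linalg.norm(coords - coords[i], axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)       # Gaussian kernel weights
        W = np.diag(w)
        betas[i] = np.linalg.solve(Xd.T @ W @ Xd, Xd.T @ W @ y)
    return betas

# Toy example: log-permeability explained by porosity and clay fraction on a sampling grid
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, size=(100, 2))            # x, y sample positions (m)
X = rng.uniform(size=(100, 2))                        # porosity, clay fraction
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.standard_normal(100)
local_betas = gwr(coords, X, y, bandwidth=2.0)
print(local_betas[:3])
```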
Abstract:
Background: Severe sepsis and septic shock are leading causes of death in the intensive care unit (ICU). This is despite advances in the management of patients with severe sepsis and septic shock including early recognition, source control, timely and appropriate administration of antimicrobial agents, and goal directed haemodynamic, ventilatory and metabolic therapies. High-volume haemofiltration (HVHF) is a blood purification technique which may improve outcomes in critically ill patients with severe sepsis or septic shock. The technique of HVHF has evolved from renal replacement therapies used to treat acute kidney injury (AKI) in critically ill patients in the ICU.
Objectives: This review assessed whether HVHF improves clinical outcome in adult critically ill patients with sepsis in an ICU setting.
Search methods: We searched the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library, 2011, Issue 7); MEDLINE (1990 to August 2011); EMBASE (1990 to August 2011); LILACS (1982 to August 2011); Web of Science (1990 to August 2011); CINAHL (1982 to August 2011); and specific websites.
Selection criteria: We included randomized controlled trials (RCTs) and quasi-randomized trials comparing HVHF or high-volume haemodiafiltration to standard or usual dialysis therapy; and RCTs and quasi-randomized trials comparing HVHF or high-volume haemodiafiltration to no similar dialysis therapy. The studies involved adults in critical care units.
Data collection and analysis: Three review authors independently extracted data and assessed trial quality. We sought additional information as required from trialists.
Main results: We included three randomized trials involving 64 participants. Due to the small number of studies and participants, it was not possible to combine data or perform sub-group analyses. One trial reported ICU and 28-day mortality, one trial reported hospital mortality and in the third, the number of deaths stated did not match the quoted mortality rates. No trials reported length of stay in ICU or hospital and one reported organ dysfunction. No adverse events were reported. Overall, the included studies had a low risk of bias.
Authors' conclusions: There were no adverse effects of HVHF reported. There is insufficient evidence to recommend the use of HVHF in critically ill patients with severe sepsis and/or septic shock, except as an intervention being investigated in the setting of a randomized clinical trial. These trials should be large, multi-centred, and have clinically relevant outcome measures. Financial implications should also be assessed.
Abstract:
Many cardiovascular diseases are characterised by the restriction of blood flow through arteries. Stents can be expanded within arteries to remove such restrictions; however, tissue in-growth into the stent can lead to restenosis. In order to predict the long-term efficacy of stenting, a mechanobiological model of the arterial tissue reaction to stress is required. In this study, a computational model of arterial tissue response to stenting is applied to three clinically relevant stent designs. We ask the question whether such a mechanobiological model can differentiate between stents used clinically, and we compare these predictions to a purely mechanical analysis. In doing so, we are testing the hypothesis that a mechanobiological model of arterial tissue response to injury could predict the long-term outcomes of stent design. Finite element analysis of the expansion of three different stent types was performed in an idealised, 3D artery. Injury was calculated in the arterial tissue using a remaining-life damage mechanics approach. The inflammatory response to this initial injury was modelled using equations governing variables which represented tissue-degrading species and growth factors. Three levels of inflammation response were modelled to account for inter-patient variability. A lattice-based model of smooth muscle cell behaviour was implemented, treating cells as discrete agents governed by local rules. The simulations predicted differences between stent designs similar to those found in vivo. It showed that the volume of neointima produced could be quantified, providing a quantitative comparison of stents. In contrast, the differences between stents based on stress alone were highly dependent on the choice of comparison criteria. These results show that the choice of stress criteria for stent comparisons is critical. This study shows that mechanobiological modelling may provide a valuable tool in stent design, allowing predictions of their long-term efficacy. The level of inflammation was shown to affect the sensitivity of the model to stent design. If this finding was verified in patients, this could suggest that high-inflammation patients may require alternative treatments to stenting.
Abstract:
PURPOSE: To report a new technique to correct tube position in the anterior chamber after glaucoma drainage device implantation.
PATIENT AND METHODS: A patient who underwent glaucoma drainage device implantation was noted to have the tube touching the corneal endothelium. A 10/0 polypropylene suture with a double-armed, 3-inch-long straight needle was placed transcamerally from limbus to limbus, in the superior part of the eye, passing the needle in front of the tube.
RESULTS: The position of the tube in the anterior chamber was corrected with optimal distance from corneal endothelium and iris surface. The position remained satisfactory after 20 months of follow-up.
CONCLUSIONS: The placement of a transcameral suture offers a safe, quick, and minimally invasive intervention for correcting the position of a glaucoma drainage device tube in the anterior chamber.
Abstract:
Biosignal measurement and processing is increasingly being deployed in ambulatory situations, particularly in connected health applications. Such an environment dramatically increases the likelihood of artifacts which can occlude features of interest and reduce the quality of information available in the signal. If multichannel recordings are available for a given signal source, then there is currently a considerable range of methods which can suppress or, in some cases, remove the distorting effect of such artifacts. There are, however, considerably fewer techniques available if only a single-channel measurement is made, and yet single-channel measurements are important where minimal instrumentation complexity is required. This paper describes a novel artifact removal technique for use in such a context. The technique, known as ensemble empirical mode decomposition with canonical correlation analysis (EEMD-CCA), is capable of operating on single-channel measurements. The EEMD technique is first used to decompose the single-channel signal into a multidimensional signal. The CCA technique is then employed to isolate the artifact components from the underlying signal using second-order statistics. The new technique is tested against the currently available wavelet denoising and EEMD-ICA techniques using both electroencephalography and functional near-infrared spectroscopy data and is shown to produce significantly improved results.
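A hedged sketch of the single-channel idea described above is given below: EEMD expands the single channel into IMFs, a closed-form CCA against a one-sample-delayed copy orders components by autocorrelation, and the low-autocorrelation (artifact-like) components are removed before recombination. The PyEMD package, the correlation threshold, the generic CCA implementation, and the toy signal are assumptions for illustration, not the authors' exact pipeline.

```python
import numpy as np
from PyEMD import EEMD                        # pip install EMD-signal

def _cca(X, Y, reg=1e-9):
    """Canonical vectors for X and canonical correlations, for column-variable matrices X, Y."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    Cxx = Xc.T @ Xc / len(Xc) + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / len(Yc) + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / len(Xc)

    def inv_sqrt(C):
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Kx, Ky = inv_sqrt(Cxx), inv_sqrt(Cyy)
    U, s, _ = np.linalg.svd(Kx @ Cxy @ Ky)
    return Kx @ U, s

def eemd_cca_denoise(signal, corr_threshold=0.95):
    imfs = EEMD().eemd(signal)                # decompose the single channel into IMFs
    X = imfs[:, 1:].T                         # pseudo-multichannel view (samples x IMFs)
    Y = imfs[:, :-1].T                        # one-sample delayed copy
    A, corrs = _cca(X, Y)
    sources = (X - X.mean(0)) @ A             # canonical components of the IMF matrix
    keep = corrs >= corr_threshold            # low-autocorrelation components ~ artifacts
    X_clean = sources[:, keep] @ np.linalg.pinv(A)[keep, :] + X.mean(0)
    return X_clean.sum(axis=1)                # recombine the cleaned IMFs into one channel

# Toy usage: a sinusoid with noise and a transient artifact
rng = np.random.default_rng(0)
t = np.linspace(0, 2, 1000)
sig = np.sin(2 * np.pi * 5 * t) + 0.05 * rng.standard_normal(1000)
sig[400:430] += 3.0
clean = eemd_cca_denoise(sig)
```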
Abstract:
This study rigorously evaluated a previously developed immunobead array method for the simultaneous detection of three important foodborne pathogens, Campylobacter jejuni, Listeria monocytogenes, and Salmonella spp., for its actual application in routine food testing. Because of the detection limit of the developed method, an enrichment step was included in this study, using Campylobacter Enrichment Broth for C. jejuni and Universal Pre-enrichment Broth for L. monocytogenes and Salmonella spp. The findings show that the immunobead array method was capable of detecting as little as 1 CFU of the pathogens spiked in the culture media after 24 hours of culture for all three pathogens. The immunobead array method was further evaluated for its pathogen detection capabilities in ready-to-eat (RTE) and ready-to-cook (RTC) chicken samples and was shown to detect as little as 1 CFU of the pathogens spiked in the food samples after 24 hours of culture in the case of Salmonella spp. and L. monocytogenes, and 48 hours in the case of C. jejuni. The method was subsequently validated with three types of chicken products (RTE, n=30; RTC, n=20; raw chicken, n=20) and was found to give the same results as the conventional plating method. Our findings demonstrate that the previously developed immunobead array method could be used for actual food testing with a minimal enrichment period of only 52 hours, whereas the conventional ISO protocols for the same pathogens take 90-144 hours. The immunobead array is therefore an inexpensive, rapid, and simple method for food testing.
Abstract:
Master's Dissertation, Biological Engineering, Faculdade de Engenharia de Recursos Naturais, Universidade do Algarve, 2008