901 results for radius of curvature measurement


Relevance:

100.00%

Publisher:

Abstract:

Formation resistivity is one of the most important parameters in reservoir evaluation. Various types of resistivity logging tools have been developed to acquire the true resistivity of the virgin formation. However, as proved reserves increase, the pay zones of interest are becoming thinner and thinner, especially in terrestrial-deposit oilfields, so that electrical logging tools, limited by the conflicting requirements of vertical resolution and depth of investigation, cannot provide the true formation resistivity. Resistivity inversion techniques have therefore become popular for determining true formation resistivity from the improved logging data of new tools. In geophysical inverse problems, non-unique solutions are inevitable owing to noisy data and deficient measurement information. This dissertation addresses the problem from three aspects: data acquisition, data processing/inversion, and application of the results together with uncertainty evaluation of the non-unique solution. It also treats other shortcomings of traditional inversion methods, such as slow convergence and the dependence of the result on the initial values. First, I deal with uncertainties in the data to be processed. The combination of the micro-spherically focused log (MSFL) and the dual laterolog (DLL) is the standard program for determining formation resistivity. During inversion, the corrected MSFL readings are taken as the resistivity of the invaded zone. However, the error can be as large as 30 percent owing to mud-cake effects, even when rugose-borehole effects on the MSFL readings can be ignored. Furthermore, it remains debatable whether the two logs can be used together quantitatively to determine formation resistivities, because their measurement principles differ. Thus, a new type of laterolog tool is designed theoretically.
The new tool provides three curves with different depths of investigation and nearly the same vertical resolution, about 0.4 m. Second, because the popular iterative inversion method based on least-squares estimation cannot solve for more than two parameters simultaneously, and because the new laterolog tool has not yet been applied in practice, my work focuses on two-parameter inversion (invasion radius and virgin-formation resistivity) of traditional dual laterolog data. An unequally weighted damping-factor revision method is developed to replace the parameter-revision technique used in traditional inversion. In the new method, each parameter update depends not only on the damping factor itself but also on the misfit between the measured and fitted data in the different layers. At least two fewer iterations are required than with the older method, reducing the computational cost of the inversion. The damped least-squares method realizes Tikhonov's trade-off between the smoothness of the solution and the stability of the inversion process. It linearizes the non-linear inverse problem, which inevitably makes the solution depend on the initial parameter values. Consequently, debate over the efficiency of such methods has intensified with the development of non-linear processing techniques. An artificial neural network method is therefore proposed in this dissertation. A database of the tool's response to formation parameters is built by modeling the laterolog tool and is then used to train the neural networks. A unit model is put forward to simplify the data space, and an additional physical constraint is applied to optimize the network after cross-validation.
Results show that the neural network inversion method can replace the traditional inversion method in a single formation and can also provide initial values for the traditional method. No matter which method is used, non-uniqueness and uncertainty of the solution are inevitable, so it is wise to evaluate them when applying inversion results. Bayes' theorem provides a way to do so; the approach is illustrated for a single formation and achieves plausible results. Finally, the traditional least-squares inversion method is applied to raw logging data; compared against core analysis, the calculated oil saturation is 20 percent higher than that obtained from unprocessed data.
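The damped least-squares update described in the abstract can be sketched as follows. This is a minimal illustration with a hypothetical linear tool response, not the dissertation's actual forward model; the per-parameter `damp` array stands in for the unequally weighted damping factors:

```python
import numpy as np

def damped_lsq_step(m, forward, jacobian, d_obs, damp):
    """One damped least-squares (Levenberg-Marquardt style) update.

    m        : current model, e.g. [invasion radius, virgin resistivity]
    forward  : m -> predicted tool readings
    jacobian : m -> Jacobian of forward at m
    damp     : per-parameter damping factors realizing the Tikhonov
               trade-off between solution smoothness and stability
    """
    r = d_obs - forward(m)                  # data misfit per reading
    J = jacobian(m)
    # Damped normal equations: (J^T J + diag(damp)) dm = J^T r
    dm = np.linalg.solve(J.T @ J + np.diag(damp), J.T @ r)
    return m + dm

# Toy linear "tool response" (hypothetical, purely illustrative):
G = np.array([[1.0, 0.5], [0.3, 2.0], [0.8, 1.2]])
true_m = np.array([0.4, 2.5])
d = G @ true_m
m = np.array([1.0, 1.0])
for _ in range(20):
    m = damped_lsq_step(m, lambda x: G @ x, lambda x: G, d,
                        damp=np.array([1e-3, 1e-3]))
print(np.round(m, 3))   # converges to [0.4, 2.5]
```

In the abstract's method the damping would additionally be re-weighted per layer according to the misfit there; here a fixed diagonal damping suffices to show the update.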

Relevance:

100.00%

Publisher:

Abstract:

The micro-pore configurations on the matrix surface were studied by SEM. The performance of the molten carbonate fuel cell (MCFC) matrix was improved by better coordination between a suitable micro-pore radius and a higher porosity of the cell matrix. The numerous and complicated micro-pore configurations in the cell matrix promoted the volatilization of the organic additives and the burnout of polyvinyl butyral (PVB). Smooth volatilization of the organic additives and complete burnout of the PVB were significant factors in the improved MCFC performance. An oxygen-diffusion-controlled burn mechanism of PVB in the cell matrix was proposed. (C) 2002 Published by Elsevier Science Ltd.

Relevance:

100.00%

Publisher:

Abstract:

The interpretation and recognition of noisy contours, such as silhouettes, have proven difficult. One obstacle has been the lack of a robust representation for contours. Here, the contour is represented by a set of pairwise tangent circular arcs. The advantage of this approach is that mathematical properties such as orientation and curvature are explicitly represented. We introduce a smoothing criterion for the contour that optimizes the trade-off between the complexity of the contour and its proximity to the data points. The complexity measure is the number of curvature extrema present in the contour. The smoothing criterion leads to a true scale-space for contours. We describe the computation of the contour representation as well as the computation of relevant properties of the contour, and consider the potential application of the representation, the smoothing paradigm, and the scale-space to contour interpretation and recognition.
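The complexity measure named above, the number of curvature extrema, can be computed directly from sampled curvature. A minimal sketch, assuming dense curvature samples around a closed contour rather than the paper's piecewise-constant arc representation:

```python
import numpy as np

def curvature_extrema_count(kappa):
    """Count the extrema of curvature along a closed contour.

    kappa : curvature sampled cyclically around the contour. (In the
    arc-based representation curvature is constant per arc; dense samples
    are used here purely for illustration.)
    """
    d = np.roll(kappa, -1) - kappa     # cyclic forward difference
    s = np.sign(d)
    s = s[s != 0]                      # ignore flat steps
    # An extremum occurs wherever the sign of the difference changes.
    return int(np.sum(s != np.roll(s, 1)))

# An ellipse has four curvature extrema (two maxima, two minima):
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
a, b = 2.0, 1.0
kappa = a * b / (a**2 * np.sin(t)**2 + b**2 * np.cos(t)**2) ** 1.5
print(curvature_extrema_count(kappa))   # 4
```

The smoothing criterion would then trade this count against the squared distances from the data points; a smaller extrema count yields a coarser position in the scale-space.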

Relevance:

100.00%

Publisher:

Abstract:

Compliant control is a standard method for performing fine manipulation tasks, such as grasping and assembly, but it requires estimation of the state of contact between the robot arm and the objects involved. Here we present a method to learn a model of the movement from measured data. The method requires little or no prior knowledge, and the resulting model explicitly estimates the state of contact. The current state of contact is viewed as the hidden state variable of a discrete HMM. The control-dependent transition probabilities between states are modeled as parametrized functions of the measurement. We show that their parameters can be estimated from measurements concurrently with the estimation of the parameters of the movement in each state of contact. The learning algorithm is a variant of the EM procedure. The E-step is computed exactly; solving the M-step exactly would require solving a set of coupled nonlinear algebraic equations in the parameters. Instead, gradient ascent is used to produce an increase in likelihood.
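A minimal sketch of the gradient-ascent M-step for one control-dependent transition probability. This assumes a hypothetical two-state model with a logistic parametrization and idealized E-step posteriors; the abstract does not specify the actual parametrization:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def m_step_gradient(theta, u, xi, lr=1.0, iters=500):
    """Gradient-ascent M-step for a control-dependent transition probability.

    Hypothetical two-state sketch: P(stay | control u) = sigmoid(theta0 + theta1*u).
    u  : control input at each time step
    xi : expected 'stay' indicators from the E-step (posteriors in [0, 1])
    Solving the M-step exactly would require coupled nonlinear equations;
    instead we take gradient steps on the expected complete-data log-likelihood.
    """
    for _ in range(iters):
        p = sigmoid(theta[0] + theta[1] * u)
        g = xi - p   # gradient of sum xi*log(p) + (1 - xi)*log(1 - p)
        theta = theta + lr * np.array([g.sum(), (g * u).sum()]) / len(u)
    return theta

# Idealized E-step output generated from known parameters:
rng = np.random.default_rng(0)
u = rng.uniform(-2, 2, 500)
true_theta = np.array([0.5, -1.0])
xi = sigmoid(true_theta[0] + true_theta[1] * u)
theta = m_step_gradient(np.zeros(2), u, xi)
print(np.round(theta, 2))   # approaches [0.5, -1.0]
```

In the full EM variant, `xi` would come from exact forward-backward smoothing, and the movement parameters of each contact state would be re-estimated in the same M-step.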

Relevance:

100.00%

Publisher:

Abstract:

The goal of this thesis is to apply the computational approach to motor learning, i.e., describe the constraints that enable performance improvement with experience and also the constraints that must be satisfied by a motor learning system, describe what is being computed in order to achieve learning, and why it is being computed. The particular tasks used to assess motor learning are loaded and unloaded free arm movement, and the thesis includes work on rigid body load estimation, arm model estimation, optimal filtering for model parameter estimation, and trajectory learning from practice. Learning algorithms have been developed and implemented in the context of robot arm control. The thesis demonstrates some of the roles of knowledge in learning. Powerful generalizations can be made on the basis of knowledge of system structure, as is demonstrated in the load and arm model estimation algorithms. Improving the performance of parameter estimation algorithms used in learning involves knowledge of the measurement noise characteristics, as is shown in the derivation of optimal filters. Using trajectory errors to correct commands requires knowledge of how command errors are transformed into performance errors, i.e., an accurate model of the dynamics of the controlled system, as is demonstrated in the trajectory learning work. The performance demonstrated by the algorithms developed in this thesis should be compared with algorithms that use less knowledge, such as table based schemes to learn arm dynamics, previous single trajectory learning algorithms, and much of traditional adaptive control.
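The trajectory-learning idea above, correcting commands with observed errors through a model of how command errors become performance errors, can be sketched as a simple iterative-learning update. This is a hypothetical scalar example, not the thesis's algorithm:

```python
import numpy as np

def ilc_update(u, error, L=0.5):
    """One learning pass: correct the command using the observed trajectory
    error, mapped through a model L of how command errors become
    performance errors (here a hypothetical scalar gain)."""
    return u + L * error

# Toy 1-D plant: achieved trajectory = 0.8 * command (unknown to the learner).
target = np.sin(np.linspace(0, np.pi, 50))
u = target.copy()                        # initial command: the desired trajectory
for _ in range(30):
    y = 0.8 * u                          # execute the movement
    u = ilc_update(u, target - y)        # learn from practice
print(float(np.abs(target - 0.8 * u).max()))   # error shrinks toward 0
```

The thesis's point carries over even in this toy: the update converges only because `L` encodes (approximate) knowledge of the plant's command-to-performance map; a sign error in that model would make the same procedure diverge.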

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to assess the appearance of cardiac troponins (cTnI and/or cTnT) after a short bout (30 s) of 'all-out' intense exercise and to determine the stability of any exercise-related cTnI release in response to repeated bouts of high-intensity exercise separated by 7 days of recovery. Eighteen apparently healthy, physically active, male university students completed two all-out 30 s cycle sprints, separated by 7 days. cTnI, blood lactate and catecholamine concentrations were measured before, immediately after and 24 h after each bout. Cycle performance, heart rate and blood pressure responses to exercise were also recorded. Cycle performance was modestly elevated in the second trial [6.5% increase in peak power output (PPO)]; there was no difference in the cardiovascular, lactate or catecholamine response to the two cycle trials. cTnI was not significantly elevated from baseline through recovery (Trial 1: 0.06 ± 0.04 ng ml−1, 0.05 ± 0.04 ng ml−1, 0.03 ± 0.02 ng ml−1; Trial 2: 0.02 ± 0.04 ng ml−1, 0.04 ± 0.03 ng ml−1, 0.05 ± 0.06 ng ml−1) in either trial. Very small within-subject changes were not significantly correlated between the two trials (r = 0.06; P > 0.05). Consequently, short-duration, high-intensity exercise does not elicit a clinically relevant response in cTnI, and any small alterations likely reflect the underlying biological variability of cTnI measurement within the participants.

Relevance:

100.00%

Publisher:

Abstract:

The topic of this thesis is impulsivity. The meaning and measurement of impulse control is explored, with a particular focus on forensic settings. Impulsivity is central to many areas of psychology; it is one of the most common diagnostic criteria of mental disorders and is fundamental to the understanding of forensic personalities. Despite this widespread importance there is little agreement as to the definition or structure of impulsivity, and its measurement is fraught with difficulty owing to a reliance on self-report methods. This research aims to address this problem by investigating the viability of using simple computerised cognitive performance tasks as complementary components of a multi-method assessment strategy for impulse control. Ultimately, the usefulness of this measurement strategy for a forensic sample is assessed. Impulsivity is found to be a multifaceted construct comprised of a constellation of distinct sub-dimensions. Computerised cognitive performance tasks are valid and reliable measures that can assess impulsivity at a neuronal level. Self-report and performance task methods assess distinct components of impulse control and, for the optimal assessment of impulse control, a multi-method battery of self-report and performance task measures is advocated. Such a battery is shown to have demonstrated utility in a forensic sample, and recommendations for forensic assessment in the Irish context are discussed.

Relevance:

100.00%

Publisher:

Abstract:

Aim: To investigate the value of using PROMs as quality improvement tools. Methods: Two systematic reviews were undertaken. The first reviewed the quantitative literature on the impact of PROMs feedback and the second reviewed the qualitative literature on the use of PROMs in practice. These reviews informed the focus of the primary research. A cluster randomised controlled trial (PROFILE) examined the impact of providing peer-benchmarked PROMs feedback to consultant orthopaedic surgeons on improving outcomes for hip replacement surgery. Qualitative interviews with surgeons in the intervention arm of the trial examined their views of and reactions to the feedback. Results: The quantitative review of 17 studies found weak evidence to suggest that providing PROMs feedback to professionals improves patient outcomes. The qualitative review of 16 studies identified the barriers and facilitators to the use of PROMs under four themes: practical considerations, attitudes towards the data, methodological concerns and the impact of feedback on care. The PROFILE trial included 11 surgeons and 215 patients in the intervention arm, and 10 surgeons and 217 patients in the control arm. The trial found no significant difference in the Oxford Hip Score between the arms (-0.7, 95% CI -1.9 to 0.5, p=0.2). Interviews with surgeons revealed mixed opinions about the value of the PROMs feedback, and the information did not prompt explicit changes to their practice. Conclusion: It is important to use PROMs which have been validated for the specific purpose of performance measurement, consult with professionals when developing a PROMs feedback intervention, communicate with professionals about the objectives of the data collection, educate professionals on the properties and interpretation of the data, and support professionals in using the information to improve care. It is also imperative that the burden of data collection and dissemination of the information is minimised.

Relevance:

100.00%

Publisher:

Abstract:

Introduction: Copayments for prescriptions are associated with decreased adherence to medicines, resulting in increased health service utilisation, morbidity and mortality. In October 2010 a 50c copayment per prescription item was introduced on the General Medical Services (GMS) scheme in Ireland, the national public health insurance programme for low-income and older people. The copayment was increased to €1.50 per prescription item in January 2013. To date, the impact of these copayments on adherence to prescription medicines on the GMS scheme has not been assessed. Given that the GMS population comprises more than 40% of the Irish population, this presents an important public health problem. The aim of this thesis was to assess the impact of two prescription copayments, 50c and €1.50, on adherence to medicines. Methods: In Chapter 2 the published literature was systematically reviewed with meta-analysis to a) develop evidence on cost-sharing for prescriptions and adherence to medicines and b) develop evidence for an alternative policy option: removal of copayments. The core research question of this thesis was addressed by a large before-and-after longitudinal study with a comparator group, using the national pharmacy claims database. New users of essential and less-essential medicines were included in the study, with sample sizes ranging from 7,007 to 136,111 individuals in different medication groups. Segmented regression was used with generalised estimating equations to allow for correlations between repeated monthly measurements of adherence. A qualitative study involving 24 individuals was conducted to assess patient attitudes towards the 50c copayment policy. The qualitative and quantitative findings were integrated in the discussion chapter of the thesis.
The vast majority of the literature in this topic area has been generated in North America; therefore, a test of generalisability was carried out in Chapter 5 by comparing the impact of two similar copayment interventions on adherence, one in the U.S. and one in Ireland. The method used to measure adherence in Chapters 3 and 5 was validated in Chapter 6. Results: The systematic review with meta-analysis demonstrated an 11% (95% CI 1.09 to 1.14) increase in the odds of non-adherence when publicly insured populations were exposed to copayments. The second systematic review found moderate but variable improvements in adherence after removal/reduction of copayments in a general population. The core paper of this thesis found that both the 50c and €1.50 copayments on the GMS scheme were associated with larger reductions in adherence to less-essential medicines than to essential medicines directly after the implementation of the policies. An important exception to this pattern was observed: adherence to anti-depressant medications declined to a larger extent than adherence to other essential medicines after both copayments. The cross-country comparison indicated that North American evidence on cost-sharing for prescriptions is not automatically generalisable to the Irish setting. Irish patients had greater immediate decreases of -5.3% (95% CI -6.9 to -3.7) and -2.8% (95% CI -4.9 to -0.7) in adherence to anti-hypertensive and anti-hyperlipidaemic medicines, respectively, directly after the policy changes, relative to their U.S. counterparts. In the long term, however, the U.S. and Irish populations behaved similarly. The concordance study highlighted the possibility of a measurement bias in the measurement of adherence to non-steroidal anti-inflammatory drugs in Chapter 3.
Conclusions: This thesis has presented two reviews of international cost-sharing policies, an assessment of the generalisability of international evidence, and both qualitative and quantitative examinations of cost-sharing policies for prescription medicines on the GMS scheme in Ireland. It was found that the introduction of a 50c copayment and its subsequent increase to €1.50 on the GMS scheme had a larger impact on adherence to less-essential medicines relative to essential medicines, with the exception of anti-depressant medications. This is in line with policy objectives to reduce moral hazard and therefore demonstrates the value of such policies. There are, however, some caveats. The copayment now stands at €2.50 per prescription item. The impact of this increase has yet to be assessed, which is an obvious point for future research. Careful monitoring for adverse effects in socio-economically disadvantaged groups within the GMS population is also warranted. International evidence can be applied to the Irish setting to aid future decision-making in this area, but not without placing it in the local context first. Patients accepted the introduction of the 50c charge, but voiced concerns over a rising price. The challenge for policymakers is to find the 'optimal copayment', whereby moral hazard is decreased but access to essential chronic disease medicines that provide advantages at the population level is not deterred. The evidence presented in this thesis will be of use for future policy-making in Ireland.
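The segmented-regression model of the interrupted time series described above can be sketched as follows, with plain OLS on synthetic data; the thesis uses generalised estimating equations to handle within-person correlation, which this sketch omits:

```python
import numpy as np

def segmented_fit(y, t0):
    """Interrupted time-series fit by ordinary least squares:
    y_t = b0 + b1*t + b2*step + b3*(t - t0)*step, step = 1 from month t0 on.
    (GEE would additionally model within-person correlation of repeated
    adherence measurements; this sketch uses plain OLS.)"""
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta   # [intercept, pre-trend, level change, trend change]

# Synthetic adherence series with a 5-point level drop when a copayment
# starts at month 12:
t = np.arange(24)
y = 90 + 0.1 * t - 5.0 * (t >= 12)
beta = segmented_fit(y, 12)
print(np.round(beta, 2))   # level change (third coefficient) is -5
```

The level-change and trend-change coefficients are what distinguish an immediate copayment effect from a gradual one, which is how the thesis separates the short-term and long-term adherence responses.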

Relevance:

100.00%

Publisher:

Abstract:

Real-time monitoring of oxygenation and respiration is at the cutting edge of bioanalysis, including studies of cell metabolism, bioenergetics, mitochondrial function and drug toxicity. This thesis presents the development and evaluation of new luminescent probes and techniques for intracellular O2 sensing and imaging. A new oxygen consumption rate (OCR) platform, based on commercial microfluidic perfusion-channel μ-slides and compatible with extra- and intracellular O2-sensitive probes, different cell lines and various measurement conditions, was developed. The semi-closed channel design allowed cell treatments, multiplexing with other assays, and two-fold higher sensitivity compared with the microtiter plate. We compared three common OCR platforms: hermetically sealed quartz cuvettes for absolute OCRs, 96-well plates (96-WPs) partially sealed with mineral oil for relative OCRs, and open 96-WPs for local cell oxygenation. Both 96-WP platforms were calibrated against the absolute OCR platform using the MEF cell line, the phosphorescent O2 probe MitoXpress-Intra and a time-resolved fluorescence reader. The correlations found allow tracing of cell respiration over time in a high-throughput format, with the possibility of cell stimulation and of changing measurement conditions. A new multimodal intracellular O2 probe, based on the phosphorescent reporter dye PtTFPP, the fluorescent FRET donor and two-photon antenna PFO, and cationic RL-100 nanoparticles, is described. This probe, called MM2, possesses high brightness, photo- and chemical stability, low toxicity and efficient cell staining, and enables high-resolution intracellular O2 imaging of 2D and 3D cell cultures in intensity, ratiometric and lifetime-based modalities with luminescence readers and FLIM microscopes. An extended range of O2-sensitive probes was designed and studied in order to optimize their spectral characteristics and intracellular targeting, using different nanoparticle materials, delivery vectors, ratiometric pairs and IR dyes.
The presented improvements provide a useful toolset for highly sensitive monitoring and imaging of intracellular O2 in different measurement formats, with a wide range of physiological applications.
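As a rough illustration of how an OCR is extracted from an O2-sensitive probe signal, a linear fit to a sealed-chamber O2 trace can be used. This is a simplified sketch with synthetic data; real platforms must also account for back-diffusion and probe calibration:

```python
import numpy as np

def ocr_from_slope(t_min, o2_uM):
    """Estimate an oxygen consumption rate as the negative slope of a linear
    fit of dissolved O2 against time. A simplified sketch: real OCR analysis
    on partially sealed platforms must also model oxygen back-diffusion."""
    slope, _intercept = np.polyfit(t_min, o2_uM, 1)
    return -slope    # consumed µM O2 per minute

t = np.linspace(0.0, 30.0, 31)
o2 = 200.0 - 1.5 * t          # synthetic sealed-chamber O2 trace
print(ocr_from_slope(t, o2))  # ≈ 1.5
```

The calibration described in the thesis amounts to establishing how such relative slopes from 96-WP platforms map onto absolute OCRs from the sealed-cuvette platform.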

Relevance:

100.00%

Publisher:

Abstract:

Capable of three-dimensional imaging of the cornea with micrometer-scale resolution, spectral-domain optical coherence tomography (SDOCT) offers potential advantages over Placido-ring and Scheimpflug-photography based systems for accurate extraction of quantitative keratometric parameters. In this work, an SDOCT scanning protocol and motion correction algorithm were implemented to minimize the effects of patient motion during data acquisition. Procedures are described for correcting image data artifacts resulting from the 3D refraction of SDOCT light in the cornea and from non-idealities of the scanning-system geometry, performed as a prerequisite for accurate parameter extraction. Zernike polynomial 3D reconstruction and a recursive half-searching algorithm (RHSA) were implemented to extract clinical keratometric parameters including the anterior and posterior radii of curvature, central corneal optical power, central corneal thickness, and thickness maps of the cornea. Accuracy and repeatability of the extracted parameters obtained using a commercial 859 nm SDOCT retinal imaging system with a corneal adapter were assessed using a rigid gas permeable (RGP) contact lens as a phantom target. Extraction of these parameters was performed in vivo in 3 patients and compared to commercial Placido topography and Scheimpflug photography systems. The repeatability of SDOCT central corneal power measured in vivo was 0.18 Diopters, and the difference observed between the systems averaged 0.1 Diopters between SDOCT and Scheimpflug photography, and 0.6 Diopters between SDOCT and Placido topography.
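For context on how a corneal power in diopters follows from a radius of curvature, the conventional keratometric formula is P = (n − 1)/R. A minimal sketch; SDOCT systems can instead combine both measured surfaces in a thick-lens formula, which this omits:

```python
def central_corneal_power(r_anterior_mm, n_keratometric=1.3375):
    """Keratometric corneal power P = (n - 1) / R, with R in metres.
    1.3375 is the conventional keratometric index, which lumps the
    posterior surface contribution into a single effective index."""
    return (n_keratometric - 1.0) / (r_anterior_mm * 1e-3)

# A typical anterior radius of 7.8 mm gives about 43.3 D:
print(round(central_corneal_power(7.8), 2))   # 43.27
```

This also makes the reported repeatability concrete: at R ≈ 7.8 mm, a 0.18 D repeatability in power corresponds to a radius uncertainty on the order of a few hundredths of a millimetre.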

Relevance:

100.00%

Publisher:

Abstract:

On-board image guidance, such as cone-beam CT (CBCT) and kV/MV 2D imaging, is essential in many radiation therapy procedures, such as intensity modulated radiotherapy (IMRT) and stereotactic body radiation therapy (SBRT). These imaging techniques provide predominantly anatomical information for treatment planning and target localization. Recently, studies have shown that treatment planning based on functional and molecular information about the tumor and surrounding tissue could potentially improve the effectiveness of radiation therapy. However, current on-board imaging systems are limited in their functional and molecular imaging capability. Single Photon Emission Computed Tomography (SPECT) is a candidate to achieve on-board functional and molecular imaging. Traditional SPECT systems typically take 20 minutes or more for a scan, which is too long for on-board imaging. A robotic multi-pinhole SPECT system was proposed in this dissertation to provide shorter imaging time by using a robotic arm to maneuver the multi-pinhole SPECT system around the patient in position for radiation therapy.

A 49-pinhole collimated SPECT detector and its shielding were designed and simulated in this work using computer-aided design (CAD) software. The trajectories of the robotic arm about the patient, treatment table and gantry in the radiation therapy room were investigated for several detector assemblies, including parallel-hole, single-pinhole and 49-pinhole collimated detectors. The rail-mounted system was designed to enable a full range of detector positions and orientations at various crucial treatment sites, including head and torso, while avoiding collision with the linear accelerator (LINAC), patient table and patient.

An alignment method was developed in this work to calibrate the on-board robotic SPECT system to the LINAC coordinate frame and to the coordinate frames of other on-board imaging systems such as CBCT. The method utilizes line sources and a single pinhole projection of those line sources. The model consists of multiple alignment parameters which map line sources in 3-dimensional (3D) space to their 2-dimensional (2D) projections on the SPECT detector. Computer-simulation studies and experimental evaluations were performed as a function of the number of line sources, Radon-transform accuracy, finite line-source width, intrinsic camera resolution, Poisson noise and acquisition geometry. In the computer-simulation studies, when there was no error in determining the angles (α) and offsets (ρ) of the measured projections, the six alignment parameters (3 translational and 3 rotational) were estimated perfectly using three line sources. When the angles (α) and offsets (ρ) were provided by the Radon transform, the estimation accuracy was reduced. The estimation error was associated with rounding errors of the Radon transform, finite line-source width, Poisson noise, the number of line sources, intrinsic camera resolution and the detector acquisition geometry. The estimation accuracy was significantly improved by using 4 line sources rather than 3, and also by using thinner line-source projections (obtained with better intrinsic detector resolution). With 5 line sources, median errors were 0.2 mm for the detector translations, 0.7 mm for the detector radius of rotation, and less than 0.5° for detector rotation, tilt and twist. In the experimental evaluations, average errors relative to a different, independent registration technique were about 1.8 mm for the detector translations, 1.1 mm for the detector radius of rotation (ROR), 0.5° and 0.4° for detector rotation and tilt, respectively, and 1.2° for detector twist.
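The mapping at the heart of the alignment model, projecting 3D line-source points through a pinhole onto the 2D detector given the pose parameters, can be sketched as follows. This is a hypothetical minimal model; the dissertation's model includes additional parameters such as detector tilt and twist within the rotation:

```python
import numpy as np

def pinhole_project(points, R, t, f):
    """Project 3-D points through a pinhole onto the detector plane.

    R, t : rotation matrix and translation taking LINAC coordinates into
           the detector frame (the alignment parameters being estimated)
    f    : pinhole-to-detector distance
    A line source, sampled as points, projects to a line on the detector;
    the Radon-transform angle and offset of that line are what the
    calibration compares against the model prediction.
    """
    p = (R @ points.T).T + t               # into the detector frame
    return f * p[:, :2] / p[:, 2:3]        # perspective division

# A line source along x at z = 200 mm projects to a line on the detector:
pts = np.column_stack([np.linspace(-50, 50, 5), np.zeros(5), np.full(5, 200.0)])
uv = pinhole_project(pts, np.eye(3), np.zeros(3), f=100.0)
print(uv[:, 0])   # x coordinates scaled by f/z = 0.5
```

Estimating the pose then amounts to adjusting (R, t) until the projected lines' angles and offsets match those measured from the pinhole image, which is why more line sources and thinner projections improve the fit.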

Simulation studies were performed to investigate the improvement of imaging sensitivity and the accuracy of hot-sphere localization for breast imaging of patients in the prone position. A 3D XCAT phantom was simulated in the prone position with nine hot spheres of 10 mm diameter added to the left breast. A no-treatment-table case and two commercial prone breast boards, 7 and 24 cm thick, were simulated. Different pinhole focal lengths were assessed by root-mean-square error (RMSE). The pinhole focal lengths resulting in the lowest RMSE values were 12 cm, 18 cm and 21 cm for no table, the thin board and the thick board, respectively. In both the no-table and thin-board cases, all 9 hot spheres were easily visualized above background in 4-minute scans with the 49-pinhole SPECT system, while seven of the nine hot spheres were visible with the thick board. In comparison with a parallel-hole system, the 49-pinhole system showed reduced noise and bias in these simulations, corresponding to the smaller radii of rotation possible with no table or the thinner prone board. Similarly, localization accuracy with the 49-pinhole system was significantly better than with the parallel-hole system for both the thin and thick prone boards. Median localization errors for the 49-pinhole system with the thin board were less than 3 mm for 5 of the 9 hot spheres, and less than 6 mm for the other 4; with the thick board they were less than 4 mm for 5 of the 9 hot spheres, and less than 8 mm for the other 4.

Besides prone breast imaging, respiratory-gated region-of-interest (ROI) imaging of lung tumors was also investigated. A simulation study examined the potential of multi-pinhole ROI SPECT to alleviate the noise effects associated with respiratory-gated SPECT imaging of the thorax. Two 4D XCAT digital phantoms were constructed, with either a 10 mm or a 20 mm diameter tumor added to the right lung. The maximum diaphragm motion was 2 cm (for the 10 mm tumor) or 4 cm (for the 20 mm tumor) in the superior-inferior direction and 1.2 cm in the anterior-posterior direction. Projections were simulated with a 4-minute acquisition time (40 seconds for each of 6 gates) using either the ROI SPECT system (49-pinhole) or reference single and dual conventional broad cross-section, parallel-hole collimated SPECT systems. The SPECT images were reconstructed using OSEM with up to 6 iterations. Images were evaluated as a function of gate by profiles, noise-versus-bias curves, and a numerical observer performing a forced-choice localization task. Even for the 20 mm tumor, the 49-pinhole imaging ROI was found sufficient to fully encompass the usual clinical range of diaphragm motion. Averaged over the 6 gates, noise at iteration 6 of the 49-pinhole ROI imaging (10.9 µCi/ml) was approximately comparable to noise at iteration 2 of the dual and single parallel-hole, broad cross-section systems (12.4 µCi/ml and 13.8 µCi/ml, respectively). The corresponding biases were much lower for the 49-pinhole ROI system (3.8 µCi/ml), versus 6.2 µCi/ml and 6.5 µCi/ml for the dual and single parallel-hole systems, respectively. Median localization errors averaged over the 6 gates, for the 10 mm and 20 mm tumors respectively, were 1.6 mm and 0.5 mm using the ROI imaging system, and 6.6 mm and 2.3 mm using the dual parallel-hole, broad cross-section system. The results demonstrate substantially improved imaging via ROI methods. One important application may be gated imaging of patients in position for radiation therapy.

A robotic SPECT imaging system was constructed from a gamma camera detector (Digirad 2020tc) and a robot (KUKA KR150-L110). An imaging study was performed with a phantom (PET CT Phantom™) containing 5 spheres of 10, 13, 17, 22 and 28 mm diameter. The phantom was placed on a flat-top couch. SPECT projections were acquired with a parallel-hole collimator and a single-pinhole collimator, both without background activity in the phantom and with background at 1/10th the sphere activity concentration. The imaging trajectories of the parallel-hole and pinhole collimated detectors spanned 180 degrees and 228 degrees, respectively. The pinhole detector viewed a 14.7 cm-diameter common volume which encompassed the 28 mm and 22 mm spheres. The common volume for the parallel-hole detector was a 20.8 cm-diameter cylinder which encompassed all five spheres in the phantom. The maneuverability of the robotic system was tested by navigating the detector to trace the flat-top table while avoiding collision with the table and maintaining the closest possible proximity to the common volume. For image reconstruction, detector trajectories were described by the radius of rotation and the detector rotation angle θ, obtained from the robot base and tool coordinates. The robotic SPECT system was able to maneuver the parallel-hole and pinhole collimated SPECT detectors in close proximity to the phantom, minimizing the impact of the flat-top couch on the detector-to-center-of-rotation (COR) distance. In the no-background case, all five spheres were visible in the reconstructed parallel-hole and pinhole images. In the with-background case, the three spheres of 17, 22 and 28 mm diameter were readily observed with parallel-hole imaging, and the targeted spheres (22 and 28 mm diameter) were readily observed in pinhole ROI imaging.

In conclusion, the proposed on-board robotic SPECT system can be aligned to the LINAC/CBCT coordinate frame using a single pinhole projection of a line-source phantom. This alignment method may be important for multi-pinhole SPECT, where relative pinhole alignment may vary during rotation. For single-pinhole and multi-pinhole SPECT imaging onboard radiation therapy machines, the method could provide alignment of SPECT coordinates with those of CBCT and the LINAC. In the simulation studies of prone breast imaging and respiratory-gated lung imaging, the 49-pinhole detector showed better tumor contrast recovery and localization in a 4-minute scan compared to a parallel-hole detector. On-board SPECT could be achieved by a robot maneuvering a SPECT detector about patients in position for radiation therapy on a flat-top couch. The robot's inherent coordinate frames could be an effective means of estimating detector pose for use in SPECT image reconstruction.

Resumo:

Thin-layer and high-performance thin-layer chromatography (TLC/HPTLC) methods for assaying compound(s) in a sample must be validated to ensure that they are fit for their intended purpose and, where applicable, meet the strict regulatory requirements for controlled products. Two validation approaches are identified in the literature: the classic approach and the alternative approach based on accuracy profiles. Detailed procedures for the two approaches are discussed based on the validation of methods for pharmaceutical analysis, an area considered to have particularly strict requirements. Estimation of the measurement uncertainty from the validation approach using accuracy profiles is also described. Examples of HPTLC methods, developed and validated to assay sulfamethoxazole and trimethoprim on the one hand, and lamivudine, stavudine, and nevirapine on the other, in their fixed-dose combination tablets, are further elaborated.
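The core computation of an accuracy profile can be sketched as follows. This is a simplified illustration using a normal-approximation tolerance interval on relative error; a full treatment uses t-based β-expectation tolerance intervals with Satterthwaite degrees of freedom, and the data values here are invented, not from the cited HPTLC methods.

```python
from statistics import NormalDist, mean, stdev

def accuracy_profile(levels, beta=0.95):
    """Simplified accuracy profile: for each concentration level, compute
    the mean relative bias (%) and a normal-approximation beta-expectation
    tolerance interval on relative error.

    levels : {true_concentration: [measured values, ...]}
    Returns a list of (true_conc, mean_bias, lower, upper) tuples.
    """
    z = NormalDist().inv_cdf(0.5 + beta / 2.0)
    profile = []
    for true_c, meas in sorted(levels.items()):
        rel_err = [100.0 * (m - true_c) / true_c for m in meas]
        mu, s = mean(rel_err), stdev(rel_err)
        profile.append((true_c, mu, mu - z * s, mu + z * s))
    return profile

# Hypothetical three-level validation data (e.g. mg per band)
data = {0.5: [0.49, 0.51, 0.50],
        1.0: [1.02, 0.99, 1.01],
        2.0: [1.97, 2.03, 2.00]}
prof = accuracy_profile(data)
```

The method is then declared valid over the range where each tolerance interval stays inside the acceptance limits (e.g. ±5% for many pharmaceutical assays).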

Resumo:

We consider the problem of finding the heat distribution and the shape of the liquid fraction during laser welding of a thick steel plate using the finite volume CFD package PHYSICA. Since the shape of the keyhole is not known in advance, the following two-step approach to handling this problem has been employed. In the first stage, we determine the geometry of the keyhole for the steady-state case and form an appropriate mesh that includes both the workpiece and the keyhole. In the second stage, we impose the boundary conditions by assigning a temperature to the walls of the keyhole and find the heat distribution and the shape of the liquid fraction for a given welding speed and material properties. We construct a fairly accurate approximation of the keyhole as a sequence of truncated cones. A formula for finding the initial radius of the keyhole is derived by determining the radius of the vaporisation isotherm for the line heat source. We report on the results of a series of computational experiments for various heat input values and welding velocities.
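The initial-radius calculation can be sketched as follows, assuming the classical moving line-source solution T(r) = T0 + q'/(2πk)·K0(vr/2α) evaluated on the transverse axis, and solving K0(vr/2α) = 2πk(Tvap − T0)/q' for r by bisection. The paper's actual formula and the material values below are not given in the abstract; everything here is an illustrative assumption.

```python
import math

def k0(x, n=2000, tmax=20.0):
    """Modified Bessel function K0 via its integral representation
    K0(x) = ∫_0^∞ exp(-x·cosh t) dt, with a simple trapezoid rule."""
    h = tmax / n
    s = 0.5 * (math.exp(-x) + math.exp(-x * math.cosh(tmax)))
    for i in range(1, n):
        s += math.exp(-x * math.cosh(i * h))
    return s * h

def vaporisation_radius(q_per_depth, k, alpha, v, T_vap, T0):
    """Radius r at which the moving line-source temperature field
    T(r) = T0 + q'/(2·pi·k) · K0(v·r/(2·alpha)) reaches T_vap,
    evaluated on the transverse axis.  Solved by bisection, using the
    fact that K0 decreases monotonically with r."""
    target = 2.0 * math.pi * k * (T_vap - T0) / q_per_depth
    f = lambda r: k0(v * r / (2.0 * alpha)) - target
    lo, hi = 1e-6, 0.1              # bracketing radii in metres
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:            # temperature still above T_vap: grow r
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Steel-like values (illustrative only): q' = 4e6 W/m, k = 30 W/(m·K),
# alpha = 8e-6 m^2/s, v = 0.02 m/s, T_vap = 3100 K, T0 = 300 K
r0 = vaporisation_radius(4e6, 30.0, 8e-6, 0.02, 3100.0, 300.0)
```

With these inputs the vaporisation isotherm lies at a radius of roughly 1.5 mm, a plausible order of magnitude for a keyhole mouth; the actual value depends strongly on the absorbed power per unit depth.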

Resumo:

Direct chill (DC) casting is a core primary process in the production of aluminum ingots. However, its operational optimization is still under investigation with regard to a number of features, one of which is the issue of curvature at the base of the ingot. Analysis of these features requires a computational model of the process that accounts for the fluid flow, heat transfer, solidification phase change, and thermomechanical analysis. This article describes an integrated approach to the modeling of all the preceding phenomena and their interactions.