32 results for Double cantilever beam test

in Helda - Digital Repository of the University of Helsinki


Relevance:

30.00%

Publisher:

Abstract:

Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge, as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue correlates with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations are not directly applicable to BNCT dosimetry. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis, the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found to be unaffected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has been generally reproducible within the recommended tolerance value of 2%. An established toolkit for the determination of the dose components in epithermal neutron beams is presented and applied in an international dosimetric intercomparison. The quantities measured by the groups in the intercomparison (neutron flux, fast neutron dose and photon dose) were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centres. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. 
The presented results exclude severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, the comparison with ionisation chamber results and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution normalised to the dose maximum measured by the MAGIC polymer gel agreed well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.
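In BNCT, the total biologically weighted dose is commonly reported as the sum of the separate physical dose components, each multiplied by a component-specific weighting factor. A minimal sketch of that bookkeeping; the component names and weighting factors below are illustrative placeholders, not values from the thesis or any clinical protocol:

```python
# Sketch: combining BNCT dose components into one weighted dose.
# Component names and weighting factors are illustrative placeholders,
# not values from the thesis or any clinical protocol.

def weighted_dose(components, weights):
    """Sum physical dose components (Gy), each scaled by its weighting factor."""
    missing = set(components) - set(weights)
    if missing:
        raise ValueError(f"no weighting factor for: {sorted(missing)}")
    return sum(dose * weights[name] for name, dose in components.items())

# Physical dose components of one irradiation (Gy) -- made-up numbers.
components = {"boron": 10.0, "thermal_neutron": 1.0,
              "fast_neutron": 0.5, "photon": 2.0}
# Illustrative component weighting factors.
weights = {"boron": 3.8, "thermal_neutron": 3.2,
           "fast_neutron": 3.2, "photon": 1.0}

total = weighted_dose(components, weights)  # single weighted dose figure
```

Keeping the components separate until this final summation mirrors the intercomparison above, where each component (neutron flux, fast neutron dose, photon dose) carries its own measurement uncertainty.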

Relevance:

30.00%

Publisher:

Abstract:

The likelihood ratio test of cointegration rank is the most widely used test for cointegration. Many studies have shown that its finite-sample distribution is not well approximated by the limiting distribution. The article introduces bootstrap and fast double bootstrap (FDB) algorithms for the likelihood ratio test and evaluates them by Monte Carlo simulation experiments. It finds that the performance of the bootstrap test is very good. The more sophisticated FDB produces a further improvement in cases where the performance of the asymptotic test is very unsatisfactory and the ordinary bootstrap does not work as well as it might. Furthermore, the Monte Carlo simulations provide a number of guidelines on when the bootstrap and FDB tests can be expected to work well. Finally, the tests are applied to US interest rate and international stock price series. The asymptotic test is found to overestimate the cointegration rank, while the bootstrap and FDB tests choose the correct cointegration rank.
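The bootstrap principle behind these tests can be illustrated with a generic parametric bootstrap of a test statistic. This is a simplified sketch of that principle only, not the cointegration-rank likelihood ratio test or the FDB algorithm of the article:

```python
# Generic parametric bootstrap of a test statistic -- an illustrative
# sketch of the bootstrap principle, not the cointegration-rank LR test
# or the fast double bootstrap of the article.
import numpy as np

def bootstrap_pvalue(statistic, simulate_null, observed_data,
                     n_boot=999, seed=0):
    """P-value: share of null-simulated statistics at least as extreme
    as the observed one (with the usual +1 correction)."""
    rng = np.random.default_rng(seed)
    t_obs = statistic(observed_data)
    t_boot = np.array([statistic(simulate_null(rng))
                       for _ in range(n_boot)])
    return (1 + np.sum(t_boot >= t_obs)) / (n_boot + 1)

# Toy example: test H0 "mean = 0" using |sample mean| as the statistic.
rng = np.random.default_rng(42)
data = rng.normal(loc=0.5, scale=1.0, size=100)  # true mean is 0.5
p = bootstrap_pvalue(statistic=lambda x: abs(x.mean()),
                     simulate_null=lambda r: r.normal(0.0, 1.0, size=100),
                     observed_data=data)          # small p: H0 rejected
```

The bootstrap replaces the poorly fitting asymptotic distribution with the empirical distribution of the statistic under simulated null data; the FDB refines this by bootstrapping the bootstrap p-value itself.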

Relevance:

20.00%

Publisher:

Abstract:

This study concentrates on the contested concept of pastiche in literary studies. It offers the first detailed examination of the history of the concept from its origins in the seventeenth century to the present, showing how pastiche emerged as a critical concept in interaction with the emerging conception of authorial originality and the copyright laws protecting it. One of the key results of this investigation is the contextualisation of the postmodern debate on pastiche. Even though postmodern critics often emphasise the radical novelty of pastiche, they in fact resuscitate older positions and arguments without necessarily reflecting on their historical conditions. This historical background is then used to analyse the distinction between the primarily French conception of pastiche as the imitation of style and the postmodern notion of it as the compilation of different elements. The latter's vagueness and inclusiveness detracts from its value as a critical concept. The study thus concentrates on the notion of stylistic pastiche, challenging the widespread prejudice that it is merely an indication of lack of talent. Because it is based on repetition in multiple ways, pastiche is in fact a highly ambiguous or double-edged practice that calls into question the distinction between repetition and original, thereby undermining the received notion of individual unique authorship as a fundamental aesthetic value. Pastiche does not, however, constitute a radical upheaval of the basic assumptions on which the present institution of literature relies, since, in order to mark its difference, pastiche always refers to a source outside itself against which its difference is measured. Finally, the theoretical analysis of pastiche is applied to literary works. 
The pastiches written by Marcel Proust demonstrate how pastiche can become an integral part of a writer's poetics: imitation of style is shown to provide Proust with a way of exploring the role of style as a connecting point between inner vision and reality. The pastiches of the Sherlock Holmes stories by Michael Dibdin, Nicholas Meyer and the duo Adrian Conan Doyle and John Dickson Carr illustrate the functions of pastiche within a genre, detective fiction, that is itself fundamentally repetitive. A.S. Byatt's Possession and D.M. Thomas's Charlotte use Victorian pastiches to investigate the conditions of literary creation in the age of postmodern suspicion of creativity and individuality. The study thus argues that the concept of pastiche has valuable insights to offer to literary criticism and theory, and that literary pastiches, though often dismissed in reviews and criticism, are a particularly interesting object of study precisely because of their characteristic ambiguity.

Relevance:

20.00%

Publisher:

Abstract:

Multiple sclerosis (MS) is a chronic, inflammatory disease of the central nervous system, characterized especially by myelin and axon damage. Cognitive impairment in MS is common but difficult to detect without a neuropsychological examination. Valid and reliable methods are needed in clinical practice and research to detect deficits, follow their natural evolution, and verify treatment effects. The Paced Auditory Serial Addition Test (PASAT) is a measure of sustained and divided attention, working memory, and information processing speed, and it is widely used in the neuropsychological evaluation of MS patients. Additionally, the PASAT is the sole cognitive measure in an assessment tool primarily designed for MS clinical trials, the Multiple Sclerosis Functional Composite (MSFC). The aims of the present study were to determine a) the frequency, characteristics, and evolution of cognitive impairment among relapsing-remitting MS patients, and b) the validity and reliability of the PASAT in measuring cognitive performance in MS patients. The subjects were 45 relapsing-remitting MS patients from the Department of Neurology of Seinäjoki Central Hospital and 48 healthy controls. Both groups underwent comprehensive neuropsychological assessments, including the PASAT, twice in a one-year follow-up; additionally, a sample of 10 patients and controls was evaluated with the PASAT in serial assessments five times in one month. The frequency of cognitive dysfunction among relapsing-remitting MS patients in the present study was 42%. Impairments were characterized especially by slowed information processing speed and memory deficits. During the one-year follow-up, cognitive performance was relatively stable among MS patients at the group level. However, the practice effects in cognitive tests were less pronounced among MS patients than among healthy controls. 
At the individual level, the spectrum of MS patients' cognitive deficits was wide with regard to their characteristics, severity, and evolution. The PASAT was moderately accurate in detecting MS-associated cognitive impairment: 69% of patients were correctly classified as cognitively impaired or unimpaired when the comprehensive neuropsychological assessment was used as the "gold standard". Self-reported nervousness and poor arithmetical skills seemed to explain misclassifications. MS-related fatigue was objectively demonstrated as fading performance towards the end of the test. Despite the observed practice effect, the reliability of the PASAT was excellent, and it was sensitive to the cognitive decline taking place during the follow-up in a subgroup of patients. The PASAT can be recommended for use in the neuropsychological assessment of MS patients. The test is fairly sensitive but less specific; consequently, the reasons for low scores have to be carefully identified before interpreting them as clinically significant.

Relevance:

20.00%

Publisher:

Abstract:

Advanced stage head and neck cancers (HNC) with distant metastasis, as well as prostate cancers (PC), are devastating diseases currently lacking efficient treatment options. One promising developmental approach in cancer treatment is the use of oncolytic adenoviruses, especially in combination therapy with conventional cancer therapies. The safety of the approach has been tested in many clinical trials; however, antitumor efficacy needs to be improved in order to establish oncolytic viruses as a viable treatment alternative. To test in vivo the antitumor efficacy of a multimodal combination of oncolytic adenoviruses with the standard therapeutic combination of radiotherapy, chemotherapy and the monoclonal antibody (mAb) cetuximab, a xenograft HNC tumor model was developed. This model mimics the typical clinical situation, as it is initially sensitive to cetuximab but resistance develops eventually. Surprisingly, but in agreement with recent findings for chemotherapy and radiotherapy, a higher proportion of cells positive for HNC cancer stem cell markers was found in the tumors refractory to cetuximab. Both the in vitro and in vivo results of this study support the multimodal combination of oncolytic adenoviruses with chemotherapy, radiotherapy and monoclonal antibody therapy to achieve increased antitumor efficacy, and even complete tumor eradication, with lower treatment doses. It was also found that capsid-modified oncolytic viruses have increased gene transfer to cancer cells as well as an increased antitumor effect. In order to elucidate the mechanism by which oncolytic viruses promote radiosensitization of tumor cells in vivo, replication-deficient viruses expressing several promising radiosensitizing viral proteins were tested. The results of this study indicated that oncolytic adenoviruses promote radiosensitization by delaying the repair of DNA double-strand breaks in tumor cells. 
Based on the promising data of the first study, two tumor double-targeted oncolytic adenoviruses were produced, armed with the fusion suicide gene FCU1 or with a fully human mAb specific for human Cytotoxic T Lymphocyte-Associated Antigen 4 (CTLA-4). FCU1 encodes a bifunctional fusion protein that efficiently catalyzes the direct conversion of 5-FC, a relatively nontoxic antifungal agent, into the toxic metabolites 5-fluorouracil and 5-fluorouridine monophosphate, bypassing the natural resistance of certain human tumor cells to 5-fluorouracil. The anti-CTLA4 mAb promotes direct killing of tumor cells via apoptosis and, most importantly, immune system activation against the tumors. These armed oncolytic viruses showed increased antitumor efficacy both in vitro and in vivo. Furthermore, by taking advantage of the unique tumor-targeted gene transfer of oncolytic adenoviruses, functional high tumor titers but low systemic concentrations of the armed proteins were generated. In addition, supernatants of tumor cells infected with Ad5/3-24aCTLA4, which contain the anti-CTLA4 mAb, were able to effectively immunomodulate peripheral blood mononuclear cells (PBMC) of cancer patients with advanced tumors. In conclusion, the results presented in this thesis suggest that genetically engineered oncolytic adenoviruses have great potential in the treatment of advanced and metastatic HNC and PC.

Relevance:

20.00%

Publisher:

Abstract:

In dentistry, basic imaging techniques such as intraoral and panoramic radiography are in most cases the only imaging techniques required for the detection of pathology. Conventional intraoral radiographs provide images with sufficient information for most dental radiographic needs. Panoramic radiography produces a single image of both jaws, giving an excellent overview of oral hard tissues. Regardless of the technique, plain radiography has only a limited capability in the evaluation of three-dimensional (3D) relationships. Technological advances in radiological imaging have moved from two-dimensional (2D) projection radiography towards digital, 3D and interactive imaging applications. This has been achieved first by the use of conventional computed tomography (CT) and more recently by cone beam CT (CBCT). CBCT is a radiographic imaging method that allows accurate 3D imaging of hard tissues. CBCT has been used for dental and maxillofacial imaging for more than ten years and its availability and use are increasing continuously. However, at present, only best practice guidelines are available for its use, and the need for evidence-based guidelines on the use of CBCT in dentistry is widely recognized. We evaluated (i) retrospectively the use of CBCT in a dental practice, (ii) the accuracy and reproducibility of pre-implant linear measurements in CBCT and multislice CT (MSCT) in a cadaver study, (iii) prospectively the clinical reliability of CBCT as a preoperative imaging method for complicated impacted lower third molars, and (iv) the tissue and effective radiation doses and image quality of dental CBCT scanners in comparison with MSCT scanners in a phantom study. Using CBCT, subjective identification of anatomy and pathology relevant in dental practice can be readily achieved, but dental restorations may cause disturbing artefacts. CBCT examination offered additional radiographic information when compared with intraoral and panoramic radiographs. 
In terms of the accuracy and reliability of linear measurements in the posterior mandible, CBCT is comparable to MSCT. CBCT is a reliable means of determining the location of the inferior alveolar canal and its relationship to the roots of the lower third molar. CBCT scanners provided adequate image quality for dental and maxillofacial imaging while delivering considerably smaller effective doses to the patient than MSCT. The observed variations in patient dose and image quality emphasize the importance of optimizing the imaging parameters in both CBCT and MSCT.

Relevance:

20.00%

Publisher:

Abstract:

Matrix metalloproteinase (MMP) -8, collagenase-2, is a key mediator of irreversible tissue destruction in chronic periodontitis and is detectable in gingival crevicular fluid (GCF). MMP-8 mostly originates from neutrophil leukocytes, the first-line defence cells, which exist abundantly in GCF, especially in inflammation. MMP-8 is capable of degrading almost all extracellular matrix and basement membrane components and is especially efficient against type I collagen. Thus the expression of MMP-8 in GCF could be valuable in monitoring the activity of periodontitis and possibly offers a diagnostic means to predict the progression of periodontitis. In this study, the value of MMP-8 detection in GCF for the monitoring of periodontal health and disease was evaluated, with special reference to its ability to differentiate periodontal health from different disease states of the periodontium and to recognise the progression of periodontitis, i.e. active sites. For chair-side detection of MMP-8 in GCF or peri-implant sulcus fluid (PISF) samples, a dipstick test based on immunochromatography involving two monoclonal antibodies was developed. The immunoassay for the detection of MMP-8 in GCF was found to be more suitable for the monitoring of periodontitis than the detection of GCF elastase concentration or activity. Periodontally healthy subjects and individuals suffering from gingivitis or periodontitis could be differentiated by means of GCF MMP-8 levels and dipstick testing when the positive threshold value of the MMP-8 chair-side test was set at 1000 µg/l. MMP-8 dipstick test results from periodontally healthy subjects and from subjects with gingivitis were mainly negative, while periodontitis patients' sites with deep (≥ 5 mm) pockets that were bleeding on probing were most often test positive. Periodontitis patients' GCF MMP-8 levels decreased with hygiene-phase periodontal treatment (scaling and root planing, SRP) and decreased further during the three-month maintenance phase. 
The decrease in GCF MMP-8 levels could be monitored with the MMP-8 test. Agreement between the test stick and the quantitative assay was very good (κ = 0.81), and the test provided a baseline sensitivity of 0.83 and specificity of 0.96. During the 12-month longitudinal maintenance phase, periodontitis patients' progressing sites (sites with an increase in attachment loss ≥ 2 mm during the maintenance phase) had elevated GCF MMP-8 levels compared with stable sites. Mean MMP-8 concentrations in smokers' (S) sites were generally lower than in non-smokers' (NS) sites, but in progressing S and NS sites the concentrations were at an equal level. Sites with exceptionally and repeatedly elevated MMP-8 concentrations during the maintenance phase were clustered in smoking patients with a poor response to SRP (refractory patients). These sites in particular were identified by the MMP-8 test. Subgingival plaque samples from periodontitis patients' deep periodontal pockets were examined by polymerase chain reaction (PCR) to find out whether periodontal lesions may serve as a niche for Chlamydia pneumoniae. The findings were compared with the clinical periodontal parameters and GCF MMP-8 levels to determine the correlation with periodontal status. Traces of C. pneumoniae were identified in one periodontitis patient's pooled subgingival plaque sample by means of PCR. After periodontal treatment (SRP), the sample was negative for C. pneumoniae. The clinical parameters and biomarkers (MMP-8) of the patient with the positive C. pneumoniae finding did not differ from those of the other study patients. It was concluded that the MMP-8 concentrations in GCF of sites from periodontally healthy individuals, subjects with gingivitis, and subjects with periodontitis are at different levels. The cut-off value of the developed MMP-8 test is at an optimal level to differentiate between these conditions and can possibly be utilised in the identification of individuals at risk of the transition from gingivitis to periodontitis. 
In periodontitis patients, repeatedly elevated GCF MMP-8 concentrations may indicate sites at risk of progression of periodontitis, as well as patients with a poor response to conventional periodontal treatment (SRP). This can be monitored by MMP-8 testing. Despite the lower mean GCF MMP-8 concentrations in smokers, a fraction of smokers' sites expressed very high MMP-8 concentrations together with enhanced periodontal activity and could be identified with the MMP-8-specific chair-side test. Deep periodontal lesions may be niches for non-periodontopathogenic micro-organisms with systemic effects, such as C. pneumoniae, and possibly play a role in their transmission from one subject to another.
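Agreement (κ) and diagnostic accuracy figures of the kind reported for the dipstick test are derived from a 2×2 table of test results against the reference method. A minimal sketch using the standard definitions; the counts below are made up and are not the study's data:

```python
# Sensitivity, specificity and Cohen's kappa from a 2x2 table.
# The counts used in the example are illustrative, not the study's data.

def diagnostic_measures(tp, fp, fn, tn):
    """Return (sensitivity, specificity, kappa) for a 2x2 table."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    observed = (tp + tn) / n              # raw agreement
    # Chance agreement: products of the marginals, summed over both classes.
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (observed - expected) / (1 - expected)
    return sensitivity, specificity, kappa

sens, spec, kappa = diagnostic_measures(tp=40, fp=3, fn=8, tn=49)
```

Kappa corrects the raw agreement for the agreement expected by chance, which is why it is preferred over plain percentage agreement when comparing a quick test against a quantitative assay.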

Relevance:

20.00%

Publisher:

Abstract:

Various proposals exist for building a functional and fault-tolerant large-scale quantum computer. Topological quantum computation is a more exotic proposal, which makes use of the properties of quasiparticles manifest only in certain two-dimensional systems. These so-called anyons exhibit topological degrees of freedom, which, in principle, can be used to execute quantum computation with intrinsic fault-tolerance. This feature is the main incentive to study topological quantum computation. The objective of this thesis is to provide an accessible introduction to the theory. The thesis considers the theory of anyons arising in two-dimensional quantum mechanical systems that are described by gauge theories based on so-called quantum double symmetries. The quasiparticles are shown to exhibit interactions and carry quantum numbers which are both of a topological nature. In particular, it is found that the addition of the quantum numbers is not unique: the fusion of the quasiparticles is described by a non-trivial fusion algebra. It is discussed how this property can be used to encode quantum information in a manner which is intrinsically protected from decoherence, and how one could, in principle, perform quantum computation by braiding the quasiparticles. As an example of the presented general discussion, the particle spectrum and the fusion algebra of an anyon model based on the gauge group S_3 are explicitly derived. The fusion algebra is found to branch into multiple proper subalgebras, and the simplest of them is chosen as a model for an illustrative demonstration. The different steps of a topological quantum computation are outlined and the computational power of the model is assessed. It turns out that the chosen model is not universal for quantum computation. However, because the objective was a demonstration of the theory with explicit calculations, none of the other, more complicated fusion subalgebras were considered. 
Studying their applicability for quantum computation could be a topic of further research.
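A non-trivial fusion algebra of the kind described can be encoded directly as a multiplication table on particle labels, with some products having more than one outcome. As an illustration, the sketch below uses the two-particle Fibonacci anyon model, a standard textbook example, rather than the S_3 quantum double model derived in the thesis:

```python
# Fusion as a non-deterministic product on anyon labels.
# Fibonacci model ("1" = vacuum, "t" = tau): 1x1 = 1, 1xt = t, txt = 1 + t.
# Illustrative stand-in for a fusion algebra; not the S_3 model of the thesis.

FUSION = {
    ("1", "1"): {"1"},
    ("1", "t"): {"t"},
    ("t", "1"): {"t"},
    ("t", "t"): {"1", "t"},  # non-unique outcome: "addition" is not unique
}

def fuse(a, b):
    """Possible total charges of fusing anyons a and b."""
    return FUSION[(a, b)]

def fuse_many(labels):
    """All possible total charges of a chain of anyons, fused left to right."""
    outcomes = {labels[0]}
    for lab in labels[1:]:
        outcomes = {c for o in outcomes for c in fuse(o, lab)}
    return outcomes

# Two tau anyons can fuse to vacuum or to tau; this multi-dimensional
# fusion space is where topologically protected qubits would be encoded.
```

The non-uniqueness of the tau-tau product is exactly the property the abstract refers to: the fusion outcome is a degree of freedom that no local measurement can read out, which is what protects the encoded information.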

Relevance:

20.00%

Publisher:

Abstract:

ALICE (A Large Ion Collider Experiment) is an experiment at CERN (European Organization for Nuclear Research) in which a heavy-ion detector is dedicated to exploiting the unique physics potential of nucleus-nucleus interactions at LHC (Large Hadron Collider) energies. As part of that project, 716 so-called type V4 modules were assembled in the Detector Laboratory of the Helsinki Institute of Physics during the years 2004-2006. With altogether over a million detector strips, this has been the most massive particle detector project in the science history of Finland. One ALICE SSD module consists of a double-sided silicon sensor and two hybrids containing 12 HAL25 front-end readout chips and some passive components, such as resistors and capacitors. The components are connected together by TAB (Tape Automated Bonding) microcables. The components of the modules were tested in every assembly phase with comparable electrical tests to ensure the reliable functioning of the detectors and to pinpoint possible problems. Components were accepted or rejected according to the limits confirmed by the ALICE collaboration. This study concentrates on the test results of the framed chips, hybrids and modules. The total yield of the framed chips is 90.8%, of the hybrids 96.1%, and of the modules 86.2%. The individual test results have been investigated in the light of the known error sources that appeared during the project. After the problems appearing during the learning curve of the project were solved, material problems, such as defective chip cables and sensors, caused most of the assembly rejections. These problems were typically seen in tests as too many individual channel failures. Bonding failures, by contrast, rarely caused the rejection of any component. One sensor type among the three sensor manufacturers proved to have lower quality than the others. 
The sensors of this manufacturer are very noisy, and their depletion voltages are usually outside the specification given to the manufacturers. Reaching a 95% assembly yield during module production demonstrates that the assembly process has been highly successful.

Relevance:

20.00%

Publisher:

Abstract:

Continuous epidural analgesia (CEA) and continuous spinal postoperative analgesia (CSPA) provided by a mixture of local anaesthetic and opioid are widely used for postoperative pain relief. With the introduction of so-called microcatheters, for example, CSPA found its way particularly into orthopaedic surgery. These techniques, however, may be associated with dose-dependent side-effects such as hypotension, weakness in the legs, and nausea and vomiting. At times, they may fail to offer sufficient analgesia, e.g. because of a misplaced catheter. The correct position of an epidural catheter might be confirmed by the supposedly easy and reliable epidural stimulation test (EST). The aims of this thesis were to determine a) whether the efficacy, tolerability, and reliability of CEA might be improved by adding the α2-adrenergic agonists adrenaline and clonidine to CEA and by the repeated use of EST during CEA; and b) the feasibility of CSPA given through a microcatheter after vascular surgery. Studies I-IV were double-blinded, randomized, and controlled trials; Study V was of a diagnostic, prospective nature. Patients underwent arterial bypass surgery of the legs (I, n=50; IV, n=46), total knee arthroplasty (II, n=70; III, n=72), and abdominal surgery or thoracotomy (V, n=30). Postoperative lumbar CEA consisted of regular mixtures of ropivacaine and fentanyl, either without or with adrenaline (2 µg/ml (I) and 4 µg/ml (II)) or clonidine (2 µg/ml (III)). CSPA (IV) was given through a microcatheter (28G) and contained either ropivacaine (max. 2 mg/h) or a mixture of ropivacaine (max. 1 mg/h) and morphine (max. 8 µg/h). The epidural catheter tip position (V) was evaluated both by EST, at the moment of catheter placement and several times during CEA, and by epidurography as the reference diagnostic test. CEA and CSPA were administered for 24 or 48 h. The study parameters included pain scores assessed with a visual analogue scale, requirements of rescue pain medication, vital signs, and side-effects. 
Adrenaline (I and II) had no beneficial influence on the efficacy or tolerability of CEA. The total amounts of epidurally infused drugs were even increased in the adrenaline group in Study II (p=0.02, RM ANOVA). Clonidine (III) augmented pain relief with lowered amounts of epidurally infused drugs (p=0.01, RM ANOVA) and a reduced need for rescue oxycodone given i.m. (p=0.027, MW-U; median difference 3 mg (95% CI 0-7 mg)). Clonidine did not contribute to sedation, and its influence on haemodynamics was minimal. CSPA (IV) provided satisfactory pain relief with only limited blockade of the legs (no inter-group differences). EST (V) was often related to technical problems and difficulties of interpretation; e.g., it failed to identify the four patients whose catheters were outside the spinal canal already at the time of catheter placement. As an adjuvant to lumbar CEA, clonidine only slightly improved pain relief, while adrenaline did not provide any benefit. The role of EST applied at the time of epidural catheter placement or repeatedly during CEA remains open. The microcatheter CSPA technique appeared effective and reliable, but needs to be compared to routine CEA after peripheral arterial bypass surgery.

Relevance:

20.00%

Publisher:

Abstract:

Objectives: To evaluate the applicability of visual feedback posturography (VFP) for the quantification of postural control, and to characterize the horizontal angular vestibulo-ocular reflex (AVOR) by use of a novel motorized head impulse test (MHIT). Methods: In VFP, subjects standing on a platform were instructed to move their center of gravity to symmetrically placed peripheral targets as fast and accurately as possible. The active postural control movements were measured in healthy subjects (n = 23) and in patients with vestibular schwannoma (VS) before surgery (n = 49), one month (n = 17) and three months (n = 36) after surgery. In MHIT, we recorded head and eye position during motorized head impulses (mean velocity 170°/s, acceleration 1550°/s²) in healthy subjects (n = 22) and in patients with VS before surgery (n = 38) and about four months afterwards (n = 27). The gain, asymmetry and latency in MHIT were calculated. Results: The intraclass correlation coefficient for VFP parameters during repeated tests was significant (r = 0.78-0.96; p < 0.01), although two of the four VFP parameters improved slightly during five test sessions in controls. At least one VFP parameter was abnormal pre- and postoperatively in almost half the patients, and these abnormal preoperative VFP results correlated significantly with abnormal postoperative results. The mean accuracy of postural control in patients was reduced pre- and postoperatively. A significant side difference with VFP was evident in 10% of patients. In MHIT, the normal gain was close to unity, the asymmetry in gain was within 10%, and the latency was 3.4 ± 6.3 milliseconds (mean ± standard deviation). Ipsilateral gain or asymmetry in gain was preoperatively abnormal in 71% of patients, whereas it was abnormal in every patient after surgery. The preoperative gain (mean ± 95% confidence interval) was significantly lowered, at 0.83 ± 0.08 on the ipsilateral side compared to 0.98 ± 0.06 on the contralateral side. 
The ipsilateral postoperative mean gain of 0.53 ± 0.05 was significantly different from the preoperative gain. Conclusion: VFP is a repeatable, quantitative method for assessing active postural control within individual subjects. Mean postural control in patients with VS was disturbed before and after surgery, although not severely. A side difference in postural control in VFP was rare. The horizontal AVOR results in healthy subjects and in patients with VS, measured with MHIT, were in agreement with published data obtained using other head impulse techniques. MHIT is a non-invasive method which allows reliable clinical assessment of the horizontal AVOR.
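The gain and asymmetry measures reported here reduce to simple ratios of eye to head velocity and of the two side-specific gains. A hedged sketch of how such quantities could be computed from velocity traces; the study's actual signal-processing pipeline is not specified in the abstract and is not reproduced:

```python
# AVOR gain and gain asymmetry from velocity traces (illustrative sketch;
# not the study's actual signal-processing pipeline).
import numpy as np

def vor_gain(head_velocity, eye_velocity):
    """Gain: peak compensatory eye speed over peak head speed."""
    return np.max(np.abs(eye_velocity)) / np.max(np.abs(head_velocity))

def gain_asymmetry(gain_left, gain_right):
    """Relative side difference of the two gains (0 = symmetric)."""
    return abs(gain_left - gain_right) / (gain_left + gain_right)

t = np.linspace(0.0, 0.15, 151)             # one 150 ms head impulse
head = 170.0 * np.sin(2 * np.pi * t / 0.3)  # peak ~170 deg/s, as in MHIT
eye = -0.95 * head                          # compensatory, gain below unity
g = vor_gain(head, eye)
asym = gain_asymmetry(0.98, 0.83)           # the abstract's preoperative gains
```

With the abstract's preoperative side gains of 0.98 and 0.83, this relative-difference definition gives an asymmetry of roughly 8%, consistent with the stated normal limit of 10%.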

Relevance:

20.00%

Publisher:

Abstract:

The metabolic syndrome and type 1 diabetes are associated with brain alterations such as cognitive decline, brain infarctions, atrophy, and white matter lesions. Despite the importance of these alterations, their pathomechanism is still poorly understood. This study was conducted to investigate brain glucose and metabolites in healthy individuals with an increased cardiovascular risk and in patients with type 1 diabetes, in order to discover more about the nature of the known brain alterations. We studied 43 men aged 20 to 45 years. Study I compared two groups of non-diabetic men, one with an accumulation of cardiovascular risk factors and one without. Studies II to IV compared men with type 1 diabetes (duration of diabetes 6.7 ± 5.2 years, no microvascular complications) with non-diabetic men. Brain glucose, N-acetylaspartate (NAA), total creatine (tCr), choline, and myo-inositol (mI) were quantified with proton magnetic resonance spectroscopy in three cerebral regions (frontal cortex, frontal white matter, and thalamus) and in cerebellar white matter. Data were collected from all participants during fasting glycemia and, in a subgroup (Studies III and IV), also during a hyperglycemic clamp that increased the plasma glucose concentration by 12 mmol/l. In non-diabetic men, the brain glucose concentration correlated linearly with the plasma glucose concentration. The cardiovascular risk group (Study I) had a 13% higher plasma glucose concentration than the control group but no difference in thalamic glucose content; the risk group thus had a lower thalamic glucose content than expected. They also had 17% increased tCr (a marker of oxidative metabolism). In the control group, tCr correlated with thalamic glucose content, but in the risk group, tCr correlated instead with the fasting plasma glucose and the 2-h plasma glucose concentration in the oral glucose tolerance test. 
Risk factors of the metabolic syndrome, most importantly insulin resistance, may thus influence brain metabolism. During fasting glycemia (Study II), regional variation in cerebral glucose levels appeared in the non-diabetic subjects but not in those with diabetes. In the diabetic patients, excess glucose had accumulated predominantly in the white matter, where the metabolite alterations were also the most pronounced: compared with the control values, white matter NAA (a marker of neuronal metabolism) was 6% lower and mI (a glial cell marker) 20% higher. Hyperglycemia is therefore a potent risk factor for diabetic brain disease, and the metabolic brain alterations may appear even before any peripheral microvascular complications are detectable. During acute hyperglycemia (Study III), the increase in cerebral glucose content in the patients with type 1 diabetes was, depending on the brain region, between 1.1 and 2.0 mmol/l. An everyday hyperglycemic episode in a diabetic patient may therefore as much as double the brain glucose concentration. While chronic hyperglycemia had led to accumulation of glucose in the white matter, acute hyperglycemia burdened predominantly the gray matter. Acute hyperglycemia also revealed that chronic fluctuation in blood glucose may be associated with alterations in glucose uptake or metabolism in the thalamus. The cerebellar white matter behaved very differently from the cerebral white matter (Study IV): in the non-diabetic men it contained twice as much glucose as the cerebrum, and diabetes had altered neither its glucose content nor its metabolites. The cerebellum therefore seems more resistant to the effects of hyperglycemia than the cerebrum.
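The linear brain–plasma glucose relation and the clamp arithmetic above can be illustrated with a small sketch. The slope and intercept below are hypothetical, not fitted values from the study; they are chosen only to reproduce the reported orders of magnitude (a +12 mmol/l clamp raising regional brain glucose by roughly 1.1–2.0 mmol/l, about a doubling):

```python
# Illustrative sketch only: slope and intercept are hypothetical placeholders,
# not parameters reported in the study.

def brain_glucose(plasma_mmol_l, slope=0.11, intercept=0.7):
    """Hypothetical linear model of brain vs. plasma glucose (mmol/l)."""
    return slope * plasma_mmol_l + intercept

fasting = brain_glucose(5.5)        # typical fasting plasma glucose, mmol/l
clamp = brain_glucose(5.5 + 12.0)   # hyperglycemic clamp: +12 mmol/l plasma
increase = clamp - fasting          # falls within the reported 1.1-2.0 mmol/l
print(f"fasting {fasting:.2f}, clamp {clamp:.2f}, increase {increase:.2f}")
```

With these placeholder parameters the clamp roughly doubles the modelled brain glucose, mirroring the "as much as double" finding; the real regional values in the study varied between 1.1 and 2.0 mmol/l.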

Relevance:

20.00%

Publisher:

Abstract:

Radiation therapy (RT) currently plays a significant role in the curative treatment of several cancers. External beam RT is delivered mostly with megavoltage beams from linear accelerators. Tumor eradication and normal tissue complications correlate with the dose absorbed in the tissues. This dependence is normally steep, so it is crucial that the actual dose within the patient corresponds accurately to the planned dose. All factors in an RT procedure carry uncertainties, which requires strict quality assurance. From a hospital physicist's point of view, technical quality control (QC), dose calculations, and methods for verifying the correct treatment location are the most important subjects. The most important element of technical QC is verifying that the radiation production of an accelerator, called the output, stays within narrow acceptable limits. The output measurements are carried out according to a locally chosen dosimetric QC program that defines the measurement time interval and the action levels. Dose calculation algorithms must be configured for the accelerators using measured beam data; the uncertainty of these data sets a limit on the best achievable calculation accuracy. All these dosimetric measurements require considerable experience, are laborious, consume resources needed for treatments, and are prone to several random and systematic sources of error. Appropriate verification of the treatment location is more important in intensity modulated radiation therapy (IMRT) than in conventional RT, because of the steep dose gradients produced within, or close to, healthy tissues located only a few millimetres from the target volume. This thesis investigated the quality of dosimetric measurements, the efficacy of dosimetric QC programs, the verification of measured beam data, and the effect of positional errors on the dose received by the major salivary glands in head and neck IMRT.
A method was developed for estimating how the choice of dosimetric QC program affects the overall uncertainty of the dose, and data were provided to facilitate the choice of a sufficient QC program. The method takes into account the local output stability and the reproducibility of the dosimetric QC measurements; a method based on model fitting of the QC measurement results was proposed for estimating both of these factors. The reduction of random measurement errors and the optimization of the QC procedure were also investigated, and a method and suggestions were presented for these purposes. The accuracy of beam data was evaluated in Finnish RT centres, and a sufficient accuracy level was estimated for the beam data. A method based on the use of reference beam data was developed for the QC of beam data. Dosimetric and geometric accuracy requirements were evaluated for head and neck IMRT when the function of the major salivary glands is to be spared; these criteria are based on the dose response obtained for the glands. Random measurement errors could be reduced, enabling the lowering of action levels and the prolongation of the measurement time interval from one month to as long as six months while maintaining dose accuracy. The combined effect of the proposed methods, suggestions, and criteria was found to help avoid maximal dose errors of up to about 8%. In addition, their use may make the strictest recommended overall dose accuracy level of 3% (1 SD) achievable.
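The uncertainty bookkeeping behind such an analysis can be sketched as a quadrature sum of independent components. This is a minimal generic illustration, not the thesis method itself, and the component names and 1 SD values below are hypothetical placeholders:

```python
import math

# Minimal sketch: independent 1-SD uncertainty components (in %) are
# combined in quadrature to estimate the overall dose uncertainty.
# The component names and values are hypothetical placeholders.

def combined_uncertainty(components_pct):
    """Quadrature sum of independent 1-SD uncertainty components (%)."""
    return math.sqrt(sum(c ** 2 for c in components_pct))

components = {
    "output stability": 1.0,         # accelerator output drift between QC checks
    "QC reproducibility": 0.8,       # repeatability of the output measurement
    "beam data / calculation": 1.5,  # dose calculation configured from beam data
}
total = combined_uncertainty(components.values())
print(f"combined dose uncertainty: {total:.2f} % (1 SD)")
```

With these placeholder values the combined uncertainty stays below the strictest recommended 3% (1 SD) level; a longer measurement interval is acceptable only as long as the output-stability component does not push the quadrature sum past the chosen action level.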