838 results for Methods and Techniques
Abstract:
In this thesis, two single nucleotide polymorphism (SNP) genotyping techniques were set up at the Finnish Genome Center, pooled genotyping was evaluated as a screening method for large-scale association studies, and these approaches were then used to identify genetic factors predisposing to two distinct complex diseases, utilizing large epidemiological cohorts and taking environmental factors into account. The first genotyping platform was based on a traditional but improved restriction fragment length polymorphism (RFLP) method utilizing 384-well microtiter plates, multiplexing, small reaction volumes (5 µl), and automated genotype calling. We participated in the development of the second genotyping method, based on single nucleotide primer extension (SNuPe™ by Amersham Biosciences), by carrying out the alpha and beta tests for the chemistry and the allele-calling software. Both techniques proved accurate, reliable, and suitable for projects with thousands of samples and tens of markers. Pooled genotyping (genotyping of pooled rather than individual DNA samples) was evaluated with Sequenom's MassArray MALDI-TOF, in addition to the SNuPe™ and PCR-RFLP techniques. We used MassArray mainly as a point of comparison, because it is known to be well suited for pooled genotyping. All three methods were shown to be accurate, with standard deviations between measurements of 0.017 for MassArray, 0.022 for PCR-RFLP, and 0.026 for SNuPe™. The largest source of error in pooled genotyping was shown to be the volumetric error, i.e., the preparation of the pools. We also demonstrated that it would have been possible to narrow down the genetic locus underlying congenital chloride diarrhea (CLD), an autosomal recessive disorder, by using the pooling technique instead of genotyping individual samples.
Although the approach is well suited for traditional case-control studies, it is difficult to apply if any stratification based on environmental factors is needed. We therefore chose to continue with individual genotyping in the subsequent association studies. Samples in two separate large epidemiological cohorts were genotyped with the PCR-RFLP and SNuPe™ techniques. The first association study concerned various pregnancy complications among 100,000 consecutive pregnancies in Finland, of which we genotyped 2292 patients and controls, in addition to a population sample of 644 blood donors, for 7 polymorphisms in potentially prothrombotic genes. This thesis includes the analysis of a sub-study of pregnancy-related venous thromboses. We showed that factor V Leiden, unlike the other tested polymorphisms, had a fairly large impact on pregnancy-related venous thrombosis (odds ratio 11.6; 95% CI 3.6-33.6), which increased multiplicatively when combined with other risk factors such as obesity or advanced age. Owing to our study design, we were also able to estimate the risks at the population level. The second epidemiological cohort was the Helsinki Birth Cohort of men and women born during 1924-1933 in Helsinki. The aim was to identify genetic factors that might modify the well-known link between small birth size and adult metabolic diseases, such as type 2 diabetes and impaired glucose tolerance. Among ~500 individuals with detailed birth measurements and a current metabolic profile, we found that an insertion/deletion polymorphism of the angiotensin converting enzyme (ACE) gene was associated with the duration of gestation and with weight and length at birth. Interestingly, the ACE insertion allele was also associated with higher indices of insulin secretion (p=0.0004) in adult life, but only among individuals who were born small (those in the lowest third of birth weight).
Likewise, low birth weight was associated with higher indices of insulin secretion (p=0.003), but only among carriers of the ACE insertion allele. An association with birth measurements was also found for a common haplotype of the glucocorticoid receptor (GR) gene. Furthermore, the association between short length at birth and adult impaired glucose tolerance was confined to carriers of this haplotype (p=0.007). These associations exemplify the interaction between environmental factors and genotype, which, possibly through altered gene expression, predisposes to complex metabolic diseases. Indeed, we showed that the common GR gene haplotype was associated with reduced mRNA expression in the thymus of three individuals (p=0.0002).
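As a worked illustration of the odds-ratio statistic reported in the abstract above, the sketch below computes an odds ratio with a Wald 95% confidence interval from a 2x2 case-control table. The counts are hypothetical, chosen only so the point estimate lands near the reported 11.6; they are not the study's data, and the study's actual CI was likely derived from its own model.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% confidence interval from a 2x2 table.

    a: exposed cases,    b: unexposed cases,
    c: exposed controls, d: unexposed controls.
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of summed reciprocal cell counts.
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data):
or_, lo, hi = odds_ratio_ci(9, 20, 30, 774)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # -> 11.61 4.88 27.63
```

The multiplicative risk increase mentioned in the abstract corresponds to multiplying such odds ratios for co-occurring risk factors on this log-odds scale.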
Abstract:
During the last 10-15 years, interest in mouse behavioural analysis has grown considerably. The driving force has been the development of molecular biology techniques that allow manipulation of the mouse genome by changing the expression of genes. Therefore, with some limitations, it is possible to study how genes participate in the regulation of physiological functions and to create models explaining the genetic contribution to various pathological conditions. The first aim of our study was to establish a framework for the behavioural phenotyping of genetically modified mice. We established a comprehensive battery of tests for the initial screening of mutant mice, including tests for exploratory and locomotor activity, emotional behaviour, sensory functions, and cognitive performance. Our interest was in the behavioural patterns of common background strains used for genetic manipulations in mice. Additionally, we studied the behavioural effects of sex differences, test history, and individual housing. Our findings highlight the importance of careful consideration of genetic background in the analysis of mutant mice. It was evident that some backgrounds may mask or modify the behavioural phenotype of mutants and thereby lead to false-positive or false-negative findings. Moreover, there is no universal strain that is equally suitable for all tests, and using different backgrounds allows one to address possible phenotype-modifying factors. We discovered that previous experience affected performance in several tasks. The most sensitive traits were exploratory and emotional behaviour, as well as motor and nociceptive functions. Therefore, it may be essential to repeat some of the tests in naïve animals to confirm the phenotype. Long-term social isolation had strong effects on exploratory behaviour, but also on learning and memory.
All experiments revealed significant interactions between strain and environmental factors (test history or housing condition), indicating genotype-dependent effects of environmental manipulations. This information was utilized in the analysis of several mutant lines. For example, we studied mice overexpressing, as well as mice lacking, the extracellular matrix protein heparin-binding growth-associated molecule (HB-GAM), and mice lacking N-syndecan (a receptor for HB-GAM). All mutant mice appeared fertile and healthy, without any apparent neurological or sensory defects. The lack of HB-GAM or N-syndecan, however, significantly reduced the learning capacity of the mice, whereas overexpression of HB-GAM facilitated learning. Moreover, HB-GAM knockout mice displayed increased anxiety-like behaviour, whereas anxiety was reduced in HB-GAM-overexpressing mice. Changes in hippocampal plasticity accompanied the behavioural phenotypes. We conclude that HB-GAM and N-syndecan are involved in the modulation of synaptic plasticity in the hippocampus and play a role in the regulation of anxiety- and learning-related behaviour.
Abstract:
The prognosis of patients with glioblastoma, the most malignant adult glial brain tumor, remains poor in spite of advances in treatment procedures, including surgical resection, irradiation, and chemotherapy. The genetic heterogeneity of glioblastoma warrants extensive studies in order to gain a thorough understanding of the biology of this tumor. While there have been several studies of global transcript profiling of glioma, with the identification of gene signatures for diagnosis and disease management, translation into the clinic has yet to happen. Serum biomarkers have the potential to revolutionize cancer diagnosis, grading, prognostication, and treatment-response monitoring. Besides the advantage that serum can be obtained through a less invasive procedure, it contains molecules spanning an extraordinary dynamic range of ten orders of magnitude in concentration. While conventional methods, such as 2DE, have been in use for many years, the ability to identify proteins through mass spectrometry techniques such as MALDI-TOF led to an explosion of interest in proteomics. Relatively new high-throughput proteomics methods, such as SELDI-TOF and protein microarrays, are expected to hasten serum biomarker discovery. This review highlights recent advances in proteomics platforms for discovering serum biomarkers and the current status of glioma serum markers. We aim to present the principles and potential of the latest proteomic approaches and their applications in the biomarker discovery process. Besides providing a comprehensive list of available serum biomarkers of glioma, we also propose how these markers could revolutionize the clinical management of glioma patients.
Abstract:
La0.5Li0.5TiO3 (LLTO) perovskite was synthesized by various wet chemical methods. Adopting low-temperature preparation methods prevents lithium loss from the material. LLTO formed with cubic symmetry at 1473 K, and at relatively lower temperatures when the hydrothermal preparation method was used. The PVA gel-decomposition route yielded tetragonal LLTO on annealing the dried gel at 1473 K. With the gel-carbonate route, a minor LiTi2O4 phase was found to remain even after heat treatment at 1473 K. Hydroxylation of LLTO was carried out in deionized water as well as in dilute acetic acid. During hydroxylation, incorporation of hydroxyls into, and leaching of Li+ out of, the material was observed. The Li+ concentration of these compositions was examined by AAS. The electrical conductivities were measured by dc and ac impedance techniques at elevated temperatures, and the activation energies of electrical conduction were estimated from the experimental results. The measured activation energy of Li+ conduction is 0.34 eV. Unhydroxylated samples exhibit only Li+ conduction, whereas hydroxylated LLTO shows proton conductivity at 298-550 K in addition to Li+ conductivity. Substitution of Zr or Ce for Ti was also attempted. La0.5Li0.5ZrO3 perovskite was not formed; instead, a pyrochlore phase (La2Zr2O7) along with monoclinic ZrO2 was observed above 1173 K, while below 1173 K cubic ZrO2 is stable. In the case of Ce substitution at the Ti sublattice, a (La0.5Li0.5)2CeO4 solid solution formed on heat treatment up to 1673 K. (c) 2005 Springer Science + Business Media, Inc.
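The activation energy quoted above is conventionally extracted from an Arrhenius plot of conductivity versus temperature. The sketch below shows the standard procedure: the slope of ln(σ) against 1/T is -Ea/kB. The conductivity values are synthetic, generated here with Ea = 0.34 eV purely for illustration; they are not the thesis's measurements, and the prefactor (1e-3) is arbitrary.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def activation_energy(temps_K, sigmas):
    """Estimate activation energy (eV) from an Arrhenius relation:
    ln(sigma) = ln(sigma0) - Ea / (k_B * T), so the slope of
    ln(sigma) versus 1/T is -Ea / k_B. Plain least-squares fit."""
    xs = [1.0 / t for t in temps_K]
    ys = [math.log(s) for s in sigmas]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * K_B

# Synthetic conductivities generated with Ea = 0.34 eV (illustrative only):
temps = [300, 350, 400, 450, 500, 550]
sigmas = [1e-3 * math.exp(-0.34 / (K_B * t)) for t in temps]
print(round(activation_energy(temps, sigmas), 2))  # -> 0.34
```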
Abstract:
Functional Electrical Stimulation (FES) is a technique that consists of applying electrical current pulses to artificially activate motor nerve fibers and produce muscle contractions that achieve functional movements. The main applications of FES are in rehabilitation, where the technique is used to aid recovery or to restore lost motor functions. People who benefit from FES are usually patients with neurological disorders that result in motor dysfunction; the most common are stroke and spinal cord injury (SCI) patients. Neuroprostheses are devices based on the FES technique whose aim is to bridge interrupted or damaged neural paths between the brain and the upper or lower limbs. One of their aims is to artificially generate muscle contractions that produce functional movements and thereby enable impaired people to perform activities of daily living (ADL). FES applies current pulses and stimulates nerve fibers by means of electrodes, which can be either implanted or surface electrodes; both have advantages and disadvantages. Implanted electrodes require open surgery to place them next to the nerve root, and therefore carry the many disadvantages of invasive techniques. In return, because the electrodes are attached to the nerve, it is easier to achieve selective functional movements. Surface electrodes, on the contrary, are non-invasive and are easily attached to or detached from the skin. Their main disadvantages are the difficulty of selectively stimulating nerve fibers and the uncomfortable sensation perceived by users, due to the sensory nerves located in the skin. Electrical stimulation surface electrode technology has improved significantly over the years, and multi-field electrodes have recently been proposed.
This multi-field or matrix electrode approach brings many advantages to FES, among them the possibility of easily applying different stimulation methods and techniques. The main goal of this thesis is therefore to test two stimulation methods, asynchronous and synchronous stimulation, in the upper limb with multi-field electrodes. To this end, a purpose-built wrist torque measuring system and a graphical user interface were developed to measure the wrist torque produced by each method and to carry out the experiments efficiently. Both methods were then tested on 15 healthy subjects, and sensitivity results were analyzed for different cases. The results show significant differences between the methods regarding sensation in some cases, which can affect the effectiveness or success of FES.
Abstract:
This technical report describes a practical method, consisting of a checklist and supporting techniques, for those planning or just starting to develop or select design tools and methods. The method helps to summarize and illustrate the envisaged tool or method by identifying its scope and underlying assumptions. The resulting tool or method description clarifies the problem addressed, the approach, and the possible implications, and can thus be used by a variety of people involved in assessing a tool or method at an early stage. For the developers themselves, the method reveals how realistic the envisaged method or tool is, and whether its scope has to be narrowed.
Abstract:
Wind power generation differs from conventional thermal generation due to the stochastic nature of wind. Wind power forecasting therefore plays a key role in dealing with the challenges of balancing supply and demand in any electricity system, given the uncertainty associated with wind farm power output. Accurate wind power forecasting reduces the need for additional balancing energy and reserve power to integrate wind power. Wind power forecasting tools enable better dispatch, scheduling and unit commitment of thermal generators, hydro plant and energy storage plant, and more competitive market trading as wind power ramps up and down on the grid. This paper presents an in-depth review of current methods and advances in wind power forecasting and prediction. Firstly, numerical wind prediction methods from global to local scales, ensemble forecasting, and upscaling and downscaling processes are discussed. Next, statistical and machine-learning methods are detailed. Then the techniques used for benchmarking and uncertainty analysis of forecasts are reviewed, and the performance of various approaches over different forecast time horizons is examined. Finally, current research activities, challenges and potential future developments are appraised.
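A common baseline in the forecast benchmarking the abstract mentions is the persistence model, in which the forecast for each horizon simply repeats the last observed value. The sketch below scores a one-step persistence forecast with RMSE; the hourly power series is made up for illustration and does not come from the paper.

```python
import math

def persistence_forecast(series, horizon=1):
    """Naive persistence benchmark: each forecast equals the value
    observed `horizon` steps earlier."""
    return series[:-horizon]

def rmse(forecast, actual):
    """Root-mean-square error between paired forecast and actual values."""
    return math.sqrt(
        sum((f - a) ** 2 for f, a in zip(forecast, actual)) / len(actual)
    )

# Illustrative hourly wind-power series (MW); values are invented:
power = [12.0, 14.5, 13.8, 15.2, 16.0, 15.1, 14.0, 13.2]
h = 1
forecast = persistence_forecast(power, h)
actual = power[h:]
print(round(rmse(forecast, actual), 3))  # -> 1.309
```

More sophisticated statistical or machine-learning forecasters are judged by how much they improve on this persistence score at each horizon.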
Abstract:
The US National Oceanic and Atmospheric Administration (NOAA) Fisheries Continuous Plankton Recorder (CPR) Survey has sampled four routes: Boston–Nova Scotia (1961–present), New York toward Bermuda (1976–present), Narragansett Bay–Mount Hope Bay–Rhode Island Sound (1998–present) and eastward of Chesapeake Bay (1974–1980). NOAA involvement began in 1974 when it assumed responsibility for the existing Boston–Nova Scotia route from what is now the UK's Sir Alister Hardy Foundation for Ocean Science (SAHFOS). Training, equipment and computer software were provided by SAHFOS to ensure continuity for this and standard protocols for any new routes. Data for the first 14 years of this route were provided to NOAA by SAHFOS. Comparison of collection methods; sample processing; and sample identification, staging and counting techniques revealed near-consistency between NOAA and SAHFOS. One departure involved phytoplankton counting standards. This has since been addressed and the data corrected. Within- and between-survey taxonomic and life-stage names and their consistency through time were, and continue to be, an issue. For this, a cross-reference table has been generated that contains the SAHFOS taxonomic code, NOAA taxonomic code, NOAA life-stage code, National Oceanographic Data Center (NODC) taxonomic code, Integrated Taxonomic Information System (ITIS) serial number and authority and consistent use/route. This table is available for review/use by other CPR surveys. Details of the NOAA and SAHFOS comparison and analytical techniques unique to NOAA are presented.
Abstract:
Roche tomography is a technique used for imaging the Roche-lobe-filling secondary stars in cataclysmic variables (CVs). In order to interpret Roche tomograms correctly, one must determine whether features in the reconstruction are real, or the result of statistical or systematic errors. We explore the effects of systematic errors using reconstructions of simulated data sets, and show that systematic errors result in characteristic distortions of the final reconstructions that can be identified and corrected. In addition, we present a new method of estimating statistical errors on tomographic reconstructions using a Monte Carlo bootstrapping algorithm, and show this method to be much more reliable than Monte Carlo methods which 'jiggle' the data points in accordance with the size of their error bars.
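The Monte Carlo bootstrap the abstract describes can be illustrated, in miniature, on a scalar statistic: resample the data with replacement, recompute the statistic on each resample, and take the spread of the resampled estimates as the statistical error. This sketch is a generic illustration of that principle, not the authors' Roche tomography code, and the toy data are invented.

```python
import random
import statistics

def bootstrap_stderr(data, statistic, n_resamples=2000, seed=42):
    """Monte Carlo bootstrap error estimate: draw resamples of the data
    with replacement, recompute the statistic on each, and return the
    standard deviation of the resampled statistics."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_resamples):
        resample = [rng.choice(data) for _ in data]
        estimates.append(statistic(resample))
    return statistics.stdev(estimates)

# Toy data: the bootstrap error on the mean should be close to the
# analytic standard error s / sqrt(n) (about 0.069 here).
data = [5.1, 4.8, 5.3, 5.0, 4.9, 5.2, 5.4, 4.7, 5.0, 5.1]
print(round(bootstrap_stderr(data, statistics.mean), 3))
```

Unlike "jiggling" each data point by its error bar, the bootstrap needs no assumption about the size or distribution of the individual errors, which is the reliability advantage the abstract reports.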
Abstract:
In the past few decades, a growing body of literature examining children’s perspectives on their own lives has developed within a variety of disciplines, such as sociology, psychology, anthropology and geography. This article provides a brief up-to-date examination of methodological and ethical issues that researchers may need to consider when designing research studies involving children; and a review of some of the methods and techniques used to elicit their views. The article aims to encourage researchers to critically reflect on these methodological issues and the techniques they choose to use, since they will have implications for the data produced.
Abstract:
Background: Medical Research Council (MRC) guidelines recommend applying theory within complex interventions to explain how behaviour change occurs. Guidelines endorse self-management of chronic low back pain (CLBP) and osteoarthritis (OA), but evidence for its effectiveness is weak. Objective: This literature review aimed to determine the use of behaviour change theory and techniques within randomised controlled trials of group-based self-management programmes for chronic musculoskeletal pain, specifically CLBP and OA. Methods: A two-phase search strategy of electronic databases was used to identify systematic reviews and studies relevant to this area. Articles were coded for their use of behaviour change theory, and the number of behaviour change techniques (BCTs) was identified using a 93-item taxonomy, BCT Taxonomy (v1). Results: 25 articles of 22 studies met the inclusion criteria, of which only three reported having based their intervention on theory, and all three used Social Cognitive Theory. A total of 33 BCTs were coded across all articles, the most commonly identified techniques being 'instruction on how to perform the behaviour', 'demonstration of the behaviour', 'behavioural practice', 'credible source', 'graded tasks' and 'body changes'. Conclusion: The results demonstrate that theoretically driven research within group-based self-management programmes for chronic musculoskeletal pain is lacking, or is poorly reported. Future research that follows the recommended guidelines on the use of theory in study design and reporting is warranted.
Abstract:
Perfect information is seldom available to man or machines due to uncertainties inherent in real world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information and it is necessary for GIS to develop tools and techniques to manage these uncertainties. There is a widespread agreement in the GIS community that although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by the lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.
Abstract:
Several automated reversed-phase HPLC methods have been developed to determine trace concentrations of carbamate pesticides (which are of concern in Ontario environmental samples) in water by utilizing two solid-sorbent extraction techniques. One of the methods is known as 'on-line pre-concentration'. This technique involves passing 100 milliliters of sample water through a 3 cm pre-column, packed with 5 micron ODS sorbent, at flow rates varying from 5-10 mL/min. By the use of a valve apparatus, the HPLC system is then switched to a gradient mobile phase program consisting of acetonitrile and water. The analytes, Propoxur, Carbofuran, Carbaryl, Propham, Captan, Chlorpropham, Barban, and Butylate, which are pre-concentrated on the pre-column, are eluted and separated on a 25 cm C-8 analytical column and determined by UV absorption at 220 nm. The total analytical time is 60 minutes, and the pre-column can be used repeatedly for the analysis of as many as thirty samples. The method is highly sensitive, as 100 percent of the analytes present in the sample can be injected into the HPLC. No breakthrough of any of the analytes was observed, and the minimum detectable concentrations range from 10 to 480 ng/L. The developed method is fully automated for the analysis of one sample. When the above mobile phase is modified with a buffer solution, Aminocarb, Benomyl, and its degradation product, MBC, can also be detected along with the above pesticides, with baseline resolution for all of the analytes. The method can also easily be modified to determine Benomyl and MBC both as solute and as particulate matter. By using a commercially available solid-phase extraction cartridge, in lieu of a pre-column, for the extraction and concentration of the analytes, a completely automated method has been developed with the aid of the Waters Millilab Workstation. Sample water is loaded at 10 mL/min through a cartridge, and the concentrated analytes are eluted from the sorbent with acetonitrile.
The resulting eluate is blown down under nitrogen, made up to volume with water, and injected into the HPLC. The total analytical time is 90 minutes. Fifty percent of the analytes present in the sample can be injected into the HPLC, and recoveries for the above eight pesticides ranged from 84 to 93 percent. The minimum detectable concentrations range from 20 to 960 ng/L. The developed method is fully automated for the analysis of up to thirty consecutive samples. The method has proven applicable to both purified water samples and untreated lake water samples.