39 results for reverse Gauss–Seidel method
Abstract:
In the present study, we identified a novel asthma susceptibility gene, NPSR1 (neuropeptide S receptor 1), on chromosome 7p14.3 by a positional cloning strategy. An earlier significant linkage mapping result among Finnish Kainuu asthma families was confirmed in two independent cohorts: in asthma families from Quebec, Canada, and in allergy families from North Karelia, Finland. The linkage region was narrowed down to a 133-kb segment by a hierarchical genotyping method. The observed 77-kb haplotype block showed 7 haplotypes and a similar risk and non-risk pattern in all three populations studied. All seven haplotypes occur in all three populations at frequencies > 2%. Significantly elevated relative risks were detected for elevated total IgE (immunoglobulin E) or asthma. Risk effects of the gene variants varied from 1.4 to 2.5. NPSR1 belongs to the G protein-coupled receptor (GPCR) family with a topology of seven transmembrane domains. NPSR1 has 9 exons, with the two main transcripts, A and B, encoding proteins of 371 and 377 amino acids, respectively. We detected a low but ubiquitous expression level of NPSR1-B in various tissues and endogenous cell lines, while NPSR1-A has a more restricted expression pattern. Both isoforms were expressed in the lung epithelium. We observed aberrant expression levels of NPSR1-B in smooth muscle in asthmatic bronchi as compared to healthy bronchi. In an experimental mouse model, induced lung inflammation resulted in elevated Npsr1 levels. Furthermore, we demonstrated that activation of NPSR1 with its endogenous agonist, neuropeptide S (NPS), resulted in a significant inhibition of the growth of NPSR1-A overexpressing stable cell lines (NPSR1-A cells). To determine which target genes are regulated by the NPS-NPSR1 pathway, NPSR1-A cells were stimulated with NPS, and differentially expressed genes were identified using the Affymetrix HGU133Plus2 GeneChip. A total of 104 genes were found significantly up-regulated and 42 down-regulated 6 h after NPS administration. The up-regulated genes included many neuronal genes and some putative susceptibility genes for respiratory disorders. In a Gene Ontology enrichment analysis, the biological process terms cell proliferation, morphogenesis and immune response were among the most altered. The expression of four up-regulated genes, matrix metallopeptidase 10 (MMP10), INHBA (activin A), interleukin 8 (IL8) and EPH receptor A2 (EPHA2), was verified and confirmed by quantitative reverse-transcriptase PCR. In conclusion, we identified a novel asthma susceptibility gene, NPSR1, on chromosome 7p14.3. NPS-NPSR1 represents a novel pathway that regulates cell proliferation and immune responses, and thus may have functional relevance in the pathogenesis of asthma.
Abstract:
The main obstacle for the application of high-quality diamond-like carbon (DLC) coatings has been the lack of adhesion to the substrate as the coating thickness is increased. The aim of this study was to improve the filtered pulsed arc discharge (FPAD) method, with which it is possible to achieve the high DLC coating thicknesses necessary for practical applications. The energy of the carbon ions was measured with an optoelectronic time-of-flight method. An in situ cathode polishing system used for stabilizing the process yield and the carbon ion energies is presented; simultaneously, the quality of the coatings can be controlled. To optimise the quality of the deposition process, a simple, fast and inexpensive method using silicon wafers as test substrates was developed. This method was used for evaluating the suitability of a simplified arc-discharge set-up for the deposition of the adhesion layer of DLC coatings. A whole new group of materials discovered by our research group, the diamond-like carbon polymer hybrid (DLC-p-h) coatings, is also presented. The parent polymers used in these novel coatings were polydimethylsiloxane (PDMS) and polytetrafluoroethylene (PTFE). The energy of the plasma ions was found to increase when the anode-cathode distance and the arc voltage were increased. A constant deposition rate for continuous coating runs was obtained with the in situ cathode polishing system. The novel DLC-p-h coatings were found to be water and oil repellent and harder than any polymer. The lowest sliding angle ever measured on a solid surface, 0.15 ± 0.03°, was measured on a DLC-PDMS-h coating. In the FPAD system, carbon ions can be accelerated to the high energies (≈ 1 keV) necessary for the optimal adhesion (the substrate is broken in the adhesion and quality test) of ultra-thick (up to 200 µm) DLC coatings by increasing the anode-cathode distance and using high voltages (up to 4 kV). An excellent adhesion can also be obtained with the simplified arc-discharge device. To maintain a high process yield (5 µm/h over a surface area of 150 cm²) and to stabilize the carbon ion energies and the high quality (sp³ fraction up to 85%) of the resulting coating, an in situ cathode polishing system must be used. The DLC-PDMS-h coating is the superior candidate coating material for anti-soiling applications where hardness is also required.
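As a generic illustration of how a time-of-flight measurement yields the ion kinetic energy (standard non-relativistic kinematics with an assumed drift distance d and measured flight time t, not values specific to this study):

% Non-relativistic kinetic energy of a carbon ion from its time of flight t
% over a drift distance d (generic relation; the 12 u carbon mass is a known constant).
\begin{equation}
  E_{\mathrm{kin}} = \tfrac{1}{2}\, m_{\mathrm{C}}\, v^{2}
                   = \tfrac{1}{2}\, m_{\mathrm{C}} \left(\frac{d}{t}\right)^{2},
  \qquad m_{\mathrm{C}} \approx 12\,\mathrm{u} \approx 1.99\times 10^{-26}\,\mathrm{kg}.
\end{equation}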
Abstract:
Atrial fibrillation (AF) is the most common arrhythmia requiring treatment. This thesis investigated AF with a specific emphasis on atrial remodeling, which was analysed from epidemiological, clinical and magnetocardiographic (MCG) perspectives. In the first study we evaluated, in real-life clinical practice, a population-based cohort of AF patients referred for their first elective cardioversion (CV). A total of 183 consecutive patients were included, and sinus rhythm (SR) was restored in 153 (84%) of them. Only 39 (25%) of those maintained SR for one year. Shorter duration of AF and the use of sotalol were the only characteristics associated with better restoration and maintenance of SR. During the one-year follow-up, 40% of the patients ended up in permanent AF. Female gender and older age were associated with the acceptance of permanent AF. The LIFE trial was a prospective, randomised, double-blinded study that evaluated losartan and atenolol in patients with hypertension and left ventricular hypertrophy (LVH). Of the 8,851 patients with SR at baseline and without a history of AF, 371 developed new-onset AF during the study. Patients with new-onset AF had an increased risk of cardiac events, stroke, and an increased rate of hospitalisation for heart failure. Younger age, female gender, lower systolic blood pressure, lesser LVH in ECG and randomisation to losartan therapy were independently associated with a lower frequency of new-onset AF. The impact of AF on morbidity and mortality was evaluated in a post-hoc analysis of the OPTIMAAL trial, which compared losartan with captopril in patients with acute myocardial infarction (AMI) and evidence of LV dysfunction. Of the 5,477 randomised patients, 655 had AF at baseline, and 345 developed new AF during the follow-up period (median 3.0 years). Older patients and patients with signs of more serious heart disease both had AF at baseline and developed AF more often. Patients with AF at baseline had an increased risk of mortality (hazard ratio (HR) 1.32) and stroke (HR 1.77). New-onset AF was associated with increased mortality (HR 1.82) and stroke (HR 2.29). In the fourth study we assessed the reproducibility of our MCG method. This method was used in the fifth study, in which 26 patients with persistent AF had, immediately after CV, a longer P-wave duration and a higher energy of the last portion of the atrial signal (RMS40) in MCG, increased P-wave dispersion in SAECG, and decreased atrial pump function as well as an enlarged atrial diameter in echocardiography compared with age- and disease-matched controls. After one month in SR, P-wave duration in MCG still remained longer and left atrial (LA) diameter greater compared with the controls, while the other measurements had returned to the same level as in the control group. In conclusion, AF is not a rare condition in either the general population or in patients with hypertension or AMI, and it is associated with an increased risk of morbidity and mortality. Therefore, atrial remodeling, which increases the likelihood of AF and also seems to be relatively stable, has to be identified and prevented. MCG was found to be an encouraging new method for studying electrical atrial remodeling and reverse remodeling. RAAS-suppressing medications appear to be the most promising means of preventing atrial remodeling and AF.
Abstract:
Atopy-related allergic diseases, i.e. allergic rhinoconjunctivitis, atopic dermatitis and asthma, have increased in frequency in the industrialized countries. In order to reverse this trend, effective preventive strategies need to be developed. This requires a better understanding of the early-life events leading to the expression of the atopic phenotype. The present study aimed at defining early-life factors and markers associated with the subsequent development of allergic diseases in a cohort of 200 healthy, unselected Finnish newborns prospectively followed up from birth to age 20 years. Their mothers were encouraged to start and maintain exclusive breastfeeding for as long as it was nutritionally sufficient for the infant. Consequently, all the infants received some period of exclusive breastfeeding: 58% of the infants were exclusively breastfed for the first 6 months of life, and 18% for at least the first 9 months. Of the infants, 42% had a family history of allergy. After the first year of follow-up, the children were re-assessed at ages 5, 11 and 20 years with clinical examination, skin prick testing, and parental and personal interviews. Exclusive breastfeeding for over 9 months was associated with atopic dermatitis and symptoms of food hypersensitivity at age 5 years, and with symptoms of food hypersensitivity at age 11 years, in the children with a familial allergy. Subjects with allergic symptoms or a positive skin prick test in childhood or adolescence had lower retinol concentrations during their infancy and childhood than others. An elevated cord serum immunoglobulin E (IgE) concentration predicted subsequent atopic manifestations, though with modest sensitivity. Children and adolescents with allergic symptoms, skin prick test positivity and an elevated IgE had lower total cholesterol levels in infancy and childhood than the nonatopic subjects. In conclusion, prolonging strictly exclusive breastfeeding beyond 9 months of age did not help to prevent allergic symptoms; instead, it was associated with increased atopic dermatitis and food hypersensitivity symptoms in childhood. Owing to its modest sensitivity, cord serum IgE is not an effective screening method for atopic predisposition in the general population. Retinol and cholesterol concentrations in infancy were inversely associated with the subsequent development of allergic symptoms. Based on these findings, it is proposed that there may be differences in the inborn regulation of retinol and cholesterol levels between children with and without a genetic susceptibility to atopy, and that these may play a role in the development of atopic sensitization and allergic diseases.
Abstract:
In atherosclerosis, cholesterol accumulates in the vessel wall, mainly in the form of modified low-density lipoprotein (LDL). Macrophages of the vessel wall scavenge cholesterol, which leads to the formation of lipid-laden foam cells. High plasma levels of high-density lipoprotein (HDL) protect against atherosclerosis, as HDL particles can remove peripheral cholesterol and transport it to the liver for excretion in a process called reverse cholesterol transport (RCT). Phospholipid transfer protein (PLTP) remodels HDL particles in the circulation, generating prebeta-HDL and large fused HDL particles. In addition, PLTP maintains plasma HDL levels by facilitating the transfer of post-lipolytic surface remnants of triglyceride-rich lipoproteins to HDL. Most of the cholesteryl ester transfer protein (CETP) in plasma is bound to HDL particles, and CETP is also involved in the remodeling of HDL particles. CETP enhances the heteroexchange of cholesteryl esters in HDL particles for triglycerides in LDL and very low-density lipoprotein (VLDL). The aims of this thesis project were to study the importance of endogenous PLTP in the removal of cholesterol from macrophage foam cells by using macrophages derived from PLTP-deficient mice, to determine the effect of macrophage-derived PLTP on the development of atherosclerosis by using bone marrow transplantation, and to clarify the roles of the two forms of PLTP, active and inactive, in the removal of cholesterol from the foam cells. In addition, the ability of CETP to protect HDL against the action of chymase was studied. Finally, the cholesterol efflux potential of sera obtained from the study subjects was compared. The absence of PLTP in macrophages derived from PLTP-deficient mice decreased cholesterol efflux mediated by ATP-binding cassette transporter A1. The bone marrow transplantation studies showed that selective deficiency of PLTP in macrophages decreased the size of atherosclerotic lesions and caused major changes in serum lipoprotein levels. It was further demonstrated that the active form of PLTP can enhance cholesterol efflux from macrophage foam cells through the generation of prebeta-HDL and large fused HDL particles enriched with apoE and phospholipids. CETP may also enhance the RCT process, as association of CETP with reconstituted HDL particles prevented chymase-dependent proteolysis of these particles and preserved their cholesterol efflux potential. Finally, serum from high-HDL subjects promoted more efficient cholesterol efflux than did serum from low-HDL subjects, most probably owing to differences in the distribution of HDL subpopulations between the two groups. The studies described in this thesis contribute to the understanding of the PLTP/CETP-associated mechanisms underlying RCT.
Abstract:
By detecting the leading protons produced in the Central Exclusive Diffractive process, p+p → p+X+p, one can measure the missing mass and scan for possible new particle states such as the Higgs boson. This process augments, in a model-independent way, the standard methods for new particle searches at the Large Hadron Collider (LHC) and will allow detailed analyses of the produced central system, such as the spin-parity properties of the Higgs boson. The exclusive central diffractive process makes possible precision studies of gluons at the LHC and complements the physics scenarios foreseen at the next e+e− linear collider. This thesis first presents the conclusions of the first systematic analysis of the expected precision of the leading proton momentum measurement and of the accuracy of the reconstructed missing mass. In this initial analysis, the scattered protons are tracked along the LHC beam line, and the uncertainties expected in beam transport and in the detection of the scattered leading protons are accounted for. The main focus of the thesis is on developing the radiation-hard precision detector technology necessary for coping with the extremely demanding experimental environment of the LHC. This will be achieved by using a 3D silicon detector design, which, in addition to radiation hardness of up to 5×10^15 neutrons/cm², offers properties such as a high signal-to-noise ratio, a fast signal response to radiation, and sensitivity close to the very edge of the detector. This work reports on the development of a novel semi-3D detector design that simplifies the 3D fabrication process but conserves the properties of the 3D detector design required at the LHC and in other imaging applications.
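As a reminder of the standard kinematic relation behind the missing-mass method (a generic textbook expression, not specific to the analysis chain of this thesis), the invariant mass of the centrally produced system X follows from the measured leading-proton momenta:

% Missing-mass relation for p + p -> p + X + p. Here p_1, p_2 (p_1', p_2') are the
% incoming (outgoing) proton four-momenta, xi_i the fractional momentum losses of
% the two protons, and s the squared centre-of-mass energy.
\begin{equation}
  M_X^2 \;=\; \left(p_1 + p_2 - p_1' - p_2'\right)^2 \;\approx\; \xi_1\, \xi_2\, s .
\end{equation}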
Abstract:
We present a search for associated production of the standard model (SM) Higgs boson and a $Z$ boson where the $Z$ boson decays to two leptons and the Higgs decays to a pair of $b$ quarks in $p\bar{p}$ collisions at the Fermilab Tevatron. We use event probabilities based on SM matrix elements to construct a likelihood function of the Higgs content of the data sample. In a CDF data sample corresponding to an integrated luminosity of 2.7 fb$^{-1}$ we see no evidence of a Higgs boson with a mass between 100 GeV$/c^2$ and 150 GeV$/c^2$. We set 95% confidence level (C.L.) upper limits on the cross-section for $ZH$ production as a function of the Higgs boson mass $m_H$; the limit is 8.2 times the SM prediction at $m_H = 115$ GeV$/c^2$.
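To illustrate the general idea of a likelihood built from per-event signal and background probabilities (a minimal toy sketch in which synthetic numbers stand in for the matrix-element event probabilities; this is not the CDF analysis code):

# Toy sketch: per-event mixture likelihood in the signal fraction.
# P_sig and P_bkg stand in for matrix-element-based event probability densities;
# the values below are synthetic and for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_events = 200
P_sig = rng.uniform(0.1, 1.0, n_events)   # hypothetical signal event probabilities
P_bkg = rng.uniform(0.1, 1.0, n_events)   # hypothetical background event probabilities

def neg_log_likelihood(f_signal):
    """-ln L for a sample modelled as the mixture f*P_sig + (1-f)*P_bkg."""
    mixture = f_signal * P_sig + (1.0 - f_signal) * P_bkg
    return -np.sum(np.log(mixture))

# Scan the signal fraction and take the maximum-likelihood value.
f_grid = np.linspace(0.0, 1.0, 501)
nll = np.array([neg_log_likelihood(f) for f in f_grid])
print("best-fit signal fraction:", f_grid[np.argmin(nll)])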
Abstract:
A precision measurement of the top quark mass m_t is obtained using a sample of ttbar events from ppbar collisions at the Fermilab Tevatron with the CDF II detector. Selected events require an electron or muon, large missing transverse energy, and exactly four high-energy jets, at least one of which is tagged as coming from a b quark. A likelihood is calculated using a matrix element method with quasi-Monte Carlo integration, taking into account finite detector resolution and jet mass effects. The event likelihood is a function of m_t and a parameter DJES used to calibrate the jet energy scale in situ. Using a total of 1087 events, a value of m_t = 173.0 +/- 1.2 GeV/c^2 is measured.
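Schematically, the technique combines the per-event likelihoods into a joint likelihood in the two parameters and maximizes it (a generic sketch of a simultaneous in-situ calibration, not the exact CDF expression):

% Joint likelihood over N selected events; the jet-energy-scale parameter is
% determined simultaneously with the top quark mass.
\begin{equation}
  L(m_t, \Delta_{\mathrm{JES}}) \;=\; \prod_{i=1}^{N} L_i(m_t, \Delta_{\mathrm{JES}}),
  \qquad
  (\hat{m}_t, \hat{\Delta}_{\mathrm{JES}}) \;=\;
  \operatorname*{arg\,max}_{m_t,\;\Delta_{\mathrm{JES}}} \, L(m_t, \Delta_{\mathrm{JES}}).
\end{equation}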
Abstract:
We report a measurement of the top quark mass, m_t, obtained from ppbar collisions at sqrt(s) = 1.96 TeV at the Fermilab Tevatron using the CDF II detector. We analyze a sample corresponding to an integrated luminosity of 1.9 fb^-1. We select events with an electron or muon, large missing transverse energy, and exactly four high-energy jets in the central region of the detector, at least one of which is tagged as coming from a b quark. We calculate a signal likelihood using a matrix element integration method, with effective propagators to take into account assumptions on event kinematics. Our event likelihood is a function of m_t and a parameter JES that determines in situ the calibration of the jet energies. We use a neural network discriminant to distinguish signal from background events. We also apply a cut on the peak value of each event likelihood curve to reduce the contribution of background and badly reconstructed events. Using the 318 events that pass all selection criteria, we find m_t = 172.7 +/- 1.8 (stat. + JES) +/- 1.2 (syst.) GeV/c^2.
Abstract:
We present a measurement of the top quark mass with t-tbar dilepton events produced in p-pbar collisions at the Fermilab Tevatron at $\sqrt{s}=1.96$ TeV and collected by the CDF II detector. A sample of 328 events with a charged electron or muon and an isolated track, corresponding to an integrated luminosity of 2.9 fb$^{-1}$, is selected as t-tbar candidates. To account for the unconstrained event kinematics, we scan over the phase space of the azimuthal angles ($\phi_{\nu_1},\phi_{\nu_2}$) of the neutrinos and reconstruct the top quark mass for each ($\phi_{\nu_1},\phi_{\nu_2}$) pair by minimizing a $\chi^2$ function under the t-tbar dilepton hypothesis. We assign $\chi^2$-dependent weights to the solutions in order to build a preferred mass for each event. Preferred mass distributions (templates) are built from simulated t-tbar and background events, and parameterized in order to provide continuous probability density functions. A likelihood fit to the mass distribution in data as a weighted sum of signal and background probability density functions gives a top quark mass of $165.5^{+{3.4}}_{-{3.3}}$(stat.)$\pm 3.1$(syst.) GeV/$c^2$.
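One common way to realize such $\chi^2$-dependent weighting (a schematic illustration; the exact weight used in the analysis may differ) is to weight each scan point by $\exp(-\chi^2/2)$ and take the weighted average of the reconstructed masses:

% Schematic weighting over the grid of scanned neutrino azimuthal angles, indexed by j;
% m_t^(j) is the mass reconstructed at scan point j and chi^2_j its fit quality.
\begin{equation}
  w_j \;\propto\; \exp\!\left(-\tfrac{1}{2}\,\chi^2_j\right),
  \qquad
  m_t^{\mathrm{pref}} \;=\; \frac{\sum_j w_j\, m_t^{(j)}}{\sum_j w_j}\, .
\end{equation}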
New Method for Delexicalization and its Application to Prosodic Tagging for Text-to-Speech Synthesis
Abstract:
This paper describes a new flexible delexicalization method based on a glottal excited parametric speech synthesis scheme. The system utilizes inverse filtered glottal flow and all-pole modelling of the vocal tract. The method provides the possibility to retain and manipulate all relevant prosodic features of any kind of speech. Most importantly, these features include voice quality, which has not been properly modeled in earlier delexicalization methods. The functionality of the new method was tested in a prosodic tagging experiment aimed at providing word prominence data for a text-to-speech synthesis system. The experiment confirmed the usefulness of the method and further corroborated earlier evidence that linguistic factors influence the perception of prosodic prominence.
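As a generic illustration of the signal-processing core mentioned here, namely all-pole (LPC) modelling of the vocal tract followed by inverse filtering to recover an approximate glottal source, a minimal sketch using a synthetic signal (this is not the authors' system; the model order and the test signal are assumed):

# Generic LPC inverse-filtering sketch: fit an all-pole vocal-tract model A(z)
# and apply its inverse to approximate the glottal excitation. Synthetic input;
# the model order is an assumed value, not taken from the paper.
import numpy as np
import librosa
from scipy.signal import lfilter

sr = 16000
t = np.arange(0, 0.5, 1.0 / sr)
speech = np.sin(2 * np.pi * 120 * t) + 0.3 * np.sin(2 * np.pi * 240 * t)  # toy "voiced" signal
speech += 0.01 * np.random.default_rng(0).standard_normal(len(t))

order = 18                                    # assumed all-pole model order
a = librosa.lpc(speech, order=order)          # coefficients of A(z), with a[0] == 1
glottal_residual = lfilter(a, [1.0], speech)  # inverse filter: e[n] = A(z) x[n]

# Passing the residual back through 1/A(z) reconstructs the original signal;
# prosodic manipulation would operate on the source/residual parameters instead.
resynth = lfilter([1.0], a, glottal_residual)
print("max reconstruction error:", float(np.max(np.abs(resynth - speech))))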
Abstract:
In this study we explore the concurrent, combined use of three research methods: statistical corpus analysis and two psycholinguistic experiments (a forced-choice and an acceptability rating task), using verbal synonymy in Finnish as a case in point. In addition to supporting conclusions from earlier studies concerning the relationships between corpus-based and experimental data (e.g., Featherston 2005), we show that each method adds to our understanding of the studied phenomenon in a way which could not be achieved through any single method by itself. Most importantly, whereas relative rareness in a corpus is associated with dispreference in selection, such infrequency does not always entail substantially lower acceptability. Furthermore, we show that forced-choice and acceptability rating tasks pertain to distinct linguistic processes, with category-wise incommensurable scales of measurement, and should therefore be merged with caution, if at all.
Abstract:
When authors of scholarly articles decide where to submit their manuscripts for peer review and eventual publication, they often base their choice of journals on very incomplete information about how well the journals serve the authors’ purposes of informing about their research and advancing their academic careers. The purpose of this study was to develop and test a new method for benchmarking scientific journals, providing more information to prospective authors. The method estimates a number of journal parameters, including readership, scientific prestige, time from submission to publication, acceptance rate, and the service provided by the journal during the review and publication process. The method uses data directly obtainable from the web, data that can be calculated from such data, data obtained from publishers and editors, and data obtained through surveys of authors; it has been tested on three different sets of journals, each from a different discipline. We found a number of problems with the different data acquisition methods, which limit the extent to which the method can be used. Publishers and editors are reluctant to disclose important information they have at hand (e.g. journal circulation, web downloads, acceptance rates). The calculation of some important parameters (for instance, average time from submission to publication or regional spread of authorship) can be done but requires considerable work. It can be difficult to obtain reasonable response rates to author surveys. All in all, we believe that the method we propose, taking a “service to authors” perspective as a basis for benchmarking scientific journals, is useful and can provide information that is valuable to prospective authors in selected scientific disciplines.
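Purely as an illustrative sketch of how such a set of estimated parameters might be organized and compared (the field names, normalization and weights below are hypothetical and are not part of the proposed method, which only specifies which parameters are estimated):

# Hypothetical data structure and scoring for a journal benchmarking profile.
# The parameter fields mirror those listed in the abstract; the min-max
# normalization and weighted ranking are illustrative choices, not the study's.
from dataclasses import dataclass

@dataclass
class JournalProfile:
    name: str
    readership: float                      # e.g. estimated readers or downloads
    scientific_prestige: float             # e.g. a citation-based indicator
    submission_to_publication_days: float  # average editorial delay
    acceptance_rate: float                 # fraction of submissions accepted
    author_service_score: float            # e.g. survey-based rating in [0, 1]

def benchmark(journals, weights):
    """Rank journals by a weighted sum of min-max normalized parameters;
    publication delay is inverted so that faster journals score higher."""
    def norm(values, invert=False):
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1.0
        return [((hi - v) if invert else (v - lo)) / span for v in values]

    cols = {
        "readership": norm([j.readership for j in journals]),
        "prestige": norm([j.scientific_prestige for j in journals]),
        "speed": norm([j.submission_to_publication_days for j in journals], invert=True),
        "acceptance": norm([j.acceptance_rate for j in journals]),
        "service": norm([j.author_service_score for j in journals]),
    }
    scores = [sum(weights[k] * cols[k][i] for k in weights) for i in range(len(journals))]
    return sorted(zip((j.name for j in journals), scores), key=lambda item: -item[1])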