Abstract:
Pre-eclampsia is a pregnancy complication that affects about 5% of all pregnancies. It is known to be associated with alterations in angiogenesis-related factors, such as vascular endothelial growth factor (VEGF). An excess of anti-angiogenic substances, especially the soluble VEGF receptor-1 (sVEGFR-1), has been observed in the maternal circulation after the onset of the disease, probably reflecting its increased placental production. Smoking reduces circulating concentrations of sVEGFR-1 in non-pregnant women, and in pregnant women it reduces the risk of pre-eclampsia. Soluble VEGFR-1 acts as a natural antagonist of VEGF and placental growth factor (PlGF) in the human circulation, holding promise for potential therapeutic use. In fact, it has been used as a model to generate a fusion protein, VEGF Trap, which has been found effective in the anti-angiogenic treatment of certain tumors and ocular diseases. In the present study, we evaluated the potential use of maternal serum sVEGFR-1, angiopoietin-2 (Ang-2) and endostatin, three central anti-angiogenic markers, in the early prediction of subsequent pre-eclampsia. We also studied whether smoking affects circulating sVEGFR-1 concentrations in pregnant women or first-trimester placental sVEGFR-1 secretion and expression in vitro. Lastly, in order to allow future discussion of potential therapy based on sVEGFR-1, we determined the biological half-life of endogenous sVEGFR-1 in the human circulation and measured the concomitant changes in free VEGF concentrations. Blood or placental samples were collected from a total of 268 pregnant women between 2001 and 2007 at Helsinki University Central Hospital for the purposes above. The biomarkers were measured using commercially available enzyme-linked immunosorbent assays (ELISA).
For the analyses of sVEGFR-1, Ang-2 and endostatin, a total of 3,240 pregnant women in the Helsinki area were enrolled for blood sample collection during two routine ultrasound screening visits at 13.7 ± 0.5 (mean ± SD) and 19.2 ± 0.6 weeks of gestation. Of these, 49 women who later developed pre-eclampsia were included in the study. Their disease was further classified as mild in 29 and severe in 20 patients. Isolated early-onset intrauterine growth retardation (IUGR) was diagnosed in 16 women with otherwise normal medical histories and uncomplicated pregnancies. Fifty-nine women who remained normotensive and non-proteinuric, and who finally gave birth to normal-weight infants, were selected to serve as the control population of the study. Maternal serum concentrations of Ang-2, endostatin and sVEGFR-1 were already elevated at 16-20 weeks of pregnancy, about 13 weeks before the clinical manifestation of pre-eclampsia. In addition, these biomarkers could be used to identify women at risk with moderate precision. However, larger patient series are needed to determine whether these markers could be applied clinically to predict pre-eclampsia. Intrauterine growth retardation, especially when noted at early stages of pregnancy and not secondary to any other pregnancy complication, has been suggested to be a form of pre-eclampsia that compromises only placental sufficiency and the fetus, without affecting the maternal endothelium. In fact, IUGR and pre-eclampsia have been proposed to share a common vascular etiology in which factors regulating early placental angiogenesis are likely to play a central role, and these factors have therefore been suggested to be involved in the pathogenesis of IUGR. However, circulating sVEGFR-1, Ang-2 and endostatin concentrations in the early second trimester were unaffected by subsequent IUGR. Furthermore, smoking was not associated with alterations in maternal circulating sVEGFR-1 or its placental production.
The elimination of endogenous sVEGFR-1 after pregnancy was calculated from serial samples of eight pregnant women undergoing elective Caesarean section. As is typical for proteins in human compartments, the elimination of sVEGFR-1 was biphasic, with a rapid half-life of 3.4 h and a slow one of 29 h. The decline in sVEGFR-1 concentrations after mid-trimester legal termination of pregnancy was accompanied by a simultaneous increase in the serum levels of free VEGF, so that within a few days after pregnancy VEGF dominated in the maternal circulation. Our study provides novel information on the kinetics of endogenous sVEGFR-1, which serves as a potential tool in the development of new strategies against diseases associated with angiogenic imbalance and alterations in VEGF signaling.
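A biphasic decline of this kind is commonly analyzed with the classic curve-stripping (residual) method: fit the slow phase on late samples, subtract its extrapolation from the early samples, and fit the fast phase on what remains. The sketch below applies this to simulated concentrations built from the half-lives quoted above; the amplitudes and sampling times are illustrative assumptions, not the study's data.

```python
import math

def linregress(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Simulated biexponential decay using the reported half-lives
# (3.4 h fast phase, 29 h slow phase); amplitudes A, B are illustrative.
k_fast, k_slow = math.log(2) / 3.4, math.log(2) / 29.0
A, B = 60.0, 40.0
times = [0.5, 1, 2, 4, 8, 12, 24, 30, 36, 48, 60, 72]
conc = [A * math.exp(-k_fast * t) + B * math.exp(-k_slow * t) for t in times]

# 1) Fit the terminal (slow) phase on late samples, where the fast
#    component has essentially vanished.
late = [(t, c) for t, c in zip(times, conc) if t >= 30]
slope_s, icept_s = linregress([t for t, _ in late],
                              [math.log(c) for _, c in late])
t_half_slow = math.log(2) / -slope_s

# 2) "Strip" the extrapolated slow phase from the early samples and fit
#    the residual to recover the fast phase.
early = [(t, c - math.exp(icept_s + slope_s * t))
         for t, c in zip(times, conc) if t <= 4]
slope_f, _ = linregress([t for t, _ in early],
                        [math.log(r) for _, r in early])
t_half_fast = math.log(2) / -slope_f
```

On noiseless data the procedure recovers both half-lives to within a few percent; with real serial samples one would weight the regression and check the phase separation.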
Abstract:
Physics at the Large Hadron Collider (LHC) and the International e+e- Linear Collider (ILC) will be complementary in many respects, as has been demonstrated at previous generations of hadron and lepton colliders. This report addresses the possible interplay between the LHC and ILC in testing the Standard Model and in discovering and determining the origin of new physics. Mutual benefits for the physics programme at both machines can occur both at the level of a combined interpretation of Hadron Collider and Linear Collider data and at the level of combined analyses of the data, where results obtained at one machine can directly influence the way analyses are carried out at the other machine. Topics under study comprise the physics of weak and strong electroweak symmetry breaking, supersymmetric models, new gauge theories, models with extra dimensions, and electroweak and QCD precision physics. The status of the work that has been carried out within the LHC/ILC Study Group so far is summarized in this report. Possible topics for future studies are outlined.
Abstract:
We present a general formalism for deriving bounds on the shape parameters of the weak and electromagnetic form factors, using as input correlators calculated from perturbative QCD and exploiting analyticity and unitarity. The values resulting from the symmetries of QCD at low energies or from lattice calculations at special points inside the analyticity domain can be included in an exact way. We write down the general solution of the corresponding Meiman problem for an arbitrary number of interior constraints, as well as the integral equations that allow one to include the phase of the form factor along a part of the unitarity cut. A formalism that includes the phase and some information on the modulus along a part of the cut is also given. For illustration, we present constraints on the slope and curvature of the K_l3 scalar form factor and discuss our findings in some detail. The techniques are useful for checking the consistency of various inputs and for controlling the parameterizations of the form factors entering precision predictions in flavor physics.
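For reference, the slope and curvature of the K_l3 scalar form factor are conventionally defined through a Taylor expansion around t = 0; one common normalization (conventions and symbols vary between papers, so this is indicative) is

```latex
f_0(t) \;=\; f_0(0)\left[\,1 \;+\; \lambda_0' \,\frac{t}{M_{\pi^+}^{2}} \;+\; \frac{\lambda_0''}{2}\,\frac{t^{2}}{M_{\pi^+}^{4}} \;+\;\cdots\right],
```

where \(\lambda_0'\) is the slope and \(\lambda_0''\) the curvature. The analyticity and unitarity bounds described above constrain the allowed region in the \((\lambda_0', \lambda_0'')\) plane.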
Abstract:
The methods of secondary wood processing are assumed to evolve over time and to affect the requirements set for the wood material and its suppliers. The study aimed at analysing the industrial operating modes applied by joinery and furniture manufacturers as sawnwood users. Industrial operating mode was defined as a pattern of important decisions and actions taken by a company which describes the company's level of adjustment in the late-industrial transition. A non-probabilistic sample of 127 companies was interviewed, including companies from Denmark, Germany, the Netherlands, and Finland. Fifty-two of the firms were furniture manufacturers and the other 75 were producing windows and doors. Variables related to business philosophy, production operations, and supplier choice criteria were measured and used as a basis for a customer typology; variables related to wood usage and perceived sawmill performance were measured and used to profile the customer types. Factor analysis was used to determine the latent dimensions of industrial operating mode. Canonical correlation analysis was applied in developing the final basis for classifying the observations. Non-hierarchical cluster analysis was employed to build a five-group typology of secondary wood processing firms, ranging from traditional mass producers to late-industrial flexible manufacturers. There is a clear connection between the amount of late-industrial elements in a company and the share of special and customised sawnwood it uses. Those joinery or furniture manufacturers that are more late-industrial are also likely to use more component-type wood material and to appreciate customer-oriented technical precision. The results show that the change is towards the use of late-industrial sawnwood materials and late-industrial supplier relationships.
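The non-hierarchical clustering step can be sketched in miniature. The snippet below is a plain k-means (Lloyd's algorithm) with deterministic farthest-first seeding on hypothetical two-dimensional "operating mode" scores; the study itself clustered canonical-correlation scores into five groups, so this is only an illustration of the technique, not the study's analysis.

```python
def dist2(p, q):
    """Squared Euclidean distance between two points."""
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, k, iters=20):
    """Lloyd's algorithm with deterministic farthest-first seeding."""
    # seed: first point, then repeatedly the point farthest from all centroids
    centroids = [points[0]]
    while len(centroids) < k:
        centroids.append(max(points,
                             key=lambda p: min(dist2(p, c) for c in centroids)))
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda j: dist2(p, centroids[j]))
        # update step: centroid = mean of its members
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = tuple(sum(coord) / len(members)
                                     for coord in zip(*members))
    return centroids, labels
```

With five clusters on standardized scores this reproduces the typology-building step in spirit; real analyses would also validate cluster count and stability.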
Abstract:
Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories to analyze them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation the compounds were analyzed directly, without the need for hydrolysis, solvent transfer, evaporation or reconstitution.
Hydrophilic interaction liquid chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). The specificity of the assay was improved, since interfering matrix compounds were removed by size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore LC-TOFMS can be exploited widely in doping control, and the need for several separate analysis techniques is reduced.
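The reverse database search criteria (retention time and mass accuracy) can be illustrated with a minimal matcher. The compound names, retention times and m/z values below are hypothetical, and the isotope-match criterion is omitted for brevity; this is a sketch of the filtering logic, not the thesis software.

```python
# Hypothetical reference entries: (name, retention time in min, theoretical m/z)
DATABASE = [
    ("morphine", 2.9, 286.1438),
    ("codeine",  4.1, 300.1594),
]

def reverse_search(peaks, rt_tol=0.2, mass_tol_mda=1.0):
    """Match measured peaks (rt, m/z) against the reference list.

    A hit must fall within the retention-time window and within the
    mass-accuracy window expressed in mDa (1 mDa = 0.001 u)."""
    hits = []
    for rt, mz in peaks:
        for name, ref_rt, ref_mz in DATABASE:
            err_mda = abs(mz - ref_mz) * 1000.0   # mass error in mDa
            if abs(rt - ref_rt) <= rt_tol and err_mda <= mass_tol_mda:
                hits.append((name, err_mda))
    return hits
```

For example, a peak at RT 2.95 min and m/z 286.1441 matches morphine with a 0.3 mDa error, while a peak at the right mass but 0.4 min off in RT is rejected.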
Abstract:
Neurons can be divided into various classes according to their location, morphology, neurochemical identity and electrical properties. They form complex interconnected networks with precise roles for each cell type. GABAergic neurons expressing the calcium-binding protein parvalbumin (Pv) are mainly interneurons, which serve a coordinating function. Pv-cells modulate the activity of principal cells with high temporal precision. Abnormalities of Pv-interneuron activity in cortical areas have been linked to neuropsychiatric illnesses such as schizophrenia. Cerebellar Purkinje cells are known to be central to motor learning. They are the sole output from the layered cerebellar cortex to deep cerebellar nuclei. There are still many open questions about the precise role of Pv-neurons and Purkinje cells, many of which could be answered if one could achieve rapid, reversible cell-type specific modulation of the activity of these neurons and observe the subsequent changes at the whole-animal level. The aim of these studies was to develop a novel method for the modulation of Pv-neurons and Purkinje cells in vivo and to use this method to investigate the significance of inhibition in these neuronal types with a variety of behavioral experiments in addition to tissue autoradiography, electrophysiology and immunohistochemistry. The GABA(A) receptor γ2 subunit was ablated from Pv-neurons and Purkinje cells in four separate mouse lines. Pv-Δγ2 mice had wide-ranging behavioral alterations and increased GABA-insensitive binding indicative of an altered GABA(A) receptor composition, particularly in midbrain areas. PC-Δγ2 mice experienced little or no motor impairment despite the lack of inhibition in Purkinje cells. In Pv-Δγ2-partial rescue mice, a reversal of motor and cognitive deficits was observed in addition to restoration of the wild-type γ2F77 subunit to the reticular nucleus of thalamus and the cerebellar molecular layer. 
In PC-Δγ2-swap mice, zolpidem sensitivity was restored to Purkinje cells and the administration of systemic zolpidem evoked a transient motor impairment. On the basis of these results, it is concluded that this new method of cell-type specific modulation is a feasible way to modulate the activity of selected neuronal types. The importance of Purkinje cells to motor control supports previous studies, and the crucial involvement of Pv-neurons in a range of behavioral modalities is confirmed.
Abstract:
One of the most important factors affecting the pointing of precision payloads and devices on space platforms is the vibration generated by static and dynamic unbalanced forces of rotating equipment placed in the neighborhood of the payload. Generally, such disturbances are of low amplitude, with frequencies below 1 kHz, and are termed 'micro-vibrations'. Due to low damping in the space structure, these vibrations have long decay times and degrade the performance of the payload. This paper addresses the design, modeling and analysis of a low-frequency space frame platform for passive and active attenuation of micro-vibrations. This flexible platform has been designed to act as a mount for devices like reaction wheels, and consists of four folded continuous beams arranged in three dimensions. Frequency and response analyses have been carried out by varying the number of folds and the thickness of the vertical beam. Results show that lower frequencies can be achieved by increasing the number of folds and by decreasing the thickness of the blade. In addition, active vibration control is studied by incorporating piezoelectric actuators and sensors in the dynamic model. It is shown through simulation that a control strategy using optimal control is effective for vibration suppression under a wide variety of loading conditions.
Abstract:
The study of soil microbiota and their activities is central to the understanding of many ecosystem processes such as decomposition and nutrient cycling. The collection of microbiological data from soils generally involves several sequential steps of sampling, pretreatment and laboratory measurements. The reliability of results is dependent on reliable methods in every step. The aim of this thesis was to critically evaluate some central methods and procedures used in soil microbiological studies in order to increase our understanding of the factors that affect the measurement results and to provide guidance and new approaches for the design of experiments. The thesis focuses on four major themes: 1) soil microbiological heterogeneity and sampling, 2) storage of soil samples, 3) DNA extraction from soil, and 4) quantification of specific microbial groups by the most-probable-number (MPN) procedure. Soil heterogeneity and sampling are discussed as a single theme because knowledge on spatial (horizontal and vertical) and temporal variation is crucial when designing sampling procedures. Comparison of adjacent forest, meadow and cropped field plots showed that land use has a strong impact on the degree of horizontal variation of soil enzyme activities and bacterial community structure. However, regardless of the land use, the variation of microbiological characteristics appeared not to have predictable spatial structure at 0.5-10 m. Temporal and soil depth-related patterns were studied in relation to plant growth in cropped soil. The results showed that most enzyme activities and microbial biomass have a clear decreasing trend in the top 40 cm soil profile and a temporal pattern during the growing season. A new procedure for sampling of soil microbiological characteristics based on stratified sampling and pre-characterisation of samples was developed. 
A practical example demonstrated the potential of the new procedure to reduce the analysis efforts involved in laborious microbiological measurements without loss of precision. The investigation of storage of soil samples revealed that freezing (-20 °C) of small sample aliquots retains the activity of hydrolytic enzymes and the structure of the bacterial community in different soil matrices relatively well whereas air-drying cannot be recommended as a storage method for soil microbiological properties due to large reductions in activity. Freezing below -70 °C was the preferred method of storage for samples with high organic matter content. Comparison of different direct DNA extraction methods showed that the cell lysis treatment has a strong impact on the molecular size of DNA obtained and on the bacterial community structure detected. An improved MPN method for the enumeration of soil naphthalene degraders was introduced as an alternative to more complex MPN protocols or the DNA-based quantification approach. The main advantage of the new method is the simple protocol and the possibility to analyse a large number of samples and replicates simultaneously.
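The MPN procedure mentioned above rests on a standard maximum-likelihood calculation: given the dilution volumes and the counts of positive tubes, solve the score equation for the organism density. The sketch below is the generic textbook MPN estimate, not the improved protocol of the thesis; the volumes, tube counts and bisection bracket are illustrative.

```python
import math

def mpn_estimate(volumes, positives, tubes, lo=1e-6, hi=100.0):
    """Maximum-likelihood MPN (organisms per unit volume).

    For each dilution i with inoculum volume v_i, n_i tubes and p_i positive
    tubes, the maximum-likelihood density lambda solves
        sum_i p_i v_i e^{-lam v_i} / (1 - e^{-lam v_i}) = sum_i (n_i - p_i) v_i,
    which we solve by bisection (the left side decreases in lambda)."""
    def score(lam):
        s = 0.0
        for v, p, n in zip(volumes, positives, tubes):
            e = math.exp(-lam * v)
            s += p * v * e / (1.0 - e) - (n - p) * v
        return s
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0.0:   # density still too low: root lies above mid
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With the classic 5-tube, three-dilution design (10, 1 and 0.1 ml inocula) and a 4-2-0 positive pattern, the estimate lands near the tabulated MPN index of 22 per 100 ml.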
Abstract:
The enzymes of the family of tRNA synthetases perform their functions with high precision by synchronously recognizing the anticodon region and the aminoacylation region, which are separated by about 70 Å in space. This precision in function is brought about by establishing good communication paths between the two regions. We have modeled the structure of the complex consisting of Escherichia coli methionyl-tRNA synthetase (MetRS), tRNA, and the activated methionine. Molecular dynamics simulations have been performed on the modeled structure to obtain the equilibrated structure of the complex, and the cross-correlations between the residues in MetRS have been evaluated. Furthermore, network analysis on these simulated structures has been carried out to elucidate the paths of communication between the activation site and the anticodon recognition site. This study has provided the detailed paths of communication, which are consistent with experimental results. Similar studies have also been carried out on the complexes (MetRS + activated methionine) and (MetRS + tRNA), along with the ligand-free native enzyme. A comparison of the paths derived from the four simulations clearly shows that the communication path is strongly correlated and unique to the enzyme complex that is bound to both the tRNA and the activated methionine. The details of the method of our investigation and the biological implications of the results are presented in this article. The method developed here could also be used to investigate any protein system in which function takes place through long-distance communication.
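A common way to extract communication paths from MD cross-correlations is to build a residue graph whose edge lengths are -log|c_ij|, so that chains of strongly correlated residues come out as the shortest paths. The sketch below illustrates that idea with Dijkstra's algorithm on a hypothetical toy network; the residue names, correlation values and weighting scheme are assumptions for illustration, not the authors' exact protocol.

```python
import heapq
import math

def correlation_path(corr_edges, source, target):
    """Shortest path in a graph where edge length = -log|correlation|.

    corr_edges: dict mapping (u, v) -> |cross-correlation| in (0, 1]."""
    graph = {}
    for (u, v), c in corr_edges.items():
        w = -math.log(c)                      # strong correlation -> short edge
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, math.inf):
            continue                          # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [target], target
    while node != source:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

On a toy network where an intermediate residue couples strongly to both sites, the path runs through it rather than through a weakly correlated shortcut.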
Abstract:
Motivation: The number of bacterial genomes being sequenced is increasing very rapidly, and hence it is crucial to have procedures for rapid and reliable annotation of their functional elements such as promoter regions, which control the expression of each gene or transcription unit of the genome. The present work addresses this requirement and presents a generic method applicable across organisms. Results: The relative stability of DNA double-helical sequences has been used to discriminate promoter regions from non-promoter regions. Based on the difference in stability between neighboring regions, an algorithm has been implemented to predict promoter regions on a large scale over 913 microbial genome sequences. The average free energy values for the promoter regions, as well as their downstream regions, are found to differ depending on their GC content. Threshold values to identify promoter regions have been derived using sequences flanking a subset of translation start sites from all microbial genomes and then used to predict promoters over the complete genome sequences. An average recall of 72% (the percentage of protein- and RNA-coding genes with predicted promoter regions assigned to them) and an average precision of 56% are achieved over the 913-genome microbial dataset.
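The stability-difference idea can be sketched with a sliding-window free-energy profile. The snippet below is an illustrative reimplementation, not the published algorithm: it uses unified nearest-neighbour stacking free energies (values quoted from SantaLucia's 1998 unified parameters; treat them as assumptions) and scores a position by how much less stable its window is than a window a fixed offset downstream.

```python
# Unified nearest-neighbour free energies (kcal/mol, SantaLucia 1998);
# used here as illustrative values.
NN = {"AA": -1.00, "AT": -0.88, "TA": -0.58, "CA": -1.45, "GT": -1.44,
      "CT": -1.28, "GA": -1.30, "CG": -2.17, "GC": -2.24, "GG": -1.84}
# fill in the reverse complements (e.g. TT has the same value as AA)
COMP = str.maketrans("ACGT", "TGCA")
for pair, dg in list(NN.items()):
    NN.setdefault(pair.translate(COMP)[::-1], dg)

def avg_free_energy(seq, start, window=15):
    """Average stacking free energy of a window; less negative = less stable."""
    dgs = [NN[seq[i:i + 2]] for i in range(start, start + window - 1)]
    return sum(dgs) / len(dgs)

def stability_difference(seq, pos, window=15, offset=50):
    """D(n) = E(window at pos) - E(window `offset` nt downstream).

    Promoter-like (AT-rich, less stable) regions upstream of a stable
    coding region give a positive D."""
    return avg_free_energy(seq, pos, window) - avg_free_energy(seq, pos + offset, window)
```

A promoter-like AT-rich stretch upstream of a GC-rich region yields a clearly positive difference, which is the signal the threshold is applied to.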
Abstract:
We describe an automated calorimeter for the measurement of specific heat in the temperature range 10 K > T > 0.5 K. It uses samples of moderate size (100-1000 mg), has moderate precision and accuracy (2%-5%), is easy to operate, and the measurements can be done quickly and with economical He-4 consumption. The accuracy of this calorimeter was checked by measuring the specific heat of copper and that of aluminium near its superconducting transition temperature.
Abstract:
According to the most prevalent view, there are 3-4 fixed "slots" in visual working memory for temporary storage. Recently this view has been challenged by a theory of dynamic resources, which are limited in their totality but can be freely allocated. The aim of this study is to clarify which of the theories better describes performance in visual working memory tasks with contour shapes. Thus, in this study the interest is both in the number of recalled stimuli and in the precision of the memory representations. Stimuli in the experiments were radial frequency patterns, which were constructed by sinusoidally modulating the radius of a circle. Five observers participated in the experiment, which consisted of two different tasks. In the delayed discrimination task, the number of recalled stimuli was measured with a 2-interval forced-choice task. The observer was serially shown two displays with a 1.5 s ISI (inter-stimulus interval). The displays contained 1-6 patterns and differed from each other in the amplitude of one pattern. The participant's task was to report whether the changed pattern had higher amplitude in the first or in the second interval. The amount of amplitude change was defined with the QUEST procedure, and the 75% discrimination threshold was measured in the task. In the recall task, the precision of the memory representations was measured with a subjective adjustment method. First, the observer was shown 1-6 patterns, and after a 1.5 s ISI one location of the previously shown patterns was cued. The observer's task was to adjust the amplitude of a probe pattern to match the amplitude of the pattern in working memory. In the delayed discrimination task, the performance of all observers declined smoothly as the number of presented patterns increased. This result supports the resource theory of working memory, as there was no sudden fall in performance.
The amplitude threshold for one item was 0.01-0.05 and, as the number of items increased from 1 to 6, there was a 4-15-fold linear increase in the amplitude threshold (0.14-0.29). In the recall adjustment task, the precision of four observers' performance declined smoothly as the number of presented patterns was increased. This result also supports the resource theory. The standard deviation for one item was 0.03-0.05 and, as the number of items increased from 1 to 6, there was a 2-3-fold linear increase in the standard deviation (0.06-0.11). These findings show that performance in a visual working memory task is better described by the theory of freely allocated resources than by the traditional slot model. In addition, the allocation of the resources depends on the properties of the individual observer and on the visual working memory task.
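The model comparison above amounts to asking whether response variability grows smoothly with set size (resource model) or stays flat up to a capacity limit (slot model). One standard diagnostic is to fit a power law SD = a * N^b in log-log space; a smooth positive exponent favours the resource account. The per-set-size standard deviations below are illustrative values consistent in spirit with the reported 2-3-fold increase, not the actual data.

```python
import math

# Illustrative standard deviations for set sizes 1..6 (not the study's data)
set_sizes = [1, 2, 3, 4, 5, 6]
sds = [0.04, 0.05, 0.06, 0.07, 0.08, 0.09]

def powerlaw_fit(xs, ys):
    """Fit y = a * x**b by least squares in log-log space; returns (a, b)."""
    lx = [math.log(x) for x in xs]
    ly = [math.log(y) for y in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)
    return a, b

a, b = powerlaw_fit(set_sizes, sds)
```

A fitted exponent well above zero (here roughly 0.45) indicates continuously declining precision, which is the pattern the abstract reports; a slot model would instead predict b near zero below capacity.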
Abstract:
Low interlaminar strength and the consequent possibility of interlaminar failures in composite laminates demand an examination of interlaminar stresses and/or strains to ensure satisfactory performance. As a first approximation, these stresses can be obtained from thickness-wise integration of the ply equilibrium equations using in-plane stresses from the classical laminated plate theory. Implementation of this approach in finite element form requires evaluation of the third- and fourth-order derivatives of the displacement functions in an element. Hence, a high-precision element developed by Jayachandrabose and Kirkhope (1985) is used here, and the required derivatives are obtained in two ways: (i) by direct differentiation of the element shape functions; and (ii) by applying a finite difference technique to the nodal strains and curvatures obtained from the finite element analysis. Numerical results obtained for a three-layered symmetric and a two-layered asymmetric laminate show that the second scheme is quite effective compared to the first, particularly for asymmetric laminates.
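Scheme (ii) rests on standard central-difference stencils for high-order derivatives. A one-dimensional sketch is given below; the actual implementation operates on nodal strains and curvatures over the element mesh, so this only illustrates the stencils involved.

```python
def third_derivative(f, x, h):
    """Central finite-difference estimate of f'''(x), accurate to O(h^2)."""
    return (-f(x - 2 * h) / 2 + f(x - h) - f(x + h) + f(x + 2 * h) / 2) / h ** 3

def fourth_derivative(f, x, h):
    """Central finite-difference estimate of f''''(x), accurate to O(h^2)."""
    return (f(x - 2 * h) - 4 * f(x - h) + 6 * f(x)
            - 4 * f(x + h) + f(x + 2 * h)) / h ** 4
```

Both stencils are exact (up to round-off) on polynomials of the matching degree, e.g. the third-derivative stencil returns exactly 6 for f(x) = x^3.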
Abstract:
Language software applications encounter new words, e.g., acronyms, technical terminology, names or compounds of such words. In order to add new words to a lexicon, we need to indicate their inflectional paradigm. We present a new, generally applicable method for creating an entry generator, i.e. a paradigm guesser, for finite-state transducer lexicons. As a guesser tends to produce numerous suggestions, it is important that the correct suggestions be among the first few candidates. We prove some formal properties of the method and evaluate it on Finnish, English and Swedish full-scale transducer lexicons. We use the open-source Helsinki Finite-State Technology to create finite-state transducer lexicons from existing lexical resources and to automatically derive guessers for unknown words. The method has a recall of 82-87% and a precision of 71-76% for the three test languages. The model needs no external corpus and can therefore serve as a baseline.
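A much simpler baseline than a transducer-derived guesser conveys the core idea: rank candidate paradigms by the longest known word-final suffix seen in training. The toy lexicon and paradigm labels below are invented for illustration; the actual method operates on full finite-state transducer lexicons.

```python
from collections import defaultdict

def train_guesser(lexicon, max_suffix=4):
    """lexicon: iterable of (word, paradigm) pairs.

    Counts, for every word-final suffix up to max_suffix characters,
    how often each paradigm occurs with it."""
    table = defaultdict(lambda: defaultdict(int))
    for word, paradigm in lexicon:
        for k in range(1, min(max_suffix, len(word)) + 1):
            table[word[-k:]][paradigm] += 1
    return table

def guess(table, word, max_suffix=4):
    """Return paradigms ranked by the longest matching suffix, then by count."""
    for k in range(min(max_suffix, len(word)), 0, -1):
        suffix = word[-k:]
        if suffix in table:
            ranked = sorted(table[suffix].items(), key=lambda kv: -kv[1])
            return [p for p, _ in ranked]
    return []
```

As in the evaluated method, what matters is that the correct paradigm appears among the first few ranked suggestions for an unseen word.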
Abstract:
Diffuse optical tomographic image reconstruction uses advanced numerical models that are too computationally costly to implement in real time. Graphics processing units (GPUs) offer desktop massive parallelization that can accelerate these computations. An open-source GPU-accelerated linear algebra library package is used to compute the most intensive matrix-matrix calculations and matrix decompositions involved in solving the system of linear equations. These open-source functions were integrated into existing frequency-domain diffuse optical image reconstruction algorithms to evaluate the acceleration capability of the GPUs (NVIDIA Tesla C1060) with increasing reconstruction problem sizes. These studies indicate that single-precision computations are sufficient for diffuse optical tomographic image reconstruction. The acceleration per iteration can be up to a factor of 40 using GPUs compared to traditional CPUs in the case of three-dimensional reconstruction, where the reconstruction problem is more underdetermined, making GPUs attractive in clinical settings. The current limitation of these GPUs is the available onboard memory (4 GB), which restricts the reconstruction to no more than 13,377 optical parameters. (C) 2010 Society of Photo-Optical Instrumentation Engineers. [DOI: 10.1117/1.3506216]
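The adequacy of single precision can be illustrated with a small CPU experiment (NumPy here, not the GPU library used in the paper): solve the same well-conditioned dense system in float32 and float64 and compare relative residuals, both evaluated in double precision. For a well-conditioned system the float32 residual sits far below typical measurement noise in diffuse optical data, which is the practical argument for single precision; the matrix construction below is an illustrative assumption.

```python
import numpy as np

def solve_residual(n, dtype, seed=0):
    """Relative residual ||Ax - b|| / ||b|| when the solve runs at `dtype`."""
    rng = np.random.default_rng(seed)
    # diagonally dominant random matrix -> well conditioned
    A = rng.standard_normal((n, n)) + n * np.eye(n)
    b = rng.standard_normal(n)
    x = np.linalg.solve(A.astype(dtype), b.astype(dtype))
    # evaluate the residual in double precision regardless of solve precision
    r = A @ x.astype(np.float64) - b
    return float(np.linalg.norm(r) / np.linalg.norm(b))

r32 = solve_residual(500, np.float32)
r64 = solve_residual(500, np.float64)
```

The float64 residual is near machine epsilon while the float32 residual is several orders of magnitude larger yet still tiny in absolute terms, so the cheaper precision does not limit reconstruction quality when data noise dominates.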