18 results for Compound variables

in Helda - Digital Repository of the University of Helsinki


Relevance: 20.00%

Abstract:

Rare-gas chemistry is of growing interest, and recent advances include the "insertion" of a Xe atom into OH and water to give the rare-gas hydrides HXeO and HXeOH. The insertion of Xe atoms into the H-C bonds of hydrocarbons has also been demonstrated for HXeCC, HXeCCH and HXeCCXeH, the last of which was the first rare-gas hydride containing two rare-gas atoms. We describe the preparation and characterization of a new rare-gas compound, HXeOXeH. It was prepared in solid xenon by photolysis of a suitable precursor, for example water, followed by mobilization of the photoproducts. The experimental identification was carried out by FTIR spectroscopy, isotopic substitution, and the use of various precursors. The photolytic and thermal stability of the new rare-gas hydride was also studied, and the experimental work was supported by extensive quantum chemical calculations provided by our co-workers. HXeOXeH forms in a cryogenic xenon matrix from neutral O and H atoms in a two-step, diffusion-controlled process involving HXeO as an intermediate [reactions (1) and (2)]. This formation mechanism is unique in that one rare-gas hydride is formed from another rare-gas hydride.

H + Xe + O → HXeO (1)
HXeO + Xe + H → HXeOXeH (2)

Like other rare-gas hydrides, HXeOXeH has a strongly IR-active H-Xe stretching vibration, allowing its spectral detection at 1379.3 cm⁻¹. HXeOXeH is a very high-energy metastable species, yet it is thermally more stable than many other rare-gas hydrides. The calculated bending barrier of 0.57 eV is not sufficient to explain the observed stability, so HXeOXeH may gain additional stabilization from the solid xenon environment. Chemical bonding between xenon and environmentally abundant species such as water is of particular importance for the "missing-xenon" problem, and the relatively high thermal stability of HXeOXeH compared with other oxygen-containing rare-gas compounds is relevant in this respect. Our work also raises the possibility of polymeric (–Xe–O)n networks, analogous to the computationally studied (XeCC)n polymers.

Relevance: 20.00%

Abstract:

Background: Patients may need massive volume-replacement therapy after cardiac surgery because of large perioperative fluid transfer and the use of cardiopulmonary bypass. Hemodynamic stability is better maintained with colloids than with crystalloids, but colloids cause more adverse effects, such as coagulation disturbances and impairment of renal function, than crystalloids do. The present study examined the effects of modern hydroxyethyl starch (HES) and gelatin solutions on blood coagulation and hemodynamics. The mechanism by which colloids disturb blood coagulation was investigated by thromboelastometry (TEM) after cardiac surgery and in vitro using experimental hemodilution.

Materials and methods: Ninety patients scheduled for elective primary cardiac surgery (Studies I, II, IV, V) and twelve healthy volunteers (Study III) were included in this study. After admission to the cardiac surgical intensive care unit (ICU), patients were randomized to receive different doses of HES 130/0.4, HES 200/0.5, or 4% albumin solutions. Ringer's acetate or albumin solutions served as controls. Coagulation was assessed by TEM, and hemodynamic measurements were based on the cardiac index (CI) measured by thermodilution.

Results: HES and gelatin solutions impaired whole-blood coagulation to a similar extent, as measured by TEM, even at a small dose of 7 mL/kg. These solutions reduced clot strength and prolonged clot formation time, and the effects became more pronounced with increasing colloid doses. Neither albumin nor Ringer's acetate solution disturbed blood coagulation significantly. The coagulation disturbances after infusion of HES or gelatin solutions were clinically slight, and postoperative blood loss was comparable with that after Ringer's acetate or albumin solutions. Both single and multiple doses of all the colloids increased CI postoperatively, and this effect was dose-dependent; Ringer's acetate had no effect on CI. At a small dose (7 mL/kg), the effect of gelatin on CI was comparable with that of Ringer's acetate and significantly smaller than that of HES 130/0.4 (Study V). However, when the dose was increased to 14 and 21 mL/kg, the hemodynamic effect of gelatin rose and became comparable with that of HES 130/0.4.

Conclusions: After cardiac surgery, HES and gelatin solutions impaired clot strength in a dose-dependent manner. The potential mechanisms were interference with fibrinogen and fibrin formation, resulting in decreased clot strength, together with hemodilution. Although HES and gelatin inhibited coagulation, postoperative bleeding by the first postoperative morning was similar in all study groups. A single dose of HES solution improved CI postoperatively more than did gelatin, albumin, or Ringer's acetate. However, when administered repeatedly (cumulative dose of 14 mL/kg or more), HES 130/0.4 and gelatin showed no differences.
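
For reference, the cardiac index mentioned above normalizes cardiac output to body surface area. The sketch below shows only the standard textbook relations, not anything taken from the thesis; the Du Bois formula is one common convention for estimating body surface area.

```python
# Illustrative only (standard formulas, not taken from the thesis):
# the cardiac index normalizes cardiac output (CO) to body surface area
# (BSA); the Du Bois formula is one common way to estimate BSA.

def bsa_du_bois(weight_kg: float, height_cm: float) -> float:
    """Body surface area (m^2) by the Du Bois formula."""
    return 0.007184 * weight_kg ** 0.425 * height_cm ** 0.725

def cardiac_index(co_l_min: float, weight_kg: float, height_cm: float) -> float:
    """Cardiac index (L/min/m^2) = cardiac output (L/min) / BSA (m^2)."""
    return co_l_min / bsa_du_bois(weight_kg, height_cm)

# Example: CO of 5.0 L/min for an 80 kg, 175 cm patient gives a CI of
# roughly 2.6 L/min/m^2.
```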

Relevance: 20.00%

Abstract:

The aim of these studies was to improve the diagnostic capability of electrocardiography (ECG) in detecting myocardial ischemic injury, with the future goal of an automatic screening and monitoring method for ischemic heart disease. The method of choice was body surface potential mapping (BSPM) with numerous leads, the intention being to find the optimal recording sites and the optimal ECG variables for the diagnostics of ischemia and myocardial infarction (MI). The studies included 144 patients with prior MI, 79 patients with evolving ischemia, 42 patients with left ventricular hypertrophy (LVH), and 84 healthy controls. Study I examined the depolarization wave in prior MI with respect to MI location. Studies II-V examined the depolarization and repolarization waves in the detection of prior MI with respect to the Minnesota code and Q-wave status, and Study V also with respect to MI location. In Study VI the depolarization and repolarization variables were examined in 79 patients with evolving myocardial ischemia and ischemic injury. When analyzed from a single lead at any recording site, the results showed the repolarization variables to be superior to the depolarization variables and to the conventional 12-lead ECG methods, both in the detection of prior MI and of evolving ischemic injury. The QT integral, covering both depolarization and repolarization, appeared insensitive to Q-wave status, the time elapsed since MI, and the location of the MI or ischemia. In evolving ischemic injury, the performance of the QT integral was not hampered even by underlying LVH. The examined depolarization and repolarization variables were effective when recorded at a single site, in contrast to the conventional 12-lead ECG criteria. The inverse spatial correlation of the depolarization and repolarization waves in myocardial ischemia and injury could be reduced to the QT integral variable recorded at a single site on the left flank. In conclusion, the QT integral, detectable in a single lead with an optimal recording site on the left flank, detected prior MI and evolving ischemic injury more effectively than the conventional ECG markers. Recorded with a single lead or a small number of leads, the QT integral offers potential for automated screening of ischemic heart disease, acute ischemia monitoring, therapeutic decision-guiding, and risk stratification.
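
As a rough illustration of the QT integral variable, the sketch below integrates a single digitized ECG lead over the QT interval. The fiducial points (QRS onset, T-wave end) and the baseline convention are assumptions for illustration; the thesis's exact definition may differ.

```python
import numpy as np

# A minimal sketch (not the thesis's definition): the "QT integral" of one
# lead is taken here as the time integral of the lead voltage from QRS
# onset to T-wave end, i.e. over both depolarization and repolarization.

def qt_integral(signal_mv: np.ndarray, fs_hz: float,
                qrs_onset: int, t_end: int) -> float:
    """Integrate the lead voltage over the QT interval (result in mV*s)."""
    segment = signal_mv[qrs_onset:t_end + 1]
    # Reference the segment to the isoelectric level at QRS onset.
    segment = segment - segment[0]
    # Rectangular-rule integration: sum of samples times the sample period.
    return float(np.sum(segment) / fs_hz)
```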

Relevance: 20.00%

Abstract:

The main method of modifying the properties of semiconductors is to introduce small amounts of impurities into the material. Doping is used to control the magnetic and optical properties of materials and to realize p- and n-type semiconductors out of intrinsic material in order to manufacture fundamental components such as diodes. As diffusion can be described as random mixing of material due to the thermal movement of atoms, knowing the diffusion behavior of the impurities is essential for manufacturing working components. In the modified radiotracer technique, diffusion is studied using radioactive isotopes of elements as tracers. The technique is called modified because the atoms are introduced into the material by ion-beam implantation. With ion implantation, a well-defined distribution of impurities can be placed beneath the sample surface with good control over the number of implanted atoms. As the electromagnetic radiation and other nuclear decay products emitted by radioactive materials are easily detected, only a very small amount of impurities is needed. This makes it possible to study diffusion in pure materials without essentially modifying their initial properties by doping. In this thesis the modified radiotracer technique is used to study the diffusion of beryllium in GaN, ZnO, SiGe and glassy carbon. GaN, ZnO and SiGe are of great interest to the semiconductor industry, and beryllium, as a small and possibly fast-diffusing dopant, has not previously been studied with this technique. Glassy carbon was included to demonstrate the feasibility of the technique. In addition, the diffusion of the magnetic impurities Mn and Co has been studied in GaAs and ZnO, respectively, with spintronic applications in mind.
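
As an illustration of how such measurements yield a diffusion coefficient, the sketch below uses the textbook simplification that an implanted tracer profile is roughly Gaussian and that its variance grows under Fickian diffusion; it is not the analysis used in the thesis, and all names are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# Textbook simplification: an implanted tracer depth profile is roughly
# Gaussian, and under Fickian diffusion its variance grows during an
# anneal of duration t as sigma^2(t) = sigma0^2 + 2*D*t. Comparing the
# fitted widths before and after annealing therefore gives D.

def gaussian(x, amplitude, mean, sigma):
    return amplitude * np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def diffusion_coefficient(depth_nm, counts_before, counts_after, t_anneal_s):
    """Estimate D (nm^2/s) from the broadening of a tracer depth profile."""
    p0 = [counts_before.max(), depth_nm[np.argmax(counts_before)], 50.0]
    (_, _, s0), _ = curve_fit(gaussian, depth_nm, counts_before, p0=p0)
    (_, _, s1), _ = curve_fit(gaussian, depth_nm, counts_after, p0=p0)
    return (s1 ** 2 - s0 ** 2) / (2.0 * t_anneal_s)
```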

Relevance: 20.00%

Abstract:

Volatile organic compounds (VOCs) are emitted into the atmosphere from natural and anthropogenic sources, vegetation being the dominant source on a global scale. Some of these reactive compounds are deemed major contributors or inhibitors to aerosol particle formation and growth, making VOC measurements essential for current climate change research. This thesis discusses ecosystem-scale VOC fluxes measured above a boreal Scots pine dominated forest in southern Finland. The flux measurements were performed using the micrometeorological disjunct eddy covariance (DEC) method combined with proton transfer reaction mass spectrometry (PTR-MS), an online technique for measuring VOC concentrations. The measurement, calibration, and calculation procedures developed in this work proved to be well suited to long-term VOC concentration and flux measurements with PTR-MS. A new averaging approach based on running averaged covariance functions improved the determination of the lag time between the wind and concentration measurements, a common challenge in DEC when measuring fluxes near the detection limit. The ecosystem-scale emissions of methanol, acetaldehyde, and acetone were substantial. These three oxygenated VOCs made up about half of the total emissions, with the remainder consisting of monoterpenes. Contrary to the traditional assumption that monoterpene emissions from Scots pine originate mainly as evaporation from specialized storage pools, the DEC measurements indicated a significant contribution from de novo biosynthesis to the ecosystem-scale monoterpene emissions. This thesis offers practical guidelines for long-term DEC measurements with PTR-MS. In particular, the new averaging approach to lag time determination seems useful for automating DEC flux calculations. Seasonal variation in monoterpene biosynthesis and the detailed structure of a revised hybrid algorithm, describing both de novo and pool emissions, should be determined in further studies to improve the biological realism of models of monoterpene emissions from Scots pine forests. The increasing number of DEC measurements of oxygenated VOCs will probably enable better estimates of the role of these compounds in plant physiology and tropospheric chemistry.

Keywords: disjunct eddy covariance, lag time determination, long-term flux measurements, proton transfer reaction mass spectrometry, Scots pine forests, volatile organic compounds
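
A minimal sketch of the lag-time idea follows: the lag between vertical wind speed and concentration is taken at the extremum of their cross-covariance function, and averaging covariance functions over several periods stabilizes that extremum near the detection limit. This is a simplified illustration of the averaging approach named above; the thesis's exact procedure differs in detail, and all names are illustrative.

```python
import numpy as np

# Simplified illustration of lag-time determination in (disjunct) eddy
# covariance: the lag between vertical wind speed w and concentration c
# is taken where their cross-covariance function has its extremum, and
# averaging the covariance functions over several periods stabilizes the
# extremum when the flux is near the detection limit.

def cross_covariance(w, c, max_lag):
    """Cross-covariance of w and c for lags 0..max_lag (c delayed vs. w)."""
    w = np.asarray(w) - np.mean(w)
    c = np.asarray(c) - np.mean(c)
    n = len(w)
    return np.array([np.mean(w[:n - lag] * c[lag:])
                     for lag in range(max_lag + 1)])

def lag_from_averaged_covariances(periods, max_lag):
    """Average the covariance functions over many (w, c) periods, then
    pick the lag where the averaged function has the largest magnitude."""
    mean_cov = np.mean([cross_covariance(w, c, max_lag) for w, c in periods],
                       axis=0)
    return int(np.argmax(np.abs(mean_cov)))
```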

Relevance: 20.00%

Abstract:

The first-line medication for mild to moderate Alzheimer's disease (AD) is based on cholinesterase inhibitors, which prolong the effect of the neurotransmitter acetylcholine in cholinergic nerve synapses and thereby relieve the symptoms of the disease. Evidence implicating cholinesterases in disease-modifying processes has increased interest in this research area. Drug discovery and development is a long and expensive process that takes on average 13.5 years and costs approximately 0.9 billion US dollars. Drug attrition in the clinical phases is common for several reasons, e.g., poor bioavailability of compounds leading to low efficacy, or toxic effects. Thus, improvements in the early drug discovery process are needed to create highly potent, non-toxic compounds with predicted drug-like properties. Nature has been a good source for the discovery of new medicines, accounting for around half of the new drugs approved for the market during the last three decades; these compounds are direct isolates from nature, their synthetic derivatives, or natural mimics. Synthetic chemistry is an alternative way to produce compounds for drug discovery purposes, and both sources have pros and cons. The screening of new bioactive compounds in vitro is based on assaying compound libraries against targets. The assay set-up has to be adapted and validated for each screen to produce high-quality data. Depending on the size of the library, miniaturization and automation are often required to reduce solvent and compound consumption and to speed up the process. In this contribution, natural extract, natural pure compound, and synthetic compound libraries were assessed as sources of new bioactive compounds. The libraries were screened primarily for acetylcholinesterase inhibitory effects and secondarily for butyrylcholinesterase inhibitory effects. To screen the libraries, two assays were evaluated as screening tools and adapted to be compatible with the special features of each library, and the assays were validated to ensure high-quality data. Cholinesterase inhibitors with various potencies and selectivities were found in both the natural product and the synthetic compound libraries, which indicates that the two sources complement each other. It is acknowledged that natural compounds differ structurally from the compounds in synthetic compound libraries, which further supports this complementarity, especially if high structural diversity is the criterion for selecting compounds for a library.
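
The abstract does not name the validation metric used; as one hypothetical illustration, the Z'-factor is a widely used quality measure for plate-based screening assays and could be computed as sketched below.

```python
import numpy as np

# Hypothetical illustration only: the Z'-factor is a common quality
# measure for screening assays, computed from the positive and negative
# control wells of a plate. Values above ~0.5 are conventionally taken
# to indicate an assay suitable for screening. The abstract does not
# state which validation metric was actually used.

def z_prime(positive_controls, negative_controls):
    """Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    pos = np.asarray(positive_controls, dtype=float)
    neg = np.asarray(negative_controls, dtype=float)
    return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) \
        / abs(pos.mean() - neg.mean())
```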

Relevance: 20.00%

Abstract:

In recent years, thanks to developments in information technology, large-dimensional datasets have become increasingly available. Researchers now have access to thousands of economic series, and the information contained in them can be used to create accurate forecasts and to test economic theories. To exploit this large amount of information, researchers and policymakers need an appropriate econometric model. Usual time series models, vector autoregressions for example, cannot incorporate more than a few variables. There are two ways to solve this problem: use variable selection procedures, or gather the information contained in the series into an index model. This thesis focuses on one of the most widespread index models, the dynamic factor model (the theory behind this model, based on previous literature, is the core of the first part of this study), and on its use in forecasting Finnish macroeconomic indicators (the focus of the second part of the thesis). In particular, I forecast economic activity indicators (e.g. GDP) and price indicators (e.g. the consumer price index) from three large Finnish datasets. The first dataset contains a large number of aggregated series obtained from the Statistics Finland database. The second dataset is composed of economic indicators from the Bank of Finland. The last dataset consists of disaggregated data from Statistics Finland, which I call the micro dataset. The forecasts are computed following a two-step procedure: in the first step I estimate a set of common factors from the original dataset; the second step consists of formulating forecasting equations that include the previously extracted factors. The predictions are evaluated using the relative mean squared forecast error, where the benchmark model is a univariate autoregressive model. The results are dataset-dependent. The forecasts based on factor models are very accurate for the first dataset (the Statistics Finland one), while they are considerably worse for the Bank of Finland dataset. The forecasts derived from the micro dataset are still good, but less accurate than those obtained in the first case. This work opens up multiple research directions. The results obtained here can be replicated for longer datasets. The non-aggregated data can be represented in an even more disaggregated form (firm level). Finally, the use of the micro data, one of the major contributions of this thesis, can be useful in the imputation of missing values and in the creation of flash estimates of macroeconomic indicators (nowcasting).
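
A minimal sketch of the two-step procedure, under simplifying assumptions not taken from the thesis (static principal-component factors, a single lag of the target, fixed horizon h):

```python
import numpy as np

# Step 1 extracts common factors from the standardized panel X by
# principal components; step 2 regresses the h-step-ahead target on the
# current factors and the current value of the target. The thesis's exact
# specification (number of factors, lags, estimation windows) will differ.

def extract_factors(X, n_factors):
    """Step 1: principal-component factors of a (T x N) standardized panel."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    U, S, _ = np.linalg.svd(Z, full_matrices=False)
    return U[:, :n_factors] * S[:n_factors]  # (T x n_factors) factor estimates

def factor_forecast(X, y, n_factors=3, h=1):
    """Step 2: OLS of y_{t+h} on [1, F_t, y_t]; forecast y at T+h."""
    F = extract_factors(X, n_factors)
    T = len(y)
    regressors = np.column_stack([np.ones(T), F, y])
    beta, *_ = np.linalg.lstsq(regressors[:T - h], y[h:], rcond=None)
    return regressors[-1] @ beta
```

The relative mean squared forecast error is then the MSFE of such forecasts divided by the MSFE of a univariate autoregression over the same evaluation period.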

Relevance: 20.00%

Abstract:

The study presents a theory of utility models based on aspiration levels, as well as the application of this theory to the planning of timber-flow economics. The first part of the study comprises a derivation of the utility-theoretic basis for the application of aspiration levels. Two basic models are dealt with: the additive and the multiplicative. Applied here solely to partial utility functions, aspiration and reservation levels are interpreted as defining piecewise-linear functions. The standpoint of the decision-maker's choices is emphasized by the use of indifference curves. The second part of the study introduces a model for the management of timber flows. The model is based on the assumption that the decision-maker is willing to specify a shape of income flow that differs from the capital-theoretic optimum. The utility model comprises four aspiration-based compound utility functions. The theory and the flow model are tested numerically by computations covering three forest holdings. The results show that the additive model is sensitive even to slight changes in relative importances and aspiration levels; this applies particularly to nearly linear production-possibility boundaries of monetary variables. The multiplicative model, on the other hand, is stable because it generates strictly convex indifference curves. Owing to a higher marginal rate of substitution, the multiplicative model implies a stronger dependence on forest management than the additive function. For income-trajectory optimization, a method utilizing an income-trajectory index is more efficient than one based on aspiration levels per management period. Smooth trajectories can be attained by squaring the deviations of the feasible trajectories from the desired one.
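
A hedged sketch of the aspiration-based construction, under assumptions not spelled out in the abstract: each partial utility rises piecewise-linearly from 0 at the reservation level to 1 at the aspiration level, and the additive and multiplicative models aggregate these partial utilities with importance weights.

```python
import numpy as np

# Illustrative assumptions (not from the thesis): partial utilities are
# clipped to [0, 1] outside the reservation-aspiration interval, and the
# importance weights w sum to one.

def partial_utility(x, reservation, aspiration):
    """Piecewise-linear partial utility on [reservation, aspiration]."""
    return float(np.clip((x - reservation) / (aspiration - reservation),
                         0.0, 1.0))

def additive_utility(u, w):
    """Additive model: U = sum_i w_i * u_i."""
    return float(np.dot(w, u))

def multiplicative_utility(u, w):
    """Multiplicative model: U = prod_i u_i ** w_i, which yields the
    strictly convex indifference curves noted above."""
    return float(np.prod(np.asarray(u, dtype=float)
                         ** np.asarray(w, dtype=float)))
```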