198 results for utility measurement


Relevance:

20.00%

Publisher:

Abstract:

Measurement of blood pressure by the physician remains an essential step in the evaluation of cardiovascular risk. Ambulatory measurement and self-measurement of blood pressure are ways of counteracting the "white coat" effect, the rise in blood pressure many patients experience in the presence of doctors. They make it possible to define the cardiovascular risk of hypertension and to identify the patients most likely to benefit from antihypertensive therapy. However, it must be realised that subjects who are normotensive during their everyday activities but hypertensive in the doctor's surgery may become genuinely hypertensive with time, whichever method is used to measure blood pressure. These patients should be followed up regularly even if the decision to treat has been postponed.

Abstract:

Abstract: The occupational health risk involved with handling nanoparticles is the probability that a worker will experience an adverse health effect: it is calculated as a function of the worker's exposure relative to the potential biological hazard of the material. Addressing the risks of nanoparticles therefore requires knowledge of occupational exposure and of the release of nanoparticles into the environment, as well as toxicological data. However, information on exposure is currently not systematically collected, so this risk assessment lacks quantitative data. This thesis aimed, first, at creating the fundamental data necessary for a quantitative assessment and, second, at evaluating methods to measure occupational nanoparticle exposure. The first goal was to determine what is being used, and where, in Swiss industries. This was followed by an evaluation of the adequacy of existing measurement methods for assessing workplace nanoparticle exposure to complex size distributions and concentration gradients. The study was conceived as a series of methodological evaluations aimed at better understanding nanoparticle measurement devices and methods. It focused on inhalation exposure to airborne particles, as respiration is considered the most important entry pathway for nanoparticles into the body in terms of risk. The targeted survey (pilot study) was conducted as a feasibility study for a later nationwide survey on the handling of nanoparticles and the use of specific protective measures in industry. The study consisted of targeted phone interviews with health and safety officers of Swiss companies believed to use or produce nanoparticles. This was followed by a representative survey on the level of nanoparticle usage in Switzerland, designed on the basis of the results of the pilot study.
The study was conducted among a representative selection of clients of the Swiss National Accident Insurance Fund (SUVA), covering about 85% of Swiss production companies. The third part of this thesis focused on the methods to measure nanoparticles. Several pre-studies were conducted to explore the limits of commonly used measurement devices in the presence of nanoparticle agglomerates. This focus was chosen because several discussions with users and producers of the measurement devices raised questions about their accuracy in measuring nanoparticle agglomerates and because, at the same time, the two survey studies revealed that such powders are frequently used in industry. The first preparatory experiment focused on the accuracy of the scanning mobility particle sizer (SMPS), which showed an improbable size distribution when measuring powders of nanoparticle agglomerates. Furthermore, the thesis includes a series of smaller experiments that took a closer look at problems encountered with other measurement devices in the presence of nanoparticle agglomerates: condensation particle counters (CPC), the portable aerosol spectrometer (PAS), a device to estimate the aerodynamic diameter, as well as diffusion size classifiers. Some initial feasibility tests of the efficiency of filter-based sampling and subsequent counting of carbon nanotubes (CNT) were conducted last. The pilot study provided a detailed picture of the types and amounts of nanoparticles used and of the knowledge of the health and safety experts in the companies. Considerable maximal quantities (> 1'000 kg/year per company) of Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation particles) were declared by the contacted Swiss companies. The median quantity of handled nanoparticles, however, was 100 kg/year. The representative survey was conducted by contacting by post a representative selection of 1'626 clients of the Swiss National Accident Insurance Fund (SUVA).
It allowed estimation of the number of companies and workers dealing with nanoparticles in Switzerland. The extrapolation from the surveyed companies to all companies of the Swiss production sector suggested that 1'309 workers (95% confidence interval 1'073 to 1'545) of the Swiss production sector are potentially exposed to nanoparticles in 586 companies (145 to 1'027). These numbers correspond to 0.08% (0.06% to 0.09%) of all workers and to 0.6% (0.2% to 1.1%) of companies in the Swiss production sector. A few well-known methods exist to measure airborne concentrations of sub-micrometre-sized particles. However, it was unclear how well the different instruments perform in the presence of the often quite large agglomerates of nanostructured materials. The evaluation of devices and methods focused on nanoparticle agglomerate powders. It allowed the identification of the following potential sources of inaccurate measurements at workplaces with considerably high concentrations of airborne agglomerates:
- A standard SMPS showed bi-modal particle size distributions when measuring large nanoparticle agglomerates.
- Differences in the range of a factor of a thousand were shown between diffusion size classifiers and CPC/SMPS.
- The agreement between CPC/SMPS and the portable aerosol spectrometer (PAS) was much better but, depending on the concentration, size or type of the powders measured, the differences were still up to an order of magnitude.
- Specific difficulties and uncertainties in the assessment of workplaces were identified: background particles can interact with particles created by a process, which makes the handling of background concentrations difficult.
- Electric motors produce high numbers of nanoparticles and confound the measurement of the process-related exposure.
Conclusion: The surveys showed that nanoparticle applications exist in many industrial sectors in Switzerland and that some companies already use high quantities of them.
The representative survey demonstrated a low prevalence of nanoparticle usage in most branches of Swiss industry and led to the conclusion that the introduction of applications using nanoparticles (especially outside industrial chemistry) is only beginning. Even though the number of potentially exposed workers was reportedly rather small, it nevertheless underscores the need for exposure assessments. Understanding exposure and how to measure it correctly is very important because the potential health effects of nanomaterials are not yet fully understood. The evaluation showed that many devices and methods of measuring nanoparticles need to be validated for nanoparticle agglomerates before large exposure assessment studies can begin. Zusammenfassung: The occupational health risk of nanoparticles is the probability that a worker suffers a possible health injury when exposed to this substance; it is usually calculated as the product of hazard and exposure. A thorough assessment of the possible risks of nanomaterials therefore requires information on the release of such materials into the environment on the one hand, and on the exposure of workers on the other. Much of this information is not yet collected systematically and is therefore missing from risk analyses. The aim of this doctoral thesis was to create the basis for a quantitative estimate of occupational exposure to nanoparticles and to evaluate the methods needed to measure such exposure.
The study was to investigate to what extent nanoparticles are already used in Swiss industry, how many workers potentially come into contact with them, and whether the measurement technology for the necessary workplace exposure measurements is already adequate. It focused on exposure to airborne particles, because inhalation is regarded as the main entry route for particles into the body. The thesis builds on three phases: a qualitative survey (pilot study), a representative Swiss survey, and several technical studies serving the specific understanding of the possibilities and limits of individual measurement devices and techniques. The qualitative telephone survey was conducted as a preliminary study for a national, representative survey of Swiss industry. It aimed at information on the occurrence of nanoparticles and the protective measures applied. The study consisted of targeted telephone interviews with occupational health and safety specialists of Swiss companies. The companies were selected on the basis of publicly available information indicating that they handle nanoparticles. The second part of the thesis was the representative study evaluating the prevalence of nanoparticle applications in Swiss industry. The study built on information from the pilot study and was carried out with a representative selection of companies insured by the Swiss National Accident Insurance Fund (SUVA). The majority of Swiss companies in the industrial sector were thereby covered. The third part of the thesis focused on the methodology for measuring nanoparticles. Several preliminary studies were carried out to explore the limits of commonly used nanoparticle measurement devices when they have to measure larger quantities of nanoparticle agglomerates.
This focus was chosen for two reasons: because several discussions with users and also the producer of the measurement devices suggested a weakness there, raising doubts about the accuracy of the devices, and because the two surveys showed that such nanoparticle agglomerates occur frequently. First, a preliminary study addressed the accuracy of the Scanning Mobility Particle Sizer (SMPS). In the presence of nanoparticle agglomerates, this device displayed an implausible bimodal particle size distribution. A series of short experiments followed, which concentrated on other measurement devices and their problems in measuring nanoparticle agglomerates. The condensation particle counter (CPC), the portable aerosol spectrometer (PAS), a device for estimating the aerodynamic diameter of particles, as well as the diffusion size classifier were tested. Finally, some initial feasibility tests were carried out to determine the efficiency of filter-based measurement of airborne carbon nanotubes (CNT). The pilot study provided a detailed picture of the types and quantities of nanoparticles used in Swiss companies and showed the state of knowledge of the interviewed health and safety specialists. The following types of nanoparticles were declared by the contacted companies as maximal quantities (> 1'000 kg per year per company): Ag, Al-Ox, Fe-Ox, SiO2, TiO2, and ZnO (mainly first-generation nanoparticles). The quantities of nanoparticles used varied widely, with a median of 100 kg per year. In the quantitative questionnaire study, 1'626 companies were contacted by mail, all of them clients of the Swiss National Accident Insurance Fund (SUVA). The results of the survey allowed an estimate of the number of companies and workers using nanoparticles in Switzerland.
The extrapolation to the Swiss industrial sector yielded the following picture: in 586 companies (95% confidence interval: 145 to 1'027 companies), 1'309 workers are potentially exposed to nanoparticles (95% CI: 1'073 to 1'545). These figures correspond to 0.6% of Swiss companies (95% CI: 0.2% to 1.1%) and 0.08% of the workforce (95% CI: 0.06% to 0.09%). Several well-established technologies exist for measuring airborne concentrations of sub-micrometre particles. However, it is doubtful to what extent these technologies can also be used for measuring engineered nanoparticles. For this reason, the preparatory studies for the workplace assessments focused on the measurement of powders containing nanoparticle agglomerates. They allowed the identification of the following possible sources of erroneous measurements at workplaces with elevated airborne concentrations of nanoparticle agglomerates:
- A standard SMPS showed an implausible bimodal particle size distribution when measuring larger nanoparticle agglomerates.
- Large differences, in the range of a factor of a thousand, were found between a diffusion size classifier and several CPCs (and the SMPS, respectively).
- The differences between CPC/SMPS and the PAS were smaller, but depending on the size or type of the powder measured they were still in the range of a good order of magnitude.
- Specific difficulties and uncertainties in workplace measurements were identified: background particles can interact with particles released during a work process; such interactions make it difficult to correctly account for the background particle concentration in the measurement data.
- Electric motors produce large quantities of nanoparticles and can thus confound the measurement of the process-related exposure.
Conclusion: The surveys showed that nanoparticles are already a reality in Swiss industry and that some companies already use large quantities of them. The representative survey, however, somewhat tempered this striking finding by showing that the number of such companies across Swiss industry as a whole is relatively small. In most branches (especially outside the chemical industry), few or no applications were found, which suggests that the introduction of this new technology is only at the beginning of its development. Even if the number of potentially exposed workers is still relatively small, the study nevertheless underscores the need for exposure measurements at these workplaces. Knowledge of the exposure, and of how to measure it correctly, is very important, especially because the possible health effects are not yet fully understood. The evaluation of several devices and methods, however, showed that there is still a need to catch up here. Before larger measurement studies can be carried out, the devices and methods must be validated for use with nanoparticle agglomerates.
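The survey extrapolation reported above (scaling the proportion of nanoparticle-using companies in the sample to the whole production sector, with a 95% confidence interval) can be sketched as follows. The sample counts and population size are invented for illustration; they are not the study's data, and survey-design weights and finite-population corrections are omitted:

```python
import math

def extrapolate(sample_hits, sample_n, population_n, z=1.96):
    """Scale a surveyed proportion up to a population total, with a
    normal-approximation (Wald) 95% confidence interval."""
    p = sample_hits / sample_n
    se = math.sqrt(p * (1 - p) / sample_n)
    low, high = max(0.0, p - z * se), p + z * se
    return p * population_n, low * population_n, high * population_n

# Hypothetical: 12 of 1'626 surveyed companies report nanoparticle use,
# extrapolated to an assumed 100'000 companies in the production sector.
est, low, high = extrapolate(12, 1626, 100_000)
```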

Abstract:

BACKGROUND: A 70-gene signature was previously shown to have prognostic value in patients with node-negative breast cancer. Our goal was to validate the signature in an independent group of patients. METHODS: Patients (n = 307, with 137 events after a median follow-up of 13.6 years) from five European centers were divided into high- and low-risk groups based on the gene signature classification and on clinical risk classifications. Patients were assigned to the gene signature low-risk group if their 5-year distant metastasis-free survival probability as estimated by the gene signature was greater than 90%. Patients were assigned to the clinicopathologic low-risk group if their 10-year survival probability, as estimated by Adjuvant! software, was greater than 88% (for estrogen receptor [ER]-positive patients) or 92% (for ER-negative patients). Hazard ratios (HRs) were estimated to compare time to distant metastases, disease-free survival, and overall survival in high- versus low-risk groups. RESULTS: The 70-gene signature outperformed the clinicopathologic risk assessment in predicting all endpoints. For time to distant metastases, the gene signature yielded HR = 2.32 (95% confidence interval [CI] = 1.35 to 4.00) without adjustment for clinical risk and hazard ratios ranging from 2.13 to 2.15 after adjustment for various estimates of clinical risk; clinicopathologic risk using Adjuvant! software yielded an unadjusted HR = 1.68 (95% CI = 0.92 to 3.07). For overall survival, the gene signature yielded an unadjusted HR = 2.79 (95% CI = 1.60 to 4.87) and adjusted hazard ratios ranging from 2.63 to 2.89; clinicopathologic risk yielded an unadjusted HR = 1.67 (95% CI = 0.93 to 2.98). For patients in the gene signature high-risk group, 10-year overall survival was 0.69 for patients in both the low- and high-clinical risk groups; for patients in the gene signature low-risk group, the 10-year survival rates were 0.88 and 0.89, respectively. 
CONCLUSIONS: The 70-gene signature adds independent prognostic information to clinicopathologic risk assessment for patients with early breast cancer.
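The two risk-stratification rules described above reduce to simple threshold checks on predicted survival probabilities; the sketch below restates them in code (function and argument names are ours, purely illustrative):

```python
def risk_groups(gene_5yr_dmfs, adjuvant_10yr_surv, er_positive):
    """Assign the two risk labels used in the validation study.

    gene_5yr_dmfs: 5-year distant-metastasis-free survival probability
        estimated by the 70-gene signature (low risk if > 90%).
    adjuvant_10yr_surv: 10-year survival probability from Adjuvant!
        software (low risk if > 88% for ER-positive, > 92% for ER-negative).
    """
    gene_low = gene_5yr_dmfs > 0.90
    clinical_cutoff = 0.88 if er_positive else 0.92
    clinical_low = adjuvant_10yr_surv > clinical_cutoff
    return ("low" if gene_low else "high",
            "low" if clinical_low else "high")
```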

Abstract:

Executive Summary The unifying theme of this thesis is the pursuit of a satisfactory way to quantify the risk-reward trade-off in financial economics: first in the context of a general asset pricing model, then across models, and finally across country borders. The guiding principle in that pursuit was to seek innovative solutions by combining ideas from different fields in economics and broader scientific research. For example, in the first part of this thesis we sought a fruitful application of strong existence results in utility theory to topics in asset pricing. In the second part we apply an idea from the field of fuzzy set theory to the optimal portfolio selection problem, while the third part of this thesis is, to the best of our knowledge, the first empirical application of some general results in asset pricing in incomplete markets to the important topic of measuring financial integration. While the first two parts of this thesis effectively combine well-known ways to quantify risk-reward trade-offs, the third can be viewed as an empirical verification of the usefulness of the so-called "good deal bounds" theory in designing risk-sensitive pricing bounds. Chapter 1 develops a discrete-time asset pricing model based on a novel ordinally equivalent representation of recursive utility. To the best of our knowledge, we are the first to use a member of a novel class of recursive utility generators to construct a representative agent model that addresses some long-standing issues in asset pricing. Applying strong representation results allows us to show that the model features countercyclical risk premia, for both consumption and financial risk, together with a low and procyclical risk-free rate. As the recursive utility used nests the well-known time-state separable utility as a special case, all results nest the corresponding ones from the standard model and thus shed light on its well-known shortcomings.
The empirical investigation to support these theoretical results, however, showed that as long as one resorts to econometric methods based on approximating conditional moments with unconditional ones, it is not possible to distinguish the model we propose from the standard one. Chapter 2 is joint work with Sergei Sontchik. There we provide theoretical and empirical motivation for the aggregation of performance measures. The main idea is that, just as it makes sense to apply several performance measures ex post, it also makes sense to base optimal portfolio selection on ex-ante maximization of as many performance measures as desired. We thus offer a concrete algorithm for optimal portfolio selection via ex-ante optimization, over different horizons, of several risk-return trade-offs simultaneously. An empirical application of that algorithm, using seven popular performance measures, suggests that realized returns feature better distributional characteristics than realized returns from portfolio strategies optimal with respect to single performance measures. When comparing the distributions of realized returns we used two partial risk-reward orderings: first- and second-order stochastic dominance. We first used the Kolmogorov-Smirnov test to determine whether the two distributions are indeed different, which, combined with a visual inspection, allowed us to demonstrate that the way we propose to aggregate performance measures leads to portfolio realized returns that first-order stochastically dominate those resulting from optimization with respect to only, for example, the Treynor ratio or Jensen's alpha. We checked for second-order stochastic dominance via pointwise comparison of the so-called absolute Lorenz curve, that is, the sequence of expected shortfalls over a range of quantiles.
Since the plot of the absolute Lorenz curve for the aggregated performance measures lay above the one corresponding to each individual measure, we were tempted to conclude that the algorithm we propose leads to a portfolio return distribution that second-order stochastically dominates those obtained from virtually all individual performance measures considered. Chapter 3 proposes a measure of financial integration based on recent advances in asset pricing in incomplete markets. Given a base market (a set of traded assets) and an index of another market, we propose to measure financial integration through time by the size of the spread between the pricing bounds of the market index, relative to the base market. The bigger the spread around country index A, viewed from market B, the less integrated markets A and B are. We investigate the presence of structural breaks in the size of the spread for EMU member country indices before and after the introduction of the Euro. We find evidence that both the level and the volatility of our financial integration measure increased after the introduction of the Euro. This counterintuitive result suggests an inherent weakness in attempts to measure financial integration independently of economic fundamentals. Nevertheless, the results about the bounds on the risk-free rate appear plausible from the viewpoint of existing economic theory about the impact of integration on interest rates.
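The second-order stochastic dominance check via the absolute Lorenz curve (the sequence of expected shortfalls over quantiles) can be sketched for equal-sized return samples as follows; this is a minimal discrete version for illustration, not the chapter's actual implementation:

```python
def absolute_lorenz(returns):
    """Absolute Lorenz curve of a return sample: cumulative sums of the
    sorted returns divided by n, i.e. expected shortfalls at levels k/n."""
    xs = sorted(returns)
    n = len(xs)
    curve, acc = [], 0.0
    for x in xs:
        acc += x
        curve.append(acc / n)
    return curve

def second_order_dominates(a, b):
    """In this discrete sketch, sample a SSD-dominates sample b if a's
    absolute Lorenz curve lies pointwise at or above b's, and strictly
    above somewhere.  Assumes equal-length samples."""
    la, lb = absolute_lorenz(a), absolute_lorenz(b)
    return (all(x >= y for x, y in zip(la, lb))
            and any(x > y for x, y in zip(la, lb)))

gains = [0.01, 0.02, 0.03, 0.04]   # hypothetical realized returns
mixed = [-0.05, 0.00, 0.03, 0.10]
dominates = second_order_dominates(gains, mixed)
```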

Abstract:

The evolution of continuous traits is the central component of comparative analyses in phylogenetics, and the comparison of alternative models of trait evolution has greatly improved our understanding of the mechanisms driving phenotypic differentiation. Several factors influence the comparison of models, and we explore the effects of random errors in trait measurement on the accuracy of model selection. We simulate trait data under a Brownian motion model (BM) and introduce different magnitudes of random measurement error. We then evaluate the resulting statistical support for this model against two alternative models: Ornstein-Uhlenbeck (OU) and accelerating/decelerating rates (ACDC). Our analyses show that even small measurement errors (10%) consistently bias model selection towards erroneous rejection of BM in favour of more parameter-rich models (most frequently the OU model). Fortunately, methods that explicitly incorporate measurement errors in phylogenetic analyses considerably improve the accuracy of model selection. Our results call for caution in interpreting the results of model selection in comparative analyses, especially when complex models garner only modest additional support. Importantly, as measurement errors occur in most trait data sets, we suggest that estimation of measurement errors should always be performed during comparative analysis to reduce chances of misidentification of evolutionary processes.
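The effect described above (random measurement error inflating observed trait variance beyond the Brownian-motion expectation sigma^2 * t) can be illustrated with a minimal simulation on a star phylogeny; this is a simplifying assumption for the sketch, since real comparative analyses use a full tree and likelihood-based model comparison:

```python
import random
import statistics

random.seed(1)

def simulate_tips(n_tips, t, sigma2, me_sd):
    """Simulate independent tips evolving under Brownian motion for time t
    (true variance sigma2 * t), then add Gaussian measurement error with
    standard deviation me_sd.  A star phylogeny keeps the sketch simple."""
    true_values = [random.gauss(0.0, (sigma2 * t) ** 0.5) for _ in range(n_tips)]
    return [x + random.gauss(0.0, me_sd) for x in true_values]

# With error, observed variance tends to sigma2*t + me_sd**2 (here 1.25),
# exceeding the plain-BM expectation of 1.0 -- the excess is what biases
# model selection away from BM towards richer models such as OU.
obs = simulate_tips(5000, t=1.0, sigma2=1.0, me_sd=0.5)
var_obs = statistics.pvariance(obs)
```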

Abstract:

Dyspnea and chest pain are typical reasons for consultation. Biomarkers (CRP, procalcitonin, NT-proBNP, troponins, D-dimers) can be of value for the diagnosis, prognosis and follow-up of several pathologies. There are, however, numerous pitfalls and limitations between the discovery of a biomarker and its utility in clinical practice. It is essential always to estimate a pre-test probability based on an attentive history and a careful physical examination, to know the intrinsic and extrinsic qualities of a test, and to determine a threshold of care. A biomarker should be used only if it modifies the patient's care and provides a benefit compared with management without it.
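The pre-test-probability reasoning described above is usually formalized with likelihood ratios; the sketch below is a minimal Bayes update, with illustrative sensitivity and specificity values not tied to any particular biomarker:

```python
def post_test_probability(pre_test_p, sensitivity, specificity, positive=True):
    """Update a pre-test probability with a test result using likelihood
    ratios: LR+ = sens / (1 - spec), LR- = (1 - sens) / spec."""
    odds = pre_test_p / (1.0 - pre_test_p)
    if positive:
        lr = sensitivity / (1.0 - specificity)
    else:
        lr = (1.0 - sensitivity) / specificity
    post_odds = odds * lr
    return post_odds / (1.0 + post_odds)

# Illustrative: 20% pre-test probability, test with 90% sensitivity and
# 80% specificity; a positive result raises the probability to ~53%.
p_pos = post_test_probability(0.2, 0.9, 0.8, positive=True)
p_neg = post_test_probability(0.2, 0.9, 0.8, positive=False)
```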

Abstract:

Genetically constructed microbial biosensors for measuring organic pollutants are mostly applied in aqueous samples. Unfortunately, the detection limit of most biosensors is insufficient to detect pollutants at low but environmentally relevant concentrations. However, organic pollutants with low levels of water solubility often have significant gas-water partitioning coefficients, which in principle makes it possible to measure such compounds in the gas rather than the aqueous phase. Here we describe the first use of a microbial biosensor for measuring organic pollutants directly in the gas phase. For this purpose, we reconstructed a bioluminescent Pseudomonas putida naphthalene biosensor strain to carry the NAH7 plasmid and a chromosomally inserted gene fusion between the sal promoter and the luxAB genes. Specific calibration studies were performed with suspended and filter-immobilized biosensor cells, in aqueous solution and in the gas phase. Gas phase measurements with filter-immobilized biosensor cells in closed flasks, with a naphthalene-contaminated aqueous phase, showed that the biosensor cells can measure naphthalene effectively. The biosensor cells on the filter responded with increasing light output proportional to the naphthalene concentration added to the water phase, even though only a small proportion of the naphthalene was present in the gas phase. In fact, the biosensor cells could concentrate a larger proportion of naphthalene through the gas phase than in the aqueous suspension, probably due to faster transport of naphthalene to the cells in the gas phase. This led to a 10-fold lower detectable aqueous naphthalene concentration (50 nM instead of 0.5 µM). Thus, the use of bacterial biosensors for measuring organic pollutants in the gas phase is a valid method for increasing the sensitivity of these valuable biological devices.
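The gas-water partitioning argument above can be made concrete with a dimensionless Henry constant K = C_gas / C_aq; the sketch below computes the equilibrium fraction of a dissolved compound that ends up in the headspace. The constant and volumes are assumed round values for illustration, not figures from the paper:

```python
def gas_phase_fraction(k_h_cc, v_gas, v_aq):
    """Equilibrium fraction of a compound in the headspace, for a
    dimensionless Henry constant K = C_gas / C_aq:
    mass_gas = K * C_aq * V_gas, mass_aq = C_aq * V_aq."""
    return (k_h_cc * v_gas) / (k_h_cc * v_gas + v_aq)

# Illustrative: naphthalene's dimensionless Henry constant is roughly 0.02
# at room temperature (an assumed round value); 0.5 l headspace over 0.1 l
# of water puts ~9% of the naphthalene into the gas phase.
frac = gas_phase_fraction(0.02, v_gas=0.5, v_aq=0.1)
```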

Abstract:

BACKGROUND: The renal enzyme renin cleaves the decapeptide angiotensin-(1-10) [Ang-(1-10)] from the hepatic alpha(2)-globulin angiotensinogen; Ang-(1-10) is further metabolized to smaller peptides that help maintain cardiovascular homeostasis. The Ang-(1-7) heptapeptide has been reported to have several physiological effects, including natriuresis, diuresis, vasodilation, and release of vasopressin and prostaglandins. METHODS: To investigate Ang-(1-7) in clinical settings, we developed a method to measure immunoreactive (ir-) Ang-(1-7) in 2 mL of human blood and to estimate plasma concentrations by correcting for the hematocrit. A sensitive and specific antiserum against Ang-(1-7) was raised in a rabbit. Human blood was collected in the presence of an inhibitor mixture including a renin inhibitor to prevent peptide generation in vitro. Ang-(1-7) was extracted into ethanol and purified on phenylsilylsilica. The peptide was quantified by radioimmunoassay. Increasing doses of Ang-(1-7) were infused into volunteers, and plasma concentrations of the peptide were measured. RESULTS: The detection limit for plasma ir-Ang-(1-7) was 1 pmol/L. CVs for high and low blood concentrations were 4% and 20%, respectively, and between-assay CVs were 8% and 13%, respectively. Reference values for human plasma concentrations of ir-Ang-(1-7) were 1.0-9.5 pmol/L (median, 4.7 pmol/L) and increased linearly during infusion of increasing doses of Ang-(1-7). CONCLUSIONS: Reliable measurement of plasma ir-Ang-(1-7) is achieved with efficient inhibition of enzymes that generate or metabolize Ang-(1-7) after blood sampling, extraction in ethanol, and purification on phenylsilylsilica, and by use of a specific antiserum.
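Estimating plasma concentration "by correcting for the hematocrit", as described above, is commonly done by dividing the whole-blood concentration by the plasma fraction (1 - hematocrit); the exact correction used in the study is not spelled out, so treat the first function as an assumption. The second function shows the coefficient-of-variation calculation behind the reported assay precision:

```python
import statistics

def plasma_concentration(blood_conc, hematocrit):
    """Estimate plasma concentration from a whole-blood concentration,
    assuming the analyte is confined to the plasma fraction (assumed
    correction, standard but not stated explicitly in the abstract)."""
    return blood_conc / (1.0 - hematocrit)

def cv_percent(values):
    """Coefficient of variation in percent, as used for assay precision."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```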

Abstract:

High-performance liquid chromatography (HPLC) is the reference method for measuring concentrations of antimicrobials in blood. This technique requires careful sample preparation. Protocols using organic solvents and/or solid extraction phases are time-consuming and entail several manipulations, which can lead to partial loss of the determined compound and increased analytical variability. Moreover, to obtain sufficient material for analysis, at least 1 ml of plasma is required. This constraint makes it difficult to determine drug levels when blood sample volumes are limited. However, drugs with low plasma-protein binding can be reliably extracted from plasma by ultra-filtration with a minimal loss due to the protein-bound fraction. This study validated a single-step ultra-filtration method for extracting fluconazole (FLC), a first-line antifungal agent with weak plasma-protein binding, from plasma to determine its concentration by HPLC. Spiked FLC standards and unknowns were prepared in human and rat plasma. Samples (240 µl) were transferred into disposable microtube filtration units containing cellulose or polysulfone filters with a 5 kDa cut-off. After centrifugation for 60 min at 15000 g, FLC concentrations were measured by direct injection of the filtrate into the HPLC. Using cellulose filters, low-molecular-weight proteins eluted early in the chromatogram and were well separated from FLC, which eluted at 8.40 min as a sharp single peak. In contrast, with polysulfone filters several additional peaks interfering with the FLC peak were observed. Moreover, FLC recovery using cellulose filters was higher than with polysulfone filters and had better reproducibility. Cellulose filters were therefore used for the subsequent validation procedure. The quantification limit was 0.195 mg/l. Standard curves with a quadratic regression coefficient ≥ 0.9999 were obtained in the concentration range of 0.195-100 mg/l. 
The inter- and intra-run accuracies and precisions over the clinically relevant concentration range, 1.875-60 mg/l, fell well within the ±15% variation recommended by the current guidelines for the validation of analytical methods. Furthermore, no analytical interference was observed with commonly used antibiotics, antifungals, antivirals and immunosuppressive agents. Ultra-filtration of plasma with cellulose filters permits the extraction of FLC from small volumes (240 µl). The determination of FLC concentrations by HPLC after this single-step procedure is selective, precise and accurate.
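The quadratic calibration curves described above can be sketched as an ordinary least-squares fit of y = a + b*x + c*x^2; this is a generic stand-in for whatever the chromatography software actually does, with invented calibration points:

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x**2 via the normal equations,
    solved by Gauss-Jordan elimination with partial pivoting."""
    s = [sum(x ** k for x in xs) for k in range(5)]          # power sums
    t = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    m = [[s[0], s[1], s[2], t[0]],
         [s[1], s[2], s[3], t[1]],
         [s[2], s[3], s[4], t[2]]]                            # 3x4 augmented
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(3):
            if r != col:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [m[i][3] / m[i][i] for i in range(3)]              # a, b, c

def predict(coeffs, x):
    a, b, c = coeffs
    return a + b * x + c * x * x

# Invented calibration points lying exactly on y = 1 + 2x + 0.5 x^2.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.5, 7.0, 11.5, 17.0]
coeffs = fit_quadratic(xs, ys)
```

In practice the fitted curve is then inverted to read a concentration off a measured peak area; that inverse step is omitted here.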

Abstract:

BACKGROUND: Positional therapy that prevents patients from sleeping supine has been used for many years to manage positional obstructive sleep apnea (OSA). However, patients' usage at home and the long term efficacy of this therapy have never been objectively assessed. METHODS: Sixteen patients with positional OSA who refused or could not tolerate continuous positive airway pressure (CPAP) were enrolled after a test night study (T0) to test the efficacy of the positional therapy device. The patients who had a successful test night were instructed to use the device every night for three months. Nightly usage was monitored by an actigraphic recorder placed inside the positional device. A follow-up night study (T3) was performed after three months of positional therapy. RESULTS: Patients used the device on average 73.7 ± 29.3% (mean ± SD) of the nights for 8.0 ± 2.0 h/night. 10/16 patients used the device more than 80% of the nights. Compared to the baseline (diagnostic) night, mean apnea-hypopnea index (AHI) decreased from 26.7 ± 17.5 to 6.0 ± 3.4 with the positional device (p<0.0001) during T0 night. Oxygen desaturation (3%) index also fell from 18.4 ± 11.1 to 7.1 ± 5.7 (p = 0.001). Time spent supine fell from 42.8 ± 26.2% to 5.8 ± 7.2% (p < 0.0001). At three months (T3), the benefits persisted with no difference in AHI (p = 0.58) or in time spent supine (p = 0.98) compared to T0 night. The Epworth sleepiness scale showed a significant decrease from 9.4 ± 4.5 to 6.6 ± 4.7 (p = 0.02) after three months. CONCLUSIONS: Selected patients with positional OSA can be effectively treated by a positional therapy with an objective compliance of 73.7% of the nights and a persistent efficacy after three months.
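The two headline numbers above, objective compliance and the apnea-hypopnea index (AHI), are simple ratios; the sketch below computes both with invented inputs chosen only to land near the reported magnitudes:

```python
def compliance_pct(nights_used, nights_total):
    """Objective usage from actigraphy: share of nights the device was worn."""
    return 100.0 * nights_used / nights_total

def ahi(apneas, hypopneas, sleep_hours):
    """Apnea-hypopnea index: respiratory events per hour of sleep."""
    return (apneas + hypopneas) / sleep_hours

# Invented example: 66 of 90 monitored nights, 42 events in 7 h of sleep.
usage = compliance_pct(66, 90)     # ~73%
index = ahi(20, 22, 7.0)           # 6 events/h
```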

Abstract:

OBJECTIVE: To assess the cost-utility of an exercise programme vs usual care after functional multidisciplinary rehabilitation in patients with chronic low back pain. DESIGN: Cost-utility analysis alongside a randomized controlled trial. SUBJECTS/PATIENTS: A total of 105 patients with chronic low back pain. METHODS: Chronic low back pain patients completing a 3-week functional multidisciplinary rehabilitation were randomized to either a 3-month exercise programme (n = 56) or usual care (n = 49). The exercise programme consisted of 24 training sessions during 12 weeks. At the end of functional multidisciplinary rehabilitation and at 1-year follow-up quality of life was measured with the SF-36 questionnaire, converted into utilities and transformed into quality-adjusted life years. Direct and indirect monthly costs were measured using cost diaries. The incremental cost-effectiveness ratio was calculated as the incremental cost of the exercise programme divided by the difference in quality-adjusted life years between both groups. RESULTS: Quality of life improved significantly at 1-year follow-up in both groups. Similarly, both groups significantly reduced total monthly costs over time. No significant difference was observed between groups. The incremental cost-effectiveness ratio was 79,270 euros. CONCLUSION: Adding an exercise programme after functional multidisciplinary rehabilitation compared with usual care does not offer significant long-term benefits in quality of life and direct and indirect costs.
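The incremental cost-effectiveness ratio reported above is simply the incremental cost divided by the incremental quality-adjusted life years (QALYs); the sketch below uses invented numbers of a similar order of magnitude, not the trial's data:

```python
def icer(cost_new, cost_ref, qaly_new, qaly_ref):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY.
    Undefined when the QALY difference is ~0 (dominance analysis applies)."""
    d_qaly = qaly_new - qaly_ref
    if abs(d_qaly) < 1e-12:
        raise ValueError("no QALY difference: ICER undefined")
    return (cost_new - cost_ref) / d_qaly

# Hypothetical illustration: 400 euros of extra cost for 0.005 extra QALY
# gives 80'000 euros per QALY, the order of magnitude reported above.
example = icer(1400.0, 1000.0, 0.705, 0.700)
```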