8 results for Blood Alcohol Test Equipment.

in Helda - Digital Repository of the University of Helsinki


Relevance:

100.00%

Publisher:

Abstract:

This thesis focuses on the issue of testing sleepiness quantitatively. The issue is relevant to policymakers concerned with traffic and occupational safety, as such testing provides a tool for safety legislation and surveillance. The findings of this thesis provide guidelines for a posturographic sleepiness tester. Sleepiness ensuing from staying awake for only 17 h impairs our performance as much as the legal blood alcohol concentration limit of 0.5‰ does. Hence, sleepiness is a major risk factor in transportation and occupational accidents. Unlike alcohol intoxication, which can be checked with a simple breath test, impending sleepiness cannot currently be tested, because no convenient commercial sleepiness test exists. Posturography is a potential sleepiness test, since clinical diurnal balance testing suggests the hypothesis that time awake could be estimable posturographically. Relying on this hypothesis, this thesis examines posturographic sleepiness testing for instrumentation purposes. Empirical results from 63 subjects, whose balance we tested with a force platform during sustained wakefulness of up to 36 h, show that sustained wakefulness impairs balance. The results show that time awake is posturographically estimable with 88% accuracy and 97% precision, which validates our hypothesis. The results also show that balance scores tested at 13:30 hours serve as a threshold for detecting excessive sleepiness. Analytical results show that test length has a marked effect on estimation accuracy: 18 s tests suffice to identify sleepiness-related balance changes, but trade off some of the accuracy achieved with 30 s tests. The procedure for estimating time awake relies on equating the subject's test score to a reference table (comprising balance scores tested during sustained wakefulness, regressed against time awake). Empirical results showed that sustained wakefulness explains 60% of the diurnal balance variation, whereas the time of day explains 40%. The latter fact implies that estimates of time awake must also rely on knowing the local times of both the test and the reference scores.
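As an illustration of the estimation procedure described above, the following is a minimal sketch of equating a subject's balance score to a reference regression of score against time awake. The reference values, the linear form of the regression, and the score units are all invented for the example; the thesis's actual scores and model are not reproduced here.

```python
import numpy as np

# Hypothetical reference table: balance scores recorded during sustained
# wakefulness, regressed against time awake. The linear trend and all
# numbers below are assumptions made for illustration only.
reference_time_awake_h = np.arange(0, 37, 3)                   # 0..36 h
reference_balance_score = 1.0 + 0.05 * reference_time_awake_h  # assumed monotone

def estimate_time_awake(test_score: float) -> float:
    """Estimate time awake by equating the subject's test score to the
    reference curve (inverse interpolation along the regression)."""
    return float(np.interp(test_score, reference_balance_score,
                           reference_time_awake_h))

print(estimate_time_awake(1.9))  # ~18 h under these assumed numbers
```

In practice, as the abstract notes, the local times of the test and reference scores would also have to enter the lookup, since time of day accounts for a large share of the balance variation.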

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to examine the trends, incidence, and recidivism of drunken driving during a 20-year period (1988-2007), using data on all suspected drunken-driving cases in this period. Furthermore, the association between social background and drunken driving, and the mortality of drunk drivers, were studied using administrative register data provided by Statistics Finland. The study was completely register-based. In 1989-1991, about 30,000 drivers were suspected of drunken driving every year, but the number fell to less than 20,000 by 1994, during the economic recession. The changes in arrest incidence were especially pronounced in the youngest age groups, most of all among 18-19-year-olds. Even though the incidence among youth decreased dramatically, their incidence rate was still twice that of the general population aged 15-84 years. Drunken driving was associated with a poor social background among youth and among working-aged men and women. For example, a low level of education, unemployment, divorce, and parental factors in youth were associated with a higher risk of being arrested for drunken driving. While a low income was related to more drunken driving among working-aged people, the effect among young persons was the opposite. One in three drunk drivers was rearrested during a 15-year period, while the estimated overall rearrest rate was 44%. Findings of drugs, either alone or in combination with alcohol, increased the risk of rearrest. The highest rearrest rates were seen among drivers who were under the influence of amphetamines or cannabis. Male gender, young age, a high blood alcohol concentration, and arrest on weekdays and in the daytime also predicted rearrest. Compared with the general population, arrested drunk drivers had significant excess mortality. The greatest relative differences were seen in alcohol-related causes of death (including alcohol diseases and alcohol poisoning), accidents, suicides, and violence. Mortality due to diseases other than alcohol-related ones was also elevated among drunk drivers. Drunken driving was associated with multiple factors linked to traffic safety, health, and social problems. Social marginalization may expose a person to harmful use of alcohol and drunken driving, and these associations are seen already among the young. Recidivism is common among drunk drivers, and judging from the high rearrest rates, driving under the influence of illicit and/or medicinal drugs is likely to indicate more severe substance abuse problems. The high alcohol-related mortality in this population shows that drunken driving is clearly an indicator of alcohol abuse. More effective measures for preventing alcohol-related harms are needed than merely preventing convicted drunk drivers from driving again.
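The abstract distinguishes the observed rearrest fraction (one in three) from an estimated rearrest rate (44%), which suggests an estimate adjusted for incomplete follow-up. The thesis's actual method is not stated here; the sketch below shows one standard approach, a Kaplan-Meier estimate on simulated, entirely hypothetical rearrest and censoring times, using the lifelines library.

```python
import numpy as np
from lifelines import KaplanMeierFitter

# Simulated, purely illustrative data: time to rearrest and length of
# follow-up for 5,000 hypothetical drivers.
rng = np.random.default_rng(0)
time_to_rearrest_y = rng.exponential(scale=25.0, size=5000)
follow_up_y = rng.uniform(1.0, 15.0, size=5000)    # censoring times

observed = time_to_rearrest_y <= follow_up_y        # rearrest seen in follow-up
durations = np.minimum(time_to_rearrest_y, follow_up_y)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)
est_rearrest_15y = 1.0 - kmf.predict(15.0)          # censoring-adjusted estimate
print(f"estimated 15-year rearrest rate: {est_rearrest_15y:.0%}")
```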

Relevance:

30.00%

Publisher:

Abstract:

Various reasons, such as ethical issues in maintaining blood resources, growing costs, and strict requirements for safe blood, have increased the pressure for efficient use of resources in blood banking. The competence of blood establishments can be characterized by their ability to predict the volume of blood collection so as to provide cellular blood components in a timely manner as dictated by hospital demand. The stochastically varying clinical need for platelets (PLTs) poses a specific challenge for balancing supply with requests. Labour has been shown to be a primary cost driver and should be managed efficiently. International comparisons of blood banking could reveal inefficiencies and allow reallocation of resources. Seventeen blood centres from 10 countries in continental Europe, Great Britain, and Scandinavia participated in this study. The centres were national institutes (5), parts of the local Red Cross organisation (5), or integrated into university hospitals (7). This study focused on the centres' departments of blood component preparation. The data were obtained retrospectively by computerized questionnaires completed via the Internet for the years 2000-2002. The data were used in four original articles (numbered I through IV) that form the basis of this thesis. Non-parametric data envelopment analysis (DEA, II-IV) was applied to evaluate and compare the relative efficiency of blood component preparation. Several models were created using different input and output combinations. The comparisons focused on technical efficiency (II-III) and labour efficiency (I, IV). An empirical cost model was tested to evaluate cost efficiency (IV). Purchasing power parities (PPP, IV) were used to adjust the costs of the working hours and make the costs comparable among countries. The total annual number of whole blood (WB) collections varied from 8,880 to 290,352 among the centres (I). Significant variation was also observed in the annual volumes of red blood cells (RBCs) and PLTs produced. The annual number of PLTs produced by any method varied from 2,788 to 104,622 units. In 2002, 73% of all PLTs were produced by the buffy coat (BC) method, 23% by apheresis, and 4% by the platelet-rich plasma (PRP) method. The annual discard rate of PLTs varied from 3.9% to 31%. The mean discard rate (13%) remained in the same range throughout the study period and showed similar levels and variation in 2003-2004 according to a specific follow-up question (14%, range 3.8%-24%). The annual PLT discard rates were, to some extent, associated with production volumes. The mean RBC discard rate was 4.5% (range 0.2%-7.7%). Technical efficiency showed marked variation (median 60%, range 41%-100%) among the centres (II). Compared with the efficient departments, the inefficient departments used excess labour resources (and probably excess production equipment) to produce RBCs and PLTs. Technical efficiency tended to be higher when the (theoretical) proportion of lost WB collections (total RBC+PLT loss) out of all collections was low (III). Labour efficiency varied remarkably, from 25% to 100% (median 47%), when working hours were the only input (IV). Using the estimated total costs as the input (cost efficiency) revealed an even greater variation (13%-100%) and an overall lower efficiency level than with labour as the only input. In cost efficiency alone, the savings potential (observed inefficiency) was more than 50% in 10 departments, whereas both the labour and cost savings potentials were more than 50% in six departments. The association between department size and efficiency (scale efficiency) could not be verified statistically in this small sample. In conclusion, international evaluation of the technical efficiency of component preparation departments revealed remarkable variation. A suboptimal combination of manpower and production output levels was the major cause of inefficiency, and efficiency was not directly related to production volume. Evaluating the reasons for discarding components may offer a novel approach to studying efficiency. DEA proved applicable to analyses that include various factors as inputs and outputs. This study suggests that analytical models can be developed to serve as indicators of technical efficiency and to promote improvements in the management of limited resources. The work also demonstrates the importance of integrating efficiency analysis into international comparisons of blood banking.
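The DEA comparison described above can be made concrete with a small sketch. The following implements the standard input-oriented CCR DEA model as a linear program with scipy; the departments, the input (working hours), the outputs (RBC and PLT units), and all numbers are hypothetical stand-ins, not the study's data or its exact model specification.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one input row (annual working hours, x1000) and two
# output rows (RBC and PLT units produced, x1000); one column per department.
X = np.array([[50., 80., 120., 60.]])
Y = np.array([[30., 40.,  50., 45.],
              [ 5.,  6.,   9., 10.]])

def ccr_efficiency(o: int) -> float:
    """Input-oriented CCR efficiency of department o: minimize theta such
    that a weighted combination of peers uses at most theta times o's
    inputs while producing at least o's outputs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                # minimize theta
    A_in = np.hstack([-X[:, [o]], X])          # sum(lam*x) <= theta * x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y])  # sum(lam*y) >= y_o
    b = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b,
                  bounds=[(0, None)] * (n + 1))
    return res.fun                              # efficiency score in (0, 1]

for dept in range(X.shape[1]):
    print(f"department {dept}: technical efficiency {ccr_efficiency(dept):.0%}")
```

A score of 100% marks a department on the efficient frontier; one minus the score corresponds to the savings potential ("observed inefficiency") referred to above.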

Relevance:

30.00%

Publisher:

Abstract:

Placental abruption, one of the most significant causes of perinatal mortality and maternal morbidity, occurs in 0.5-1% of pregnancies. Its etiology is unknown, but defective trophoblastic invasion of the spiral arteries and consequent poor vascularization may play a role. The aims of this study were to define the prepregnancy risk factors of placental abruption, to define the risk factors during the index pregnancy, and to describe the clinical presentation of placental abruption. We also wanted to find a biochemical marker for predicting placental abruption early in pregnancy. Among women delivering at the University Hospital of Helsinki in 1997-2001 (n=46,742), 198 women with placental abruption and 396 control women were identified. The overall incidence of placental abruption was 0.42%. The prepregnancy risk factors were smoking (OR 1.7; 95% CI 1.1, 2.7), uterine malformation (OR 8.1; 1.7, 40), previous cesarean section (OR 1.7; 1.1, 2.8), and a history of placental abruption (OR 4.5; 1.1, 18). The risk factors during the index pregnancy were maternal smoking (adjusted OR 1.8; 95% CI 1.1, 2.9), paternal smoking (2.2; 1.3, 3.6), use of alcohol (2.2; 1.1, 4.4), placenta previa (5.7; 1.4, 23.1), preeclampsia (2.7; 1.3, 5.6), and chorioamnionitis (3.3; 1.0, 10.0). Vaginal bleeding (70%), abdominal pain (51%), bloody amniotic fluid (50%), and fetal heart rate abnormalities (69%) were the most common clinical manifestations of placental abruption. A retroplacental blood clot was seen by ultrasound in 15% of the cases. Neither bleeding nor pain was present in 19% of the cases. Overall, 59% went into preterm labor (OR 12.9; 95% CI 8.3, 19.8), and 91% were delivered by cesarean section (34.7; 20.0, 60.1). Of the newborns, 25% were growth restricted. The perinatal mortality rate was 9.2% (OR 10.1; 95% CI 3.4, 30.1). We then tested selected biochemical markers for the prediction of placental abruption. At 15-16 gestational weeks, the median maternal serum alpha-fetoprotein (MSAFP) level, in multiples of the median (MoM), was significantly higher in the abruption group (1.21, n=57) than in the control group (1.07, n=108) (p=0.004). In multivariate analysis, elevated MSAFP remained an independent risk factor for placental abruption after adjusting for parity ≥ 3, smoking, previous placental abruption, preeclampsia, bleeding in the second or third trimester, and placenta previa. MSAFP ≥ 1.5 MoM had a sensitivity of 29% and a false-positive rate of 10%. The maternal serum free beta-human chorionic gonadotropin MoM levels did not differ between the cases and the controls. None of the angiogenic factors (soluble endoglin, soluble fms-like tyrosine kinase 1, or placental growth factor) showed any difference between the cases (n=42) and the controls (n=50) in the second trimester. The levels of C-reactive protein (CRP) showed no difference between the cases (n=181) and the controls (n=261) (median 2.35 mg/l [interquartile range {IQR} 1.09-5.93] versus 2.28 mg/l [IQR 0.92-5.01], not significant) when tested in the first trimester (mean 10.4 gestational weeks). Chlamydia pneumoniae-specific immunoglobulin G (IgG) and immunoglobulin A (IgA) rates, as well as C. trachomatis-specific IgG, IgA, and chlamydial heat-shock protein 60 antibody rates, were similar between the groups. In conclusion, although univariate analysis identified many prepregnancy risk factors for placental abruption, only smoking, uterine malformation, previous cesarean section, and a history of placental abruption remained significant in multivariate analysis. During the index pregnancy, maternal alcohol consumption, maternal smoking, and smoking by the partner turned out to be the major independent risk factors for placental abruption. Smoking by both partners multiplied the risk. The liberal use of ultrasound examination contributed little to the management of women with placental abruption. Although second-trimester MSAFP levels were higher in women with subsequent placental abruption, the clinical usefulness of this test is limited by its low sensitivity and high false-positive rate. Similarly, angiogenic factors in the early second trimester, CRP levels, and chlamydial antibodies in the first trimester failed to predict placental abruption.
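The reported screening performance of the MSAFP ≥ 1.5 MoM cut-off (29% sensitivity, 10% false-positive rate) can be illustrated with a short sketch. The MoM distributions below are simulated around the published group medians purely for illustration; they are not the study's measurements, and the assumed spreads are arbitrary.

```python
import numpy as np

# Simulated MSAFP MoM values around the reported group medians (1.21 for
# abruption cases, 1.07 for controls); the spreads are assumptions.
rng = np.random.default_rng(1)
msafp_cases = rng.lognormal(mean=np.log(1.21), sigma=0.45, size=57)
msafp_controls = rng.lognormal(mean=np.log(1.07), sigma=0.35, size=108)

cutoff = 1.5
sensitivity = np.mean(msafp_cases >= cutoff)             # fraction of cases flagged
false_positive_rate = np.mean(msafp_controls >= cutoff)  # controls flagged
print(f"sensitivity {sensitivity:.0%}, false-positive rate {false_positive_rate:.0%}")
```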

Relevance:

30.00%

Publisher:

Abstract:

Infection is a major cause of mortality and morbidity after thoracic organ transplantation. The aim of the present study was to evaluate infectious complications after lung and heart transplantation, with special emphasis on the usefulness of bronchoscopy and on the demonstration of cytomegalovirus (CMV), human herpesvirus (HHV)-6, and HHV-7. We reviewed all consecutive bronchoscopies performed on heart transplant recipients (HTRs) from May 1988 to December 2001 (n=44) and on lung transplant recipients (LTRs) from February 1994 to November 2002 (n=472). To compare different assays for the detection of CMV, a total of 21 thoracic organ transplant recipients were prospectively monitored with CMV pp65 antigenemia, DNAemia (PCR), and mRNAemia (NASBA) tests. The antigenemia test was the reference assay for therapeutic intervention. In addition to CMV antigenemia, 22 LTRs were monitored for HHV-6 and HHV-7 antigenemia. The diagnostic yield of clinically indicated bronchoscopies was 41% in the HTRs and 61% in the LTRs. The utility of bronchoscopy was highest from one to six months after transplantation. In contrast, the findings from surveillance bronchoscopies performed on LTRs led to a change in the previous treatment in only 6% of the cases. Pneumocystis carinii and CMV were the most commonly detected pathogens. Furthermore, 15 (65%) of the P. carinii infections in the LTRs were detected during chemoprophylaxis. None of the complications of the bronchoscopies was fatal. Antigenemia, DNAemia, and mRNAemia were present in 98%, 72%, and 43% of the CMV infections, respectively. The optimal DNAemia cut-off levels (sensitivity/specificity) were 400 (75.9%/92.7%), 850 (91.3%/91.3%), and 1250 (100%/91.5%) copies/ml for antigenemia levels of 2, 5, and 10 pp65-positive leukocytes per 50,000 leukocytes, respectively. The sensitivities of NASBA at the same cut-off levels were 25.9%, 43.5%, and 56.3%. CMV DNAemia was detected in 93% and mRNAemia in 61% of the CMV antigenemias requiring antiviral therapy. HHV-6, HHV-7, and CMV antigenemia were detected in 20 (91%), 11 (50%), and 12 (55%) of the 22 LTRs (at a median of 16, 31, and 165 days, respectively). HHV-6 appeared in 15 (79%), HHV-7 in seven (37%), and CMV in one (7%) of these patients during ganciclovir or valganciclovir prophylaxis. One case of pneumonitis and another of encephalitis were associated with HHV-6. In conclusion, bronchoscopy is a safe and useful diagnostic tool in LTRs and HTRs with a suspected respiratory infection, but the role of surveillance bronchoscopy in LTRs remains controversial. The PCR assay performs comparably to the antigenemia test in guiding pre-emptive therapy against CMV when threshold levels above 5 pp65-antigen-positive leukocytes are used. In contrast, the low sensitivity of NASBA limits its usefulness. HHV-6 and HHV-7 activation is common after lung transplantation despite ganciclovir or valganciclovir prophylaxis, but clinical manifestations are infrequently linked to these viruses.
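The cut-off analysis above pairs each DNAemia threshold with an antigenemia threshold and reports sensitivity/specificity against the antigenemia reference assay. Below is a minimal sketch of that computation on invented paired measurements, not the study's data.

```python
import numpy as np

# Invented paired measurements for ten hypothetical samples.
dna_copies = np.array([0, 120, 400, 900, 1500, 300, 2500, 50, 800, 1300])
pp65_pos_leuk = np.array([0, 1, 3, 6, 12, 1, 20, 0, 5, 9])  # per 50,000 leukocytes

def sens_spec(dna_cutoff: float, antigenemia_cutoff: int):
    """Sensitivity/specificity of a DNAemia cut-off, taking the
    antigenemia assay as the reference for true positivity."""
    truth = pp65_pos_leuk >= antigenemia_cutoff
    test = dna_copies >= dna_cutoff
    return np.mean(test[truth]), np.mean(~test[~truth])

print(sens_spec(dna_cutoff=850, antigenemia_cutoff=5))
```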

Relevance:

30.00%

Publisher:

Abstract:

Cord blood is a well-established alternative to bone marrow and peripheral blood stem cells for transplantation. To date, over 400,000 unrelated-donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified. In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of hematopoietic progenitors (CD34+ cells and colony-forming units) in the cord blood unit; this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in selecting cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as part of the evaluation of cord blood collections for banking. The quality of cord blood units has been the focus of several recent studies; however, hemostasis activation during cord blood collection is rarely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), an indicator of platelet activation. Altogether, three sample series were collected: during the set-up of the cord blood bank, and again after changes in personnel and collection equipment. Activation decreased from the first series to the subsequent ones, which were collected with the bank fully in operation and following international standards, and it reached a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell content, it should be minimized. The assessment of hemostasis activation could be implemented as part of process control in cord blood banks. Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and their scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored, thawed, unseparated cord blood units without exogenous erythropoietin. This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.
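The selection models mentioned above combine platelet and perinatal factors to predict progenitor content. The sketch below fits a plain linear regression on simulated data to show the idea; the predictors match those named in the abstract, but the model form, coefficients, and all values are assumptions, not the thesis's models.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simulated data for 167 hypothetical collections.
rng = np.random.default_rng(2)
n = 167
mpv_fl = rng.normal(10.5, 1.0, n)        # mean platelet volume, fl
ua_ph = rng.normal(7.27, 0.07, n)        # umbilical arterial pH
placenta_g = rng.normal(620, 120, n)     # placental weight, g
cd34_millions = (2.0 * mpv_fl - 30.0 * ua_ph + 0.005 * placenta_g
                 + rng.normal(0, 2.0, n))  # arbitrary synthetic relation

X = np.column_stack([mpv_fl, ua_ph, placenta_g])
model = LinearRegression().fit(X, cd34_millions)

# Rank collections by predicted progenitor content and keep the top ones
# for banking, mirroring the selection use case described above.
top_units = np.argsort(model.predict(X))[::-1][:20]
print(top_units)
```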

Relevance:

30.00%

Publisher:

Abstract:

Hypertension is one of the major risk factors for cardiovascular morbidity. The advantages of antihypertensive therapy have been clearly demonstrated, but only about 30% of hypertensive patients have their blood pressure (BP) controlled by such treatment. One reason for this poor BP control may be the difficulty of predicting the BP response to antihypertensive treatment. The average BP reduction achieved is similar for drugs in each of the main classes of antihypertensive agents, but there is marked individual variation in BP responses to any given drug. The purpose of the present study was to examine the BP response to four different antihypertensive monotherapies with regard to demographic characteristics, laboratory test results, and common genetic polymorphisms. The subjects of the present study were participants in the pharmacogenetic GENRES Study. A total of 208 subjects completed the whole study protocol, comprising four four-week drug treatment periods separated by four-week placebo periods. The study drugs were amlodipine, bisoprolol, hydrochlorothiazide, and losartan. Both office (OBP) and 24-hour ambulatory blood pressure (ABP) measurements were carried out. BP responses to the study drugs were related to basic clinical characteristics, pretreatment laboratory test results, and common polymorphisms in genes coding for components of the renin-angiotensin system, alpha-adducin (ADD1), the beta1-adrenergic receptor (ADRB1), and the beta2-adrenergic receptor (ADRB2). Age correlated positively with BP responses to amlodipine and with OBP and systolic ABP responses to hydrochlorothiazide, while body mass index correlated negatively with ABP responses to amlodipine. Of the laboratory test results, plasma renin activity (PRA) correlated positively with BP responses to losartan and with ABP responses to bisoprolol, and negatively with ABP responses to hydrochlorothiazide. A finding unique to this study was that the serum total calcium level correlated negatively with BP responses to amlodipine, while the serum total cholesterol level correlated negatively with ABP responses to amlodipine. There were no significant associations of the angiotensin II type 1 receptor 1166A/C, angiotensin-converting enzyme I/D, angiotensinogen Met235Thr, ADD1 Gly460Trp, ADRB1 Ser49Gly and Gly389Arg, or ADRB2 Arg16Gly and Gln27Glu polymorphisms with BP responses to the study drugs. In conclusion, this study confirmed the relationship between pretreatment PRA levels and the response to three classes of antihypertensive drugs. This study is the first to note a significant inverse relation between the serum calcium level and responsiveness to a calcium channel blocker. However, it could not replicate earlier observations that common polymorphisms in the angiotensin II type 1 receptor, angiotensin-converting enzyme, angiotensinogen, ADD1, ADRB1, or ADRB2 genes can predict the BP response to antihypertensive drugs.
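The response analyses above are correlational. As a small illustration of the kind of relationship reported (e.g., pretreatment PRA versus the BP response to losartan), the sketch below computes a rank correlation on simulated values; the data, the effect size, and even the choice of Spearman correlation are assumptions, since the study's statistical methods are not described in the abstract.

```python
import numpy as np
from scipy.stats import spearmanr

# Simulated values for 208 hypothetical subjects.
rng = np.random.default_rng(3)
pra = rng.lognormal(mean=0.0, sigma=0.6, size=208)            # pretreatment PRA
bp_response_mmhg = 5.0 * np.log(pra) + rng.normal(0, 4, 208)  # synthetic link

rho, p_value = spearmanr(pra, bp_response_mmhg)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3g}")
```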

Relevance:

30.00%

Publisher:

Abstract:

The blood-brain barrier (BBB) is a unique barrier that strictly regulates the entry of endogenous substrates and xenobiotics into the brain. This is due to its tight junctions and to the array of transporters and metabolic enzymes expressed there. Determining brain concentrations in vivo is difficult, laborious, and expensive, which has created interest in developing predictive tools of brain distribution. Predicting brain concentrations is important even in early drug development, to ensure the efficacy of central nervous system (CNS)-targeted drugs and the safety of non-CNS drugs. The literature review covers the most common current in vitro, in vivo, and in silico methods of studying transport into the brain, concentrating on transporter effects. The consequences of efflux mediated by P-glycoprotein, the most widely characterized transporter expressed at the BBB, are also discussed. The aim of the experimental study was to build a pharmacokinetic (PK) model that describes P-glycoprotein substrate drug concentrations in the brain using commonly measured in vivo parameters of brain distribution. The possibility of replacing in vivo parameter values with their in vitro counterparts was also studied. All data for the study were taken from the literature. A simple 2-compartment PK model was built using the Stella™ software. Brain concentrations of morphine, loperamide, and quinidine were simulated and compared with published studies. The correlation of in vitro measured efflux ratios (ER) across different studies was evaluated, as was the correlation between in vitro and in vivo measured ER. A Stella™ model was also constructed to simulate an in vitro transcellular monolayer experiment, to study the sensitivity of the measured ER to changes in passive permeability and in the Michaelis-Menten kinetic parameter values. Interspecies differences between rats and mice were investigated with regard to brain permeability and drug binding in brain tissue. Although the PK brain model was able to capture the concentration-time profiles of all three compounds in both brain and plasma, it performed fairly well only for morphine: it underestimated the brain concentrations of quinidine and overestimated those of loperamide. Because the ratio of concentrations in brain and blood depends on the ER, the variable values reported for this parameter, and its inaccuracy, could be one explanation for the failure of the predictions. Validation of the model with more compounds is needed before further conclusions can be drawn. In vitro ER showed variable correlation between studies, indicating variability due to experimental factors such as the test concentration, but overall the differences were small. The good correlation between in vitro and in vivo ER at low concentrations supports the use of in vitro ER in the PK model. The in vitro simulation illustrated that, in the simulation setting, efflux is significant only when passive permeability is low, which highlights that the cell model used to measure ER must have sufficiently low paracellular permeability to correctly mimic the in vivo situation.
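The 2-compartment model described above was built in Stella™, which is not reproduced here; the following is a minimal sketch of the same idea in Python: a plasma and a brain compartment with passive exchange across the BBB and a P-glycoprotein-like efflux term scaled by the efflux ratio. All parameter values, and the simple linear (non-saturable) efflux, are illustrative assumptions rather than the thesis's model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (assumed, not from the thesis).
CL = 1.0              # systemic clearance from plasma, L/h
V_p, V_b = 10.0, 1.0  # plasma and brain compartment volumes, L
PS = 0.5              # passive permeability-surface area product, L/h
ER = 4.0              # efflux ratio scaling brain-to-plasma efflux

def rates(t, y):
    """Amounts y = [A_plasma, A_brain]; passive influx plus ER-scaled efflux."""
    c_p, c_b = y[0] / V_p, y[1] / V_b
    passive_in = PS * c_p        # plasma -> brain
    efflux_out = PS * ER * c_b   # brain -> plasma
    return [-CL * c_p - passive_in + efflux_out,
            passive_in - efflux_out]

sol = solve_ivp(rates, (0.0, 24.0), y0=[100.0, 0.0],   # 100 mg IV bolus
                t_eval=np.linspace(0.0, 24.0, 97))
kp_brain = (sol.y[1][-1] / V_b) / (sol.y[0][-1] / V_p)
print(f"brain-to-plasma ratio at 24 h: {kp_brain:.3f}")  # near 1/ER
```

At distributional equilibrium the brain-to-plasma concentration ratio approaches 1/ER, which captures in the simplest form why high efflux keeps brain concentrations below plasma levels, and why an inaccurate ER value propagates directly into the predicted brain concentration.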