973 results for toxic shock syndrome toxin 1


Relevance: 100.00%

Abstract:

To systematically investigate putative causes of non-coronary high-sensitive troponin elevations in patients presenting to a tertiary care emergency department. In this cross-sectional analysis, patients who received serial measurements of high-sensitive troponin T between 1 August 2010 and 31 October 2012 at the Department of Emergency Medicine were included. The following putative causes were considered to be associated with non-acute coronary syndrome-related increases in high-sensitive troponin T: acute pulmonary embolism, renal insufficiency, aortic dissection, heart failure, peri-/myocarditis, strenuous exercise, rhabdomyolysis, cardiotoxic chemotherapy, high-frequency ablation therapy, defibrillator shocks, cardiac infiltrative disorders (e.g., amyloidosis), chest trauma, sepsis, shock, exacerbation of chronic obstructive pulmonary disease, and diabetic ketoacidosis. During the study period, a total of 1,573 patients received serial measurements of high-sensitive troponin T. Of these, 175 patients were found to have acute coronary syndrome, leaving 1,398 patients for inclusion in the study. In 222 patients (30%), the observed elevation in high-sensitive troponin T could not be attributed to any putative cause described in the literature. The most commonly encountered mechanism underlying the troponin T elevation was renal insufficiency, present in 286 patients (57%), followed by cerebral ischemia in 95 patients (19%), trauma in 75 patients (15%) and heart failure in 41 patients (8%). Non-acute coronary syndrome-associated elevation of high-sensitive troponin T levels is commonly observed in the emergency department. Renal insufficiency and acute cerebral events are the most common conditions associated with high-sensitive troponin T elevation.

Relevance: 100.00%

Abstract:

The close similarity of various physiological parameters makes the pig one of the preferred animal models for the study of human diseases, especially those involving the cardiovascular system. Unfortunately, the use of pig models to study diseases such as viral hemorrhagic fevers and endotoxic shock syndrome has been hampered by the lack of the immunological tools needed to measure important immunoregulatory cytokines such as tumor necrosis factor (TNF). Here we describe a TNF bioassay based on the porcine kidney cell line PK(15). Compared to the widely used murine fibroblastoid cell line L929, the PK(15) cell line displays a 100-1000-fold higher sensitivity for porcine TNF-alpha, a higher sensitivity for human TNF-alpha, and a slightly lower sensitivity for murine TNF-alpha. Using the PK(15) bioassay we can detect recombinant TNF-alpha as well as cytotoxic activity in the supernatants of lipopolysaccharide (LPS)-activated porcine monocytes at high dilutions. This suggests that the sensitivity of the test should permit the detection of TNF in biological specimens such as pig serum.

Relevance: 100.00%

Abstract:

INTRODUCTION Vertigo and dizziness are common neurological symptoms in general practice. Most patients have benign peripheral vestibular disorders, but some have dangerous central causes. Recent research has shown that bedside oculomotor examinations accurately discriminate central from peripheral lesions in those with new, acute, continuous vertigo/dizziness with nausea/vomiting, gait unsteadiness, and nystagmus, known as the acute vestibular syndrome. CASE REPORT A 56-year-old man presented to the emergency department with acute vestibular syndrome for 1 week. The patient had no focal neurological symptoms or signs. The presence of direction-fixed, horizontal nystagmus suppressed by visual fixation without vertical ocular misalignment (skew deviation) was consistent with an acute peripheral vestibulopathy, but bilaterally normal vestibuloocular reflexes, confirmed by quantitative horizontal head impulse testing, strongly indicated a central localization. Because of a long delay in care, the patient left the emergency department without treatment. He returned 1 week later with progressive gait disturbance, limb ataxia, myoclonus, and new cognitive deficits. His subsequent course included a rapid neurological decline culminating in home hospice placement and death within 1 month. Magnetic resonance imaging revealed restricted diffusion involving the basal ganglia and cerebral cortex. Spinal fluid 14-3-3 protein was elevated. The rapidly progressive clinical course with dementia, ataxia, and myoclonus plus corroborative neuroimaging and spinal fluid findings confirmed a clinicoradiographic diagnosis of Creutzfeldt-Jakob disease. CONCLUSIONS To our knowledge, this is the first report of an initial presentation of Creutzfeldt-Jakob disease closely mimicking vestibular neuritis, expanding the known clinical spectrum of prion disease presentations. Despite the initial absence of neurological signs, the central lesion location was differentiated from a benign peripheral vestibulopathy at the first visit using simple bedside vestibular tests. Familiarity with these tests could help providers prevent initial misdiagnosis of important central disorders in patients presenting with vertigo or dizziness.

Relevance: 100.00%

Abstract:

The ATLS program of the American College of Surgeons is probably the most important globally active training organization dedicated to improving trauma management. Detection of acute haemorrhagic shock is one of the key issues in clinical practice and thus also in medical teaching. In this issue of the journal, William Schulz and Ian McConachrie critically review the ATLS shock classification (Table 1), which has been criticized after several attempts at validation have failed [1]. The main problem is that distinct ranges of heart rate are related to ranges of uncompensated blood loss, and that the heart rate decrease observed in severe haemorrhagic shock is ignored [2].

Table 1. Estimated blood loss based on the patient's initial presentation (ATLS Student Course Manual, 9th Edition, American College of Surgeons 2012).

                             Class I           Class II         Class III              Class IV
Blood loss (ml)              Up to 750         750–1500         1500–2000              >2000
Blood loss (% blood volume)  Up to 15%         15–30%           30–40%                 >40%
Pulse rate (bpm)             <100              100–120          120–140                >140
Systolic blood pressure      Normal            Normal           Decreased              Decreased
Pulse pressure               Normal or ↑       Decreased        Decreased              Decreased
Respiratory rate             14–20             20–30            30–40                  >35
Urine output (ml/h)          >30               20–30            5–15                   Negligible
CNS/mental status            Slightly anxious  Mildly anxious   Anxious, confused      Confused, lethargic
Initial fluid replacement    Crystalloid       Crystalloid      Crystalloid and blood  Crystalloid and blood

In a retrospective evaluation of the Trauma Audit and Research Network (TARN) database, blood loss was estimated from the injuries in nearly 165,000 adult trauma patients, and each patient was allocated to one of the four ATLS shock classes [3]. Although heart rate increased and systolic blood pressure decreased from class I to class IV, respiratory rate and GCS were similar. The median heart rate in class IV patients was substantially lower than the value of 140 min⁻¹ postulated by ATLS.
Moreover, deterioration of the different parameters does not necessarily occur in parallel, as suggested by the ATLS shock classification [4] and [5]. In all these studies, injury severity score (ISS) and mortality increased with increasing shock class [3] and with increasing heart rate and decreasing blood pressure [4] and [5]. This supports the general concept that the higher the heart rate and the lower the blood pressure, the sicker the patient. A prospective study attempted to validate a shock classification derived from the ATLS shock classes [6]. The authors used a combination of heart rate, blood pressure, clinically estimated blood loss and response to fluid resuscitation to classify trauma patients (Table 2) [6]. In their initial assessment of 715 predominantly blunt trauma patients, 78% were classified as normal (Class 0), 14% as Class I, 6% as Class II, and only 1% each as Class III and Class IV. This corresponds to the results of the previous retrospective studies [4] and [5]. The main endpoint used in the prospective study was therefore the presence or absence of significant haemorrhage, defined as chest tube drainage >500 ml, evidence of >500 ml of blood loss into the peritoneum, retroperitoneum or pelvic cavity on CT scan, or requirement of any blood transfusion or of >2000 ml of crystalloid. Because of the low prevalence of class II or higher grades, statistical evaluation was limited to a comparison between Class 0 and Classes I–IV combined. As in the retrospective studies, Lawton did not find a statistical difference in heart rate or blood pressure among the five groups either, although there was a tendency towards a higher heart rate in Class II patients. Apparently, classification during the primary survey did not rely on vital signs but considered the rather soft criterion of "clinical estimation of blood loss" and the requirement of fluid substitution.
This suggests that allocation of an individual patient to a shock class was probably more an intuitive decision than an objective calculation of the shock classification. Nevertheless, it was a significant predictor of ISS [6].

Table 2. Shock grade categories in the prospective validation study (Lawton, 2014) [6].

                                   Normal (No haemorrhage)  Class I (Mild)                  Class II (Moderate)             Class III (Severe)               Class IV (Moribund)
Vitals                             Normal                   Normal                          HR >100 with SBP >90 mmHg       SBP <90 mmHg                     SBP <90 mmHg or imminent arrest
Response to fluid bolus (1000 ml)  NA                       Yes, no further fluid required  Yes, no further fluid required  Requires repeated fluid boluses  Declining SBP despite fluid boluses
Estimated blood loss (ml)          None                     Up to 750                       750–1500                        1500–2000                        >2000

What does this mean for clinical practice and medical teaching? All these studies illustrate the difficulty of validating a useful and accepted general physiological concept of the organism's response to fluid loss: decrease in cardiac output, increase in heart rate and decrease in pulse pressure occur first; hypotension and bradycardia occur only later. An increasing heart rate, increasing diastolic blood pressure or decreasing systolic blood pressure should make any clinician consider hypovolaemia first, because it is treatable and deterioration of the patient is preventable. This is true for the patient on the ward, the sedated patient in the intensive care unit, and the anesthetized patient in the OR. We will therefore continue to teach this typical pattern but will also continue to mention the exceptions and pitfalls at a second stage. The shock classification of ATLS is primarily used to illustrate the typical pattern of acute haemorrhagic shock (tachycardia and hypotension) as opposed to the Cushing reflex (bradycardia and hypertension) in severe head injury and intracranial hypertension, or to neurogenic shock in acute tetraplegia or high paraplegia (relative bradycardia and hypotension).
Schulz and McConachrie nicely summarize the various confounders and exceptions to the general pattern and explain why, in clinical reality, patients often do not present with the "typical" pictures of our textbooks [1]. ATLS also refers to the pitfalls in the signs of acute haemorrhage: advanced age, athletes, pregnancy, medications and pacemakers, and explicitly states that individual subjects may not follow the general pattern. Obviously, the ATLS shock classification, which is the basis for a number of questions in the written test of the ATLS student course and which has been used for decades, probably needs modification and cannot be applied literally in clinical practice. The European Trauma Course, another important trauma training program, uses the same parameters to estimate blood loss, together with the clinical examination and laboratory findings (e.g. base deficit and lactate), but does not use a shock classification tied to absolute values. In conclusion, the typical physiological response to haemorrhage as illustrated by the ATLS shock classes remains an important issue in clinical practice and in teaching. The estimation of the severity of haemorrhage in the initial assessment of trauma patients is not (and never was) based solely on vital signs, but includes the pattern of injuries, the requirement for fluid substitution and potential confounders. Vital signs are not obsolete, especially in the course of treatment, but must be interpreted in view of the clinical context. Conflict of interest: none declared. The author is a member of the Swiss national ATLS core faculty.
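The blood-loss bands of the ATLS classification (Table 1) amount to a simple threshold lookup. As a minimal sketch (the function name and the use of estimated blood loss alone are our simplification for illustration, not part of ATLS, which combines several parameters):

```python
def atls_class(blood_loss_ml: float) -> int:
    """Map estimated blood loss (ml, adult ~70 kg patient) to the
    ATLS shock class (1-4) using the Table 1 bands; each band's
    upper bound is treated as inclusive."""
    if blood_loss_ml <= 750:
        return 1
    elif blood_loss_ml <= 1500:
        return 2
    elif blood_loss_ml <= 2000:
        return 3
    return 4

atls_class(1200)  # -> 2 (Class II: 750-1500 ml)
```

The criticism reviewed above is precisely that such a clean mapping from blood loss to vital-sign ranges does not survive validation against registry data.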

Relevance: 100.00%

Abstract:

Between 1999 and 2011, 4,178 suspected dengue cases in children less than 18 months of age were reported to the Centers for Disease Control and Prevention Dengue Branch in Puerto Rico. Of these, 813 were determined to be laboratory-positive and 737 laboratory-negative; the remainder were laboratory-indeterminate, not processed, or positive for Leptospira. On average, 63 laboratory-positive cases were reported per year. Laboratory-positive cases had a median age of 8.5 months. Among these cases, the median age for those with dengue fever was 8.7 months and 7.9 months for dengue hemorrhagic fever. Clinical signs and symptoms indicative of dengue were greatest among laboratory-positive cases and included fever, rash, thrombocytopenia, bleeding manifestations, and petechiae. The most common symptoms among patients who were laboratory-negative were fever, nasal congestion, cough, diarrhea, and vomiting. Using the 1997 WHO guidelines, nearly 50% of the laboratory-positive cases met the case definition for dengue fever, and 61 of these were further determined to meet the case definition for dengue hemorrhagic fever. In comparison, 15% of laboratory-negative cases met the case definition for dengue fever and less than 1% for dengue hemorrhagic fever. None of the laboratory-positive or laboratory-negative cases met the criteria for dengue shock syndrome.

Relevance: 100.00%

Abstract:

This project identified a novel family of six 66-68 residue peptides from the venom of two Australian funnel-web spiders, Hadronyche sp. 20 and H. infensa: Orchid Beach (Hexathelidae: Atracinae), that appear to undergo N- and/or C-terminal post-translational modifications and conform to an ancestral protein fold. These peptides all show significant amino acid sequence homology to atracotoxin-Hvf17 (ACTX-Hvf17), a non-toxic peptide isolated from the venom of H. versuta, and to a variety of AVIT family proteins including mamba intestinal toxin 1 (MIT1) and its mammalian and piscine orthologs prokineticin 1 (PK1) and prokineticin 2 (PK2). These AVIT family proteins target prokineticin receptors involved in the sensitization of nociceptors and gastrointestinal smooth muscle activation. Given their sequence homology to MIT1, we have named these spider venom peptides the MIT-like atracotoxin (ACTX) family. Using isolated rat stomach fundus or guinea-pig ileum organ bath preparations, we have shown that the prototypical ACTX-Hvf17, at concentrations up to 1 μM, did not stimulate smooth muscle contractility, nor did it inhibit contractions induced by human PK1 (hPK1). The peptide also lacked activity on other isolated smooth muscle preparations, including rat aorta. Furthermore, a FLIPR Ca2+ flux assay using HEK293 cells expressing prokineticin receptors showed that ACTX-Hvf17 fails to activate or block hPK1 or hPK2 receptors. Therefore, while the MIT-like ACTX family appears to adopt the ancestral disulfide-directed beta-hairpin protein fold of MIT1, a motif believed to be shared by other AVIT family peptides, variations in the amino acid sequence and surface charge result in a loss of activity on prokineticin receptors. (c) 2005 Elsevier Inc. All rights reserved.

Relevance: 100.00%

Abstract:

Dengue is an important vector-borne virus that infects on the order of 400 million individuals per year. Infection with one of the virus's four serotypes (denoted DENV-1 to 4) may be silent, result in symptomatic dengue 'breakbone' fever, or develop into the more severe dengue hemorrhagic fever/dengue shock syndrome (DHF/DSS). Extensive research has therefore focused on identifying factors that influence dengue infection outcomes. It has been well-documented through epidemiological studies that DHF is most likely to result from a secondary heterologous infection, and that individuals experiencing a DENV-2 or DENV-3 infection typically are more likely to present with more severe dengue disease than those individuals experiencing a DENV-1 or DENV-4 infection. However, a mechanistic understanding of how these risk factors affect disease outcomes, and further, how the virus's ability to evolve these mechanisms will affect disease severity patterns over time, is lacking. In the second chapter of my dissertation, I formulate mechanistic mathematical models of primary and secondary dengue infections that describe how the dengue virus interacts with the immune response and the results of this interaction on the risk of developing severe dengue disease. I show that only the innate immune response is needed to reproduce characteristic features of a primary infection whereas the adaptive immune response is needed to reproduce characteristic features of a secondary dengue infection. I then add to these models a quantitative measure of disease severity that assumes immunopathology, and analyze the effectiveness of virological indicators of disease severity. In the third chapter of my dissertation, I then statistically fit these mathematical models to viral load data of dengue patients to understand the mechanisms that drive variation in viral load. 
I specifically consider the roles that immune status, clinical disease manifestation, and serotype may play in explaining viral load variation observed across the patients. With this analysis, I show that there is statistical support for the theory of antibody dependent enhancement in the development of severe disease in secondary dengue infections and that there is statistical support for serotype-specific differences in viral infectivity rates, with infectivity rates of DENV-2 and DENV-3 exceeding those of DENV-1. In the fourth chapter of my dissertation, I integrate these within-host models with a vector-borne epidemiological model to understand the potential for virulence evolution in dengue. Critically, I show that dengue is expected to evolve towards intermediate virulence, and that the optimal virulence of the virus depends strongly on the number of serotypes that co-circulate. Together, these dissertation chapters show that dengue viral load dynamics provide insight into the within-host mechanisms driving differences in dengue disease patterns and that these mechanisms have important implications for dengue virulence evolution.
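The dissertation's within-host models are not given explicitly in this abstract. A generic target-cell-limited sketch of the kind commonly used for within-host viral load dynamics (all parameter values arbitrary and purely illustrative, not the author's equations) would look like:

```python
# Target-cell-limited within-host viral dynamics, integrated with a
# fixed-step Euler scheme (illustrative only; parameters arbitrary).
def simulate(beta=3e-7, delta=1.0, p=100.0, c=5.0,
             T0=1e7, V0=1.0, dt=0.01, days=10.0):
    T, I, V = T0, 0.0, V0              # target cells, infected cells, virus
    peak = V
    for _ in range(int(days / dt)):
        dT = -beta * T * V             # infection of target cells
        dI = beta * T * V - delta * I  # gain and loss of infected cells
        dV = p * I - c * V             # virion production and clearance
        T += dT * dt
        I += dI * dt
        V += dV * dt
        peak = max(peak, V)            # track peak viral load
    return T, I, V, peak
```

Extensions of this skeleton, with innate and adaptive immune compartments added, are the standard route to the kind of primary- vs secondary-infection comparisons described above.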

Relevance: 100.00%

Abstract:

Objective: To examine associations of public transport system accessibility with walking, obesity, metabolic syndrome and diabetes/impaired glucose regulation. Methods: Associations of public transport accessibility with self-reported walking for transport or recreation and measured biomarkers of chronic disease risk were estimated in 5241 adult residents of 42 randomly selected areas in Australia in 2004/05, drawn from the second wave of a population-based cohort study (AusDiab). Public transport accessibility was objectively measured using an adaptation of the Public Transport Accessibility Levels (PTAL) methodology, comprising both GIS-derived spatial and temporal accessibility measures. Logistic regression models were adjusted for individual- and environmental-level covariates and for clustering within areas. Results: Above-median public transport accessibility was positively associated with a walking time of more than the median 90 min per week (OR=1.28, 95% CI 1.03, 1.60) and with walking above the recommended 150 min per week (OR=1.35, 95% CI 1.11, 1.63). There were no associations of public transport accessibility with obesity (OR=1.05, 95% CI 0.85, 1.30), the metabolic syndrome (OR=1.09, 95% CI 0.91, 1.31) or diabetes/impaired glucose regulation (OR=1.11, 95% CI 0.94, 1.30). Findings were similar for a subgroup reporting no vigorous recreational physical activity. Conclusions: In this Australian sample, public transport accessibility was positively associated with walking at recommended levels, including for people who are not otherwise vigorously active. Significance: Walking is crucial for increasing physical activity levels and population health, as well as for maximising public transport system efficiency. Building evidence on public transport accessibility and walking will enable governments to exploit this important synergy.
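The odds ratios reported above come from logistic regression, where OR = exp(coefficient) and the 95% CI is exp(coefficient ± 1.96 × SE). A small sketch (the coefficient and standard error below are back-calculated so the result matches the reported walking OR of 1.28, 95% CI 1.03–1.60; they are not values from the study):

```python
import math

def odds_ratio_ci(beta: float, se: float, z: float = 1.96):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a (z-based) confidence interval."""
    return (math.exp(beta),          # point estimate
            math.exp(beta - z * se), # lower CI bound
            math.exp(beta + z * se)) # upper CI bound

# Illustrative values chosen to reproduce OR = 1.28 (1.03, 1.60):
or_, lo, hi = odds_ratio_ci(beta=0.247, se=0.113)
```

A non-significant association, such as the obesity result above, is one whose CI spans 1.0.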

Relevance: 100.00%

Abstract:

Background: Hypertension is a major contributor to the global non-communicable disease burden. Family history is an important non-modifiable risk factor for hypertension. The present study aims to describe the influence of family history (FH) on hypertension prevalence and associated metabolic risk factors in a large cohort of South Asian adults, from a nationally representative sample from Sri Lanka. Methods: A cross-sectional survey among 5,000 Sri Lankan adults, evaluating FH at the levels of parents, grandparents, siblings and children. A binary logistic regression analysis was performed in all patients with 'presence of hypertension' as the dichotomous dependent variable and family history in parents, grandparents, siblings and children as binary independent variables. The odds ratios adjusted for confounders (age, gender, body mass index, diabetes, hyperlipidemia and physical activity) are presented below. Results: In all adults, the prevalence of hypertension was significantly higher in patients with a FH (29.3%, n = 572/1,951) than in those without (24.4%, n = 616/2,530) (p < 0.001). Presence of a FH significantly increased the risk of hypertension (OR: 1.29; 95% CI: 1.13–1.47), obesity (OR: 1.36; 95% CI: 1.27–1.45), central obesity (OR: 1.30; 95% CI: 1.22–1.40) and metabolic syndrome (OR: 1.19; 95% CI: 1.08–1.30). In all adults, presence of a family history in parents (OR: 1.28; 95% CI: 1.12–1.48), grandparents (OR: 1.34; 95% CI: 1.20–1.50) and siblings (OR: 1.27; 95% CI: 1.21–1.33) was associated with a significantly increased risk of developing hypertension. Conclusions: Our results show that the prevalence of hypertension was significantly higher in those with a FH of hypertension. FH of hypertension was also associated with the prevalence of obesity, central obesity and metabolic syndrome. Individuals with a FH of hypertension form an easily identifiable group who may benefit from targeted interventions.
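The crude (unadjusted) odds ratio can be recomputed directly from the reported counts (572/1,951 hypertensive with a family history vs 616/2,530 without). The adjusted OR of 1.29 additionally controls for age, gender, BMI and the other listed confounders, so the near-identical crude value here is coincidental:

```python
def crude_odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Crude odds ratio for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

# Hypertension by family history, using the counts reported above:
with_fh_cases, with_fh_total = 572, 1951
no_fh_cases, no_fh_total = 616, 2530
or_crude = crude_odds_ratio(with_fh_cases, with_fh_total - with_fh_cases,
                            no_fh_cases, no_fh_total - no_fh_cases)
# or_crude is approximately 1.29
```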

Relevance: 100.00%

Abstract:

Objectives: To describe longitudinal height, weight, and body mass index changes up to 15 years after childhood liver transplantation. Study design: Retrospective chart review of patients who underwent liver transplant from 1985-2004 was performed. Subjects were age <18 years at transplant, survived ≥5 years, with at least 2 recorded measurements, of which one was ≥5 years post-transplant. Measurements were recorded pre-transplant, 1, 5, 10, and 15 years later. Results: Height and weight data were available in 98 and 104 patients, respectively; 47% were age <2 years at transplant; 58% were Australian, and the rest were from Japan. Height recovery continued for at least 10 years to reach the 26th percentile (Z-score -0.67) 15 years after transplant. Australians had better growth recovery and attained 47th percentile (Z-score -0.06) at 15 years. Weight recovery was most marked in the first year and continued for 15 years even in well-nourished children. Growth impaired and malnourished children at transplant exhibited the best growth, but remained significantly shorter and lighter even 15 years later. No effect of sex or age at transplant was noted on height or weight recovery. Post-transplant factors significantly impact growth recovery and likely caused the dichotomous growth recovery between Australian and Japanese children; 9% (9/98) of patients were overweight on body mass index calculations at 10-15 years but none were obese. Conclusions: After liver transplant, children can expect ongoing height and weight recovery for at least 10-15 years. Growth impairment at transplant and post-transplant care significantly impact long-term growth recovery. Copyright © 2013 Mosby Inc. All rights reserved.
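Percentiles and Z-scores such as those quoted above interconvert through the standard normal CDF. A minimal sketch (Z = -0.67 maps to roughly the 25th percentile, close to the 26th quoted; the small discrepancy presumably reflects rounding or the reference growth charts used):

```python
import math

def z_to_percentile(z: float) -> float:
    """Percentile corresponding to a Z-score under the standard
    normal distribution, computed via the error function."""
    return 50.0 * (1.0 + math.erf(z / math.sqrt(2.0)))

p = z_to_percentile(-0.67)  # roughly 25
```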

Relevance: 100.00%

Abstract:

We report here that a protein species with biochemical and immunological similarity to chicken egg riboflavin carrier protein (RCP) is synthesized and secreted by immature rat Sertoli cells in culture. When quantitated by a specific heterologous radioimmunoassay, optimal concentrations of FSH (25 ng/ml) brought about a 3-fold stimulation of RCP secretion. FSH in the presence of testosterone (10⁻⁶ M) brought about a 6-fold stimulation of RCP secretion over control cultures maintained in the absence of these two factors. The aromatase inhibitor 1,4,6-androstatrien-3,17-dione curtailed 85% of the enhanced secretion of RCP, suggesting that the hormonal stimulation is mediated through in situ synthesized estrogen; this could be confirmed with exogenous estradiol-17β, which brought about a 3-fold enhancement of RCP secretion at a concentration of 10⁻⁶ M. When tamoxifen (10 μM) was added along with FSH and testosterone, there was a 75% decrease in the enhanced secretion of RCP. Addition of this anti-estrogen together with exogenous estradiol resulted in a 55% decrease in the elevated levels of RCP. Cholera toxin (1 μg/ml) and 8-bromo-cyclic AMP (0.5 mM) mimicked the action of FSH on the secretion of RCP, suggesting that FSH stimulation of RCP production may be mediated through cyclic AMP. These findings suggest that estrogen mediates RCP induction in hormonally stimulated Sertoli cells, presumably so that RCP can function as the carrier of riboflavin to the developing germ cells across the blood-testis barrier in rodents.

Relevance: 100.00%

Abstract:

Assessment of the outcome of critical illness is complex. Severity scoring systems and organ dysfunction scores are traditional tools in mortality and morbidity prediction in intensive care. Their ability to explain risk of death is impressive for large cohorts of patients, but insufficient for an individual patient. Although events before intensive care unit (ICU) admission are prognostically important, the prediction models utilize data collected at and just after ICU admission. In addition, several biomarkers have been evaluated to predict mortality, but none has proven entirely useful in clinical practice. Therefore, new prognostic markers of critical illness are vital when evaluating the intensive care outcome. The aim of this dissertation was to investigate new measures and biological markers of critical illness and to evaluate their predictive value and association with mortality and disease severity. The impact of delay in emergency department (ED) on intensive care outcome, measured as hospital mortality and health-related quality of life (HRQoL) at 6 months, was assessed in 1537 consecutive patients admitted to medical ICU. Two new biological markers were investigated in two separate patient populations: in 231 ICU patients and 255 patients with severe sepsis or septic shock. Cell-free plasma DNA is a surrogate marker of apoptosis. Its association with disease severity and mortality rate was evaluated in ICU patients. Next, the predictive value of plasma DNA regarding mortality and its association with the degree of organ dysfunction and disease severity was evaluated in severe sepsis or septic shock. Heme oxygenase-1 (HO-1) is a potential regulator of apoptosis. Finally, HO-1 plasma concentrations and HO-1 gene polymorphisms and their association with outcome were evaluated in ICU patients. The length of ED stay was not associated with outcome of intensive care. 
The hospital mortality rate was significantly lower in patients admitted to the medical ICU from the ED than from the non-ED, and the HRQoL in the critically ill at 6 months was significantly lower than in the age- and sex-matched general population. In the ICU patient population, the maximum plasma DNA concentration measured during the first 96 hours in intensive care correlated significantly with disease severity and degree of organ failure and was independently associated with hospital mortality. In patients with severe sepsis or septic shock, the cell-free plasma DNA concentrations were significantly higher in ICU and hospital nonsurvivors than in survivors and showed a moderate discriminative power regarding ICU mortality. Plasma DNA was an independent predictor for ICU mortality, but not for hospital mortality. The degree of organ dysfunction correlated independently with plasma DNA concentration in severe sepsis and plasma HO-1 concentration in ICU patients. The HO-1 -413T/GT(L)/+99C haplotype was associated with HO-1 plasma levels and frequency of multiple organ dysfunction. Plasma DNA and HO-1 concentrations may support the assessment of outcome or organ failure development in critically ill patients, although their value is limited and requires further evaluation.
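The "discriminative power" of a marker such as plasma DNA is typically summarized as the area under the ROC curve (AUC), which equals the Mann-Whitney probability that a randomly chosen non-survivor has a higher marker value than a randomly chosen survivor. A rank-based sketch with made-up concentrations (not the study's data):

```python
def roc_auc(values_pos, values_neg):
    """ROC AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs in which the positive case has the
    higher marker value; ties count as 0.5."""
    wins = 0.0
    for p in values_pos:
        for n in values_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(values_pos) * len(values_neg))

# Illustrative plasma DNA concentrations (arbitrary units):
nonsurvivors = [9.1, 7.4, 6.8, 5.9]
survivors = [6.2, 4.8, 4.1, 3.5, 2.9]
auc = roc_auc(nonsurvivors, survivors)  # 0.95 for this toy data
```

An AUC of 0.5 means no discrimination; "moderate" discriminative power, as reported for plasma DNA and ICU mortality, typically denotes values around 0.7-0.8.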

Relevance: 100.00%

Abstract:

The application of shock control to transonic airfoils and wings has been widely demonstrated to have the potential to reduce wave drag. Most of the suggested control devices are two-dimensional, that is, of uniform geometry in the spanwise direction. Examples of such techniques include contour bumps and passive control. Recently it has been observed that a spanwise array of discrete three-dimensional controls can have similar benefits but also offer advantages in terms of installation complexity and drag. This paper describes research carried out in Cambridge into various three-dimensional devices, such as slots, grooves and bumps. In all cases the control device is applied to the interaction of a normal shock wave (M=1.3) with a turbulent boundary layer. Theoretical considerations are proposed to determine how such fundamental experiments can provide estimates of control performance on a transonic wing. The potential of each class of three-dimensional device for wave drag reduction on airfoils is discussed, and surface bumps in particular are identified as offering potential drag savings for typical transonic wing applications under cruise conditions.
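For reference, the strength of the M = 1.3 normal shock used in these experiments follows from the standard perfect-gas normal-shock relations (γ = 1.4); these are textbook gas-dynamics relations, not equations from the paper:

```python
def normal_shock(M1: float, gamma: float = 1.4):
    """Static pressure ratio p2/p1 and downstream Mach number M2
    across a normal shock in a perfect gas (Rankine-Hugoniot)."""
    p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (M1**2 - 1.0)
    M2 = ((1.0 + 0.5 * (gamma - 1.0) * M1**2)
          / (gamma * M1**2 - 0.5 * (gamma - 1.0))) ** 0.5
    return p_ratio, M2

p21, M2 = normal_shock(1.3)  # p2/p1 = 1.805, M2 ~ 0.786
```

The roughly 1.8:1 pressure rise explains why boundary-layer separation, and hence control of the shock-boundary-layer interaction, becomes a concern at this Mach number.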

Relevance: 100.00%

Abstract:

An experiment was conducted to induce triploidy in African catfish, Clarias gariepinus, using heat shock and cold shock techniques. Cold shocks at temperatures of 0±1°C and 5±1°C for durations of 15, 30, 45 and 60 min, and heat shocks at temperatures of 40±0.5°C and 41±0.5°C for durations of 1, 2 and 3 min, were applied 5 min after fertilization to induce triploidy. The maximum percentage of triploids (91.4%) was obtained with a heat shock at 40±0.5°C for 1 min, whereas a cold shock at 0±1°C for 60 min yielded 90% triploids. Chromosome analysis revealed that diploids have 54 chromosomes and triploids have 81 chromosomes. The erythrocyte minor-axis and major-axis measurements were 1.17 times larger in treated fish than in controls. The growth studies showed that the growth rate was not significantly affected in triploids.