219 results for peripheral VF loss
Abstract:
Purpose: To investigate the effect of age on the contributions of the anterior cornea and internal components to ocular aberrations in the peripheral visual field. Methods: Ocular aberrations were measured in 10 young emmetropes and 7 older emmetropes using a modified commercial Hartmann-Shack aberrometer across a 42° × 32° central visual field. Anterior corneal aberrations were estimated from anterior corneal topography using theoretical ray-tracing. Internal aberrations were calculated by subtracting anterior corneal aberrations from ocular aberrations. Results: Anterior corneal aberrations of young subjects were reasonably compensated by the internal aberrations, except for astigmatism, for which the internal contribution was small out to the 21° field limit. The internal coma and spherical aberration of the older subjects were considerably smaller in magnitude than those of the young subjects, such that the compensation for anterior corneal aberrations was poorer. This can be explained by age-related changes in the lens shape and refractive index distribution. Conclusion: Loss of balance between anterior corneal and internal components of higher-order aberrations with increasing age, found previously for on-axis vision, applies also to the peripheral visual field.
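The subtraction step in the Methods can be illustrated numerically. The coefficient values below are hypothetical, not data from the study; they simply show how internal aberrations, and a compensation factor for each term, would be derived from ocular and anterior corneal aberrations:

```python
# Hypothetical Zernike-style coefficients (micrometres) for one field
# angle, here ordered [astigmatism, coma, spherical aberration].
ocular = [0.30, -0.12, 0.08]
anterior_cornea = [0.45, -0.25, 0.20]

# Internal aberrations = ocular minus anterior corneal, per the abstract
# (rounded to avoid floating-point noise in the printed values).
internal = [round(o - c, 6) for o, c in zip(ocular, anterior_cornea)]

# Fraction of each corneal term cancelled by the internal optics
# (1.0 would be perfect compensation).
compensation = [-i / c for i, c in zip(internal, anterior_cornea)]
print(internal)        # -> [-0.15, 0.13, -0.12]
print(compensation)
```

With these illustrative numbers the internal optics cancel roughly a third to two thirds of each corneal term; the abstract's finding is that this cancellation weakens with age.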
Abstract:
In this paper, a new comprehensive planning methodology is proposed for implementing distribution network reinforcement. Load growth, voltage profile, distribution line loss, and reliability are considered in this procedure. A time-segmentation technique is employed to reduce the computational load. The options considered range from supporting load growth with the traditional approach of upgrading conventional equipment in the distribution network, through to the use of dispatchable distributed generators (DDGs). The objective function comprises construction cost, loss cost, and reliability cost. As constraints, the bus voltages and feeder currents must be maintained within standard limits, and the DDG output power must not fall below a given ratio of its rated power, for efficiency reasons. A hybrid optimization method, called modified discrete particle swarm optimization, is employed to solve this nonlinear, discrete optimization problem. A comparison is performed between the solution optimized over capacitor placement, tap-changing transformers, and line upgrading alone, and the solution obtained when DDGs are also included in the optimization.
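As a rough sketch of how such a problem is typically posed for a particle swarm optimizer, the fitness of a candidate plan can be written as total cost plus penalties for violated constraints. All names, limits, and weights below are illustrative assumptions, not values from the paper:

```python
# Illustrative-only penalized fitness for the planning problem described
# above. Quantities are in arbitrary per-unit values; the penalty weight
# and limits are assumptions for the sketch.
def fitness(construction_cost, loss_cost, reliability_cost,
            bus_voltages, feeder_currents, ddg_outputs, ddg_rated,
            v_min=0.95, v_max=1.05, i_max=1.0, min_output_ratio=0.3,
            penalty=1e6):
    cost = construction_cost + loss_cost + reliability_cost
    # Bus voltages must stay within the standard band.
    violations = sum(1 for v in bus_voltages if not v_min <= v <= v_max)
    # Feeder currents must not exceed their ratings.
    violations += sum(1 for i in feeder_currents if i > i_max)
    # DDG output must not fall below a ratio of rated power (efficiency).
    violations += sum(1 for p, pr in zip(ddg_outputs, ddg_rated)
                      if p < min_output_ratio * pr)
    return cost + penalty * violations

# A feasible plan scores just its summed costs; any violation adds the
# large penalty, steering the swarm toward feasible regions.
print(fitness(100.0, 20.0, 5.0, [1.0, 0.98], [0.8, 0.9], [40.0], [100.0]))
```

In a discrete PSO, each particle would encode upgrade/placement decisions, and this fitness would be evaluated per particle per iteration.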
Abstract:
Preliminary data are presented on a detailed statistical analysis of k-factor determination for a single class of minerals (amphiboles) which contain a wide range of element concentrations. These amphiboles are homogeneous, contain few (if any) subsolidus microstructures, and can be readily prepared for thin-film analysis. In previous studies, element loss during the period of irradiation has been assumed negligible for the determination of k-factors. Since this phenomenon may be significant for certain mineral systems, we also report on the effect of temperature on k-factor determination for various elements using small probe sizes (approx. 20 nm).
Abstract:
Concepts used in this chapter include: Thermoregulation:- Thermoregulation refers to the body’s sophisticated, multi-system regulation of core body temperature. This hierarchical system extends from highly thermo-sensitive neurons in the preoptic region of the brain proximate to the rostral hypothalamus, down to the brain stem and spinal cord. Coupled with receptors in the skin and spine, both central and peripheral information on body temperature is integrated to inform and activate the homeostatic mechanisms which maintain our core temperature at 37°C.1 Hyperthermia:- An imbalance between the metabolic and external heat accumulated in the body and the loss of heat from the body.2 Exertional heat stroke:- A disorder of excessive heat production coupled with insufficient heat dissipation which occurs in un-acclimated individuals who are engaging in over-exertion in hot and humid conditions. This phenomenon includes central nervous system dysfunction and critical dysfunction to all organ systems including renal, cardiovascular, musculoskeletal and hepatic functions. Non-exertional heat stroke:- In contrast to exertional heatstroke as a consequence of high heat production during strenuous exercise, non-exertional heatstroke results from prolonged exposure to high ambient temperature. The elderly, those with chronic health conditions and children are particularly susceptible.3 Rhabdomyolysis:- An acute, sometimes fatal disease characterised by destruction of skeletal muscle. In exertional heat stroke, rhabdomyolysis occurs in the context of strenuous exercise when mechanical and/or metabolic stress damages the skeletal muscle, causing elevated serum creatine kinase. Associated with this is the potential development of hyperkalemia, myoglobinuria and renal failure. Malignant hyperthermia:- Malignant hyperthermia is “an inherited subclinical myopathy characterised by a hypermetabolic reaction during anaesthesia. 
The reaction is related to skeletal muscle calcium dysregulation triggered by volatile inhaled anaesthetics and/or succinylcholine.”4 Presentation includes skeletal muscle rigidity, mixed metabolic and respiratory acidosis, tachycardia, hyperpyrexia, rhabdomyolysis, hyperkalaemia, elevated serum creatine kinase, multi-organ failure, disseminated intravascular coagulation and death.5
Abstract:
Concepts used in this chapter include: Thermoregulation:- Thermoregulation refers to the body’s sophisticated, multi-system regulation of core body temperature. This hierarchical system extends from highly thermo-sensitive neurons in the preoptic region of the brain proximate to the rostral hypothalamus, down to the brain stem and spinal cord. Coupled with receptors in the skin and spine, both central and peripheral information on body temperature is integrated to inform and activate the homeostatic mechanisms which maintain our core temperature at 37°C.1 Body heat is lost through the skin, via respiration and excretions. The skin is perhaps the most important organ in regulating heat loss. Hypothermia:- Hypothermia is defined as core body temperature less than 35°C and is the result of imbalance between the body’s heat production and heat loss mechanisms. Hypothermia may be accidental, or induced for clinical benefit, e.g. neurological protection (therapeutic hypothermia). External environmental conditions are the most common cause of accidental hypothermia, but not the only cause of hypothermia in humans. Other causes include metabolic imbalance; trauma; neurological and infectious disease; and exposure to toxins such as organophosphates. Therapeutic Hypothermia:- In some circumstances, hypothermia can be induced to protect neurological functioning as a result of the associated decrease in cerebral metabolism and energy consumption. Degenerative processes associated with periods of ischaemia, such as the excitotoxic cascade, apoptotic and necrotic cell death, microglial activation, oxidative stress and inflammation, are averted or minimised.2 Mild hypothermia is the only clinically confirmed effective treatment for improving the neurological outcomes of patients comatose following cardiac arrest.3
Abstract:
Bovine colostrum has been shown to influence the cytokine production of bovine leukocytes. However, it remains unknown whether processed bovine colostrum, a supplement popular among athletes to enhance immune function, is able to modulate cytokine secretion of human lymphocytes and monocytes. The aim of this investigation was to determine the ability of a commercially available bovine colostrum protein concentrate (CPC) to stimulate cytokine production by human peripheral blood mononuclear cells (PBMCs). Blood was sampled from four healthy male endurance athletes who had abstained from exercise for 48 h. PBMCs were separated and cultured with bovine CPC concentrations of 0 (control), 1.25, 2.5, and 5% with and without lipopolysaccharide (LPS) (3 microg/mL) and phytohemagglutinin (PHA) (2.5 microg/mL). Cell supernatants were collected at 6 and 24 h of culture for the determination of tumor necrosis factor (TNF), interferon (IFN)-gamma, interleukin (IL)-10, IL-6, IL-4, and IL-2 concentrations. Bovine CPC significantly stimulated the release of IFN-gamma, IL-10, and IL-2 (p < 0.03). The addition of LPS to PBMCs cocultured with bovine CPC significantly stimulated the release of IL-2 and inhibited the early release of TNF, IL-6, and IL-4 (p < 0.02). Phytohemagglutinin stimulation in combination with bovine CPC significantly increased the secretion of IL-10 and IL-2 at 6 h of culture and inhibited IFN-gamma and TNF (p < 0.05). These data show that a commercial bovine CPC is able to modulate in vitro cytokine production of human PBMCs. Alterations in cytokine secretion may be a potential mechanism for reported benefits associated with supplementation.
Abstract:
There has been a low level of interest in peripheral aberrations and corresponding image quality for over 200 years. Most work has been concerned with the second-order aberrations of defocus and astigmatism that can be corrected with conventional lenses. Studies have found high levels of aberration, often amounting to several dioptres, even in eyes with only small central defocus and astigmatism. My investigations have contributed to understanding shape changes in the eye with increases in myopia, changes in eye optics with ageing, and how surgical interventions intended to correct central refractive errors have unintended effects on peripheral optics. My research group has measured peripheral second- and higher-order aberrations over a 42° horizontal × 32° vertical diameter visual field. There is substantial variation in individual aberrations with age and pathology. While the higher-order aberrations in the periphery are usually small compared with second-order aberrations, they can be substantial and change considerably after refractive surgery. The thrust of my research in the next few years is to understand more about the peripheral aberrations of the human eye, to measure visual performance in the periphery and determine whether this can be improved by adaptive optics correction, to use measurements of peripheral aberrations to learn more about the optics of the eye and in particular the gradient index structure of the lens, and to investigate ways of increasing the size of the field of good retinal image quality.
Abstract:
Mixtures of single odours were used to explore the receptor response profile across individual antennae of Helicoverpa armigera (Hübner) (Lepidoptera: Noctuidae). Seven odours were tested including floral and green-leaf volatiles: phenyl acetaldehyde, benzaldehyde, β-caryophyllene, limonene, α-pinene, 1-hexanol, 3Z-hexenyl acetate. Electroantennograms of responses to paired mixtures of odours showed that there was considerable variation in receptor tuning across the receptor field between individuals. Data from some moth antennae showed no additivity, which indicated a restricted receptor profile. Results from other moth antennae to the same odour mixtures showed a range of partial additivity. This indicated that a wider array of receptor types was present in these moths, with a greater percentage of the receptors tuned exclusively to each odour. Peripheral receptor fields show variation in the spectrum of response within a population (of moths) when exposed to high doses of plant volatiles. This may be related to recorded variation in host choice within moth populations as reported by other authors.
Abstract:
Background Recent evidence has linked induced abortion with later adverse psychiatric outcomes in young women. Aims To examine whether abortion or miscarriage is associated with subsequent psychiatric and substance use disorders. Method A sample (n=1223) of women from a cohort born between 1981 and 1984 in Australia was assessed at 21 years for psychiatric and substance use disorders and lifetime pregnancy histories. Results Young women reporting a pregnancy loss had nearly three times the odds of experiencing a lifetime illicit drug disorder (excluding cannabis): abortion odds ratio (OR)=3.6 (95% CI 2.0–6.7) and miscarriage OR=2.6 (95% CI 1.2–5.4). Abortion was associated with alcohol use disorder (OR=2.1, 95% CI 1.3–3.5) and 12-month depression (OR=1.9, 95% CI 1.1–3.1). Conclusions These findings add to the growing body of evidence suggesting that pregnancy loss per se, whether abortion or miscarriage, increases the risk of a range of substance use disorders and affective disorders in young women.
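The odds ratios above come from standard 2×2 contingency analysis. A minimal sketch of that calculation, using purely illustrative counts rather than the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 table [[a, b], [c, d]] (exposed cases, exposed
    non-cases, unexposed cases, unexposed non-cases) with a Wald 95% CI
    on the log odds ratio."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only (not data from the study):
print(odds_ratio_ci(20, 80, 10, 140))  # -> OR = 3.5, 95% CI ~ (1.56, 7.85)
```

A CI excluding 1.0 (as here, and as in the abstract's estimates) indicates a statistically significant association at the 5% level.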
Abstract:
Between 8 and 20 percent of depression in young men and women aged 18-23 is associated with pregnancy loss, according to a recent analysis of the 30-year Mater Hospital longitudinal study of mothers and children. Dr Kaeleen Dingle from the University of Queensland explains the study and discusses the implications for both men and women.
Abstract:
Recent evidence has linked induced abortion with later adverse psychiatric outcomes in young women. Little is known about later adverse psychiatric outcomes in young men whose partners have fallen pregnant and either go on to have a child, have an abortion or miscarry. 1223 women and 1159 men, from an Australian cohort born between 1981 and 1984, were assessed at 21 years for psychiatric and substance misuse and lifetime pregnancy histories. Young women reporting a pregnancy loss (either miscarriage or abortion) had nearly three times the odds of experiencing an illicit drug disorder (excluding cannabis), and nearly twice the odds of alcohol misuse, compared with never-pregnant women. Young men whose partner had an abortion, but not a miscarriage, had nearly twice the odds of cannabis disorder, illicit drug disorder, and mood disorder compared with men who had never fathered a pregnancy. Young women who have lost a pregnancy have an increased risk of developing alcohol or substance abuse in later life. Young men whose partner aborted a pregnancy had an increased risk of substance abuse and mood disorder in later life. These findings add to the growing body of evidence suggesting that pregnancy loss per se increases the risk of a range of substance use disorders in young women. The findings for young men are novel and raise the possibility that the associations measured may be due to common unmeasured factors associated with early pregnancy in young people rather than pregnancy loss.
Abstract:
We employed a Hidden-Markov-Model (HMM) algorithm in loss of heterozygosity (LOH) analysis of high-density single nucleotide polymorphism (SNP) array data from the Non-Hodgkin’s lymphoma (NHL) entities follicular lymphoma (FL) and diffuse large B-cell lymphoma (DLBCL). This revealed a high frequency of LOH over the chromosomal region 11p11.2, containing the gene encoding the protein tyrosine phosphatase receptor type J (PTPRJ). Although PTPRJ regulates components of key survival pathways in B-cells (i.e., BCR, MAPK, and PI3K signaling), its role in B-cell development is poorly understood. LOH of PTPRJ has been described in several types of cancer but not in any hematological malignancy. Interestingly, FL cases with LOH exhibited down-regulation of PTPRJ; in contrast, no significant variation in expression was seen in DLBCLs. In addition, sequence screening in Exons 5 and 13 of PTPRJ identified the G973A (rs2270993), T1054C (rs2270992), A1182C (rs1566734), and G2971C (rs4752904) coding SNPs (cSNPs). The A1182 allele was significantly more frequent in FLs and in NHLs with LOH. Significant over-representation of the C1054 (rs2270992) and C2971 (rs4752904) alleles was also observed in LOH cases. A haplotype analysis also revealed a significantly lower frequency of haplotype GTCG in NHL cases, but it was only detected in cases with retention. Conversely, haplotype GCAC was over-represented in cases with LOH. Altogether, these results indicate that the inactivation of PTPRJ may be a common lymphomagenic mechanism in these NHL subtypes and that haplotypes in the PTPRJ gene may play a role in susceptibility to NHL, by affecting activation of PTPRJ in these B-cell lymphomas.
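The abstract does not detail the HMM used, but a minimal two-state Viterbi decoder over SNP genotype calls conveys the general idea of HMM-based LOH detection: runs of homozygous calls pull the most likely state path into an "LOH" state. All probabilities here are illustrative assumptions, not parameters from the study:

```python
import math

# Two hidden states; observations are heterozygous ('het') or
# homozygous ('hom') SNP calls. Illustrative parameters only.
states = ("retention", "LOH")
start = {"retention": 0.9, "LOH": 0.1}
trans = {"retention": {"retention": 0.9, "LOH": 0.1},
         "LOH": {"retention": 0.1, "LOH": 0.9}}
# Under LOH, heterozygous calls become rare (genotyping error only).
emit = {"retention": {"het": 0.4, "hom": 0.6},
        "LOH": {"het": 0.01, "hom": 0.99}}

def viterbi(obs):
    """Most likely state path for a sequence of 'het'/'hom' calls."""
    v = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for o in obs[1:]:
        col, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda p: v[-1][p] + math.log(trans[p][s]))
            col[s] = v[-1][best] + math.log(trans[best][s]) + math.log(emit[s][o])
            ptr[s] = best
        v.append(col)
        back.append(ptr)
    path = [max(states, key=lambda s: v[-1][s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

# A long run of homozygous calls flips the path into the LOH state.
print(viterbi(["het"] + ["hom"] * 10 + ["het"]))
```

Real implementations work over many samples and B-allele frequencies rather than binary calls, but the decoding principle is the same.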
Abstract:
Migraine is a debilitating neurovascular disorder with a substantial genetic component. The exact cause of a migraine attack is unknown; however, cortical hyperexcitability is thought to play a role. As Gamma-aminobutyric Acid (GABA) is the major inhibitory neurotransmitter in the brain, malfunctioning of this system may be a cause of the hyperexcitability. To date, there has been limited research examining the gene expression or genetics of GABA receptors in relation to migraine. The aim of our study was to determine whether GABA receptors play a role in migraine by investigating their gene expression profiles in migraine-affected individuals and non-affected controls by Q-PCR. Gene expression of GABA(A) receptor subunit isoforms (GABRA3, GABRB3, GABRQ) and GABA(B) receptor 2 (GABBR2) was quantified in mRNA obtained from peripheral blood leukocytes from 28 migraine subjects and 22 healthy control subjects. Analysis of results showed that two of the tested genes, GABRA3 and GABBR2, were significantly down-regulated in migraineurs (P=0.018; P=0.017) compared with controls. Results from the other tested genes did not show significant gene expression variation. The results indicate that there may be specific GABA receptor gene expression variation in migraine, particularly involving the GABRA3 and GABBR2 genes. This study also identifies GABRA3 and GABBR2 as potential biomarkers for selecting migraineurs who may be more responsive to GABA agonists; further investigation in this area is warranted.
Abstract:
Background Loss of heterozygosity (LOH) is an important marker for one of the 'two-hits' required for tumor suppressor gene inactivation. Traditional methods for mapping LOH regions require the comparison of both tumor and patient-matched normal DNA samples. However, for many archival samples, patient-matched normal DNA is not available leading to the under-utilization of this important resource in LOH studies. Here we describe a new method for LOH analysis that relies on the genome-wide comparison of heterozygosity of single nucleotide polymorphisms (SNPs) between cohorts of cases and un-matched healthy control samples. Regions of LOH are defined by consistent decreases in heterozygosity across a genetic region in the case cohort compared to the control cohort. Methods DNA was collected from 20 Follicular Lymphoma (FL) tumor samples, 20 Diffuse Large B-cell Lymphoma (DLBCL) tumor samples, neoplastic B-cells of 10 B-cell Chronic Lymphocytic Leukemia (B-CLL) patients and Buccal cell samples matched to 4 of these B-CLL patients. The cohort heterozygosity comparison method was developed and validated using LOH derived in a small cohort of B-CLL by traditional comparisons of tumor and normal DNA samples, and compared to the only alternative method for LOH analysis without patient matched controls. LOH candidate regions were then generated for enlarged cohorts of B-CLL, FL and DLBCL samples using our cohort heterozygosity comparison method in order to evaluate potential LOH candidate regions in these non-Hodgkin's lymphoma tumor subtypes. Results Using a small cohort of B-CLL samples with patient-matched normal DNA we have validated the utility of this method and shown that it displays more accuracy and sensitivity in detecting LOH candidate regions compared to the only alternative method, the Hidden Markov Model (HMM) method. 
Subsequently, using B-CLL, FL and DLBCL tumor samples we have utilised cohort heterozygosity comparisons to localise LOH candidate regions in these subtypes of non-Hodgkin's lymphoma. Detected LOH regions included both previously described regions of LOH as well as novel genomic candidate regions. Conclusions We have proven the efficacy of the use of cohort heterozygosity comparisons for genome-wide mapping of LOH and shown it to be in many ways superior to the HMM method. Additionally, the use of this method to analyse SNP microarray data from 3 common forms of non-Hodgkin's lymphoma yielded interesting tumor suppressor gene candidates, including the ETV3 gene that was highlighted in both B-CLL and FL.
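The core of the cohort heterozygosity comparison can be sketched as follows: compute the heterozygote frequency at each SNP in the case cohort and in the unmatched control cohort, then flag runs of consecutive SNPs where case heterozygosity is consistently reduced. The thresholds, genotype codes, and run length below are assumptions for illustration, not parameters from the paper:

```python
def heterozygosity(genotypes):
    """Fraction of heterozygous ('AB') calls among non-missing ('NC')
    genotypes across a cohort at one SNP."""
    called = [g for g in genotypes if g != "NC"]
    return sum(g == "AB" for g in called) / len(called)

def candidate_loh(case_by_snp, control_by_snp, drop=0.2, min_run=3):
    """Indices of SNPs lying in runs of >= min_run consecutive SNPs
    where case heterozygosity is below control heterozygosity by > drop."""
    reduced = [heterozygosity(ca) < heterozygosity(co) - drop
               for ca, co in zip(case_by_snp, control_by_snp)]
    flagged, run = set(), []
    for i, r in enumerate(reduced + [False]):  # sentinel closes last run
        if r:
            run.append(i)
        else:
            if len(run) >= min_run:
                flagged.update(run)
            run = []
    return sorted(flagged)

# Toy cohorts: 5 SNPs; cases lose heterozygosity at the first three.
controls = [["AB", "AA", "AB", "BB"]] * 5                    # ~50% het
cases = [["AA", "AA", "BB", "AA"]] * 3 + [["AB", "AA", "AB", "BB"]] * 2
print(candidate_loh(cases, controls))                        # -> [0, 1, 2]
```

The published method additionally handles statistical significance of the decrease; this sketch shows only the consistent-decrease criterion.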
Abstract:
BACKGROUND: US Centers for Disease Control guidelines recommend replacement of peripheral intravenous (IV) catheters no more frequently than every 72 to 96 hours. Routine replacement is thought to reduce the risk of phlebitis and bloodstream infection. Catheter insertion is an unpleasant experience for patients and replacement may be unnecessary if the catheter remains functional and there are no signs of inflammation. Costs associated with routine replacement may be considerable. This is an update of a review first published in 2010. OBJECTIVES: To assess the effects of removing peripheral IV catheters when clinically indicated compared with removing and re-siting the catheter routinely. SEARCH METHODS: For this update the Cochrane Peripheral Vascular Diseases (PVD) Group Trials Search Co-ordinator searched the PVD Specialised Register (December 2012) and CENTRAL (2012, Issue 11). We also searched MEDLINE (last searched October 2012) and clinical trials registries. SELECTION CRITERIA: Randomised controlled trials that compared routine removal of peripheral IV catheters with removal only when clinically indicated in hospitalised or community dwelling patients receiving continuous or intermittent infusions. DATA COLLECTION AND ANALYSIS: Two review authors independently assessed trial quality and extracted data. MAIN RESULTS: Seven trials with a total of 4895 patients were included in the review. Catheter-related bloodstream infection (CRBSI) was assessed in five trials (4806 patients). There was no significant between group difference in the CRBSI rate (clinically-indicated 1/2365; routine change 2/2441). The risk ratio (RR) was 0.61 but the confidence interval (CI) was wide, creating uncertainty around the estimate (95% CI 0.08 to 4.68; P = 0.64). No difference in phlebitis rates was found whether catheters were changed according to clinical indications or routinely (clinically-indicated 186/2365; 3-day change 166/2441; RR 1.14, 95% CI 0.93 to 1.39). 
This result was unaffected by whether infusion through the catheter was continuous or intermittent. We also analysed the data by number of device days and again no differences between groups were observed (RR 1.03, 95% CI 0.84 to 1.27; P = 0.75). One trial assessed all-cause bloodstream infection. There was no difference in this outcome between the two groups (clinically-indicated 4/1593 (0.02%); routine change 9/1690 (0.05%); P = 0.21). Cannulation costs were lower by approximately AUD 7.00 in the clinically-indicated group (mean difference (MD) -6.96, 95% CI -9.05 to -4.86; P ≤ 0.00001). AUTHORS' CONCLUSIONS: The review found no evidence to support changing catheters every 72 to 96 hours. Consequently, healthcare organisations may consider changing to a policy whereby catheters are changed only if clinically indicated. This would provide significant cost savings and would spare patients the unnecessary pain of routine re-sites in the absence of clinical indications. To minimise peripheral catheter-related complications, the insertion site should be inspected at each shift change and the catheter removed if signs of inflammation, infiltration, or blockage are present.
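The risk ratios reported in the review follow the standard form below. The counts used here are illustrative only; the review's pooled estimates also weight by trial, so raw totals would not reproduce them exactly:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
    """Risk ratio of group A vs group B with a Wald 95% CI on log(RR)."""
    rr = (events_a / n_a) / (events_b / n_b)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Illustrative counts only (not the review's data):
print(risk_ratio_ci(10, 100, 20, 100))  # -> RR = 0.5, 95% CI ~ (0.25, 1.01)
```

As in the review's CRBSI result, a CI spanning 1.0 means the data are compatible with no difference between the clinically-indicated and routine-change groups.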