978 results for Holmes, Abiel, 1763-1837.
Abstract:
Glucocorticoid hormones are critical for responding and adapting to stress. Genetic variations in the glucocorticoid receptor (GR) gene alter hypothalamic-pituitary-adrenal (HPA) axis activity and are associated with hypertension and susceptibility to metabolic disease. Here we test the hypothesis that reduced GR density alters blood pressure and glucose and lipid homeostasis and limits adaptation to an obesogenic diet. Heterozygous GRβgeo/+ mice were generated from embryonic stem (ES) cells with a gene-trap integration of a β-galactosidase-neomycin phosphotransferase (βgeo) cassette into the GR gene, creating a transcriptionally inactive GR fusion protein. Although GRβgeo/+ mice have 50% less functional GR, they have normal lipid and glucose homeostasis due to compensatory HPA axis activation, but are hypertensive due to activation of the renin-angiotensin-aldosterone system (RAAS). When challenged with a high-fat diet, weight gain, adiposity, and glucose intolerance increased similarly in control and GRβgeo/+ mice, suggesting preserved control of intermediary metabolism and energy balance. However, whereas a high-fat diet caused HPA activation and increased blood pressure in control mice, these adaptations were attenuated or abolished in GRβgeo/+ mice. Thus, reduced GR density balanced by HPA activation leaves glucocorticoid functions unaffected but mineralocorticoid functions increased, causing hypertension. Importantly, reduced GR limits HPA and blood pressure adaptations to an obesogenic diet.
Abstract:
Functional MRI studies commonly refer to activation patterns as being localized in specific Brodmann areas, referring to Brodmann’s divisions of the human cortex based on cytoarchitectonic boundaries [3]. Typically, the Brodmann areas that match regions in group-averaged functional maps are estimated by eye, leading to inaccurate parcellations and significant error. To avoid this limitation, we developed a method using high-dimensional nonlinear registration to project the Brodmann areas onto individual 3D co-registered structural and functional MRI datasets, using an elastic deformation vector field in the cortical parameter space. Based on a sulcal pattern matching approach [11], a single-subject atlas constructed from N=27 scans (the Colin Holmes atlas [15]), with its associated Brodmann areas labeled on its surface, was deformed to match 3D cortical surface models generated from individual subjects’ structural MRIs (sMRIs). The deformed Brodmann areas were used to quantify and localize functional MRI (fMRI) BOLD activation during performance of the Tower of London task [7].
Abstract:
Our world is literally and figuratively turning to ‘dust’. This work acknowledges decay and renewal and the transitional, cyclical natures of interrelated ecologies. It also suggests advanced levels of degradation potentially beyond reparation. Dust exists both on and beneath the border of our unaided vision. Dust particles are predominantly forms of disintegrating solids that often become the substance or catalyst of future forms. Like many tiny forms, dust is an often unnoticed residue with ‘planet-size consequences’ (Hannah Holmes 2001). The image depicts an ethereal, backlit body, continually circling and morphing, apparently floating, suggesting endless cycles of birth, life and death and inviting differing states of meditation, exploration, stillness and play. This never-ending video work is taken from a large-scale interactive media artwork created during a six-month research residency in England at the Institute of Contemporary Art London and at Vincent Dance Theatre Sheffield in 2006. It was originally presented on a raised floor screen made of pure white sand at the ICA in London. The project involved developing new interaction, engagement and image-making strategies for media arts practice, drawing on the application of both kinetic and proprioceptive dance/performance knowledges. The work was further informed by ecological network theory, which assesses the systemic implications of private and public actions within bounded systems. The creative methodology was primarily practice-led, which fostered the particular qualities of the imagery, generated through cross-fertilising embodied knowledge of Dance and Media Arts. This was achieved through extensive workshopping undertaken in theatres, working ‘on the floor’ live, with dancers, props, sound and projection. ‘And eventually of course, all this dust must settle.’ (Holmes 2001, from the dust jacket)
Holmes, H. 2001, The Secret Life of Dust: From the Cosmos to the Kitchen Counter, the Big Consequences of Little Things, p. 3.
Abstract:
There are three distinct categories of air environment to be considered in this chapter. These are as follows: (1) The “ambient” or general outdoor atmosphere to which members of the population are exposed when they venture out of their homes or offices in industrial, urban or rural environments. (2) Indoor air environments, which occur in buildings such as homes, schools, restaurants, public hospitals and office buildings. This category does not cover factories or workplaces, which are otherwise subject to the provisions of various occupational health standards. (3) Workplace atmospheres, which occur in a variety of industries or factories and for which there are numerous atmospheric concentration limits (or exposure standards) promulgated by appropriate bodies or organisations. Since 2009, the setting of concentration limits for atmospheric contaminants has been administered by Safe Work Australia. A fourth category of air environment, which falls outside this chapter, is that related to upper atmospheric research, global atmospheric effects and concomitant areas of inquiry and/or debate. Such areas include “greenhouse” gas emissions, ozone depletion, and related matters of atmospheric chemistry and physics. This category is not referred to again in this chapter.
Abstract:
Background Stroke incidence has fallen since 1950, but recent trends suggest that stroke incidence may be stabilizing or increasing. We investigated time trends in stroke occurrence and in-hospital morbidity and mortality in the Calgary Health Region. Methods All patients admitted to hospitals in the Calgary Health Region between 1994 and 2002 with a primary discharge diagnosis code (ICD-9 or ICD-10) of stroke were included. In-hospital strokes were also included. Stroke type, date of admission, age, gender, discharge disposition (died, discharged) and in-hospital complications (pneumonia, pulmonary embolism, deep venous thrombosis) were recorded. Poisson and simple linear regression were used to model time trends of occurrence by stroke type and age group and to extrapolate future time trends. Results From 1994 to 2002, 11642 stroke events were observed. Of these, 9879 patients (84.8%) were discharged from hospital, 1763 (15.1%) died in hospital, and 591 (5.1%) developed in-hospital complications of pneumonia, pulmonary embolism or deep venous thrombosis. Both in-hospital mortality and complication rates were highest for hemorrhages. Over the period of study, the rate of stroke admission remained stable. However, the total number of stroke admissions to hospital increased significantly (p=0.012), owing to the combination of increases in intracerebral hemorrhage (p=0.021) and ischemic stroke admissions (p=0.011). Sub-arachnoid hemorrhage rates declined. In-hospital stroke mortality declined overall, due to a decrease in deaths from ischemic stroke, intracerebral hemorrhage and sub-arachnoid hemorrhage. Conclusions Although age-adjusted stroke occurrence rates were stable from 1994 to 2002, this stability is accompanied by both a sharp increase in the absolute number of stroke admissions and a decline in proportional in-hospital mortality.
Further research is needed into changes in stroke severity over time to understand the causes of declining in-hospital stroke mortality rates.
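The Poisson time-trend modelling described in the Methods can be sketched in a few lines. This is a minimal illustration only: the annual counts below are made up (not the Calgary data), and the fit uses a hand-rolled iteratively reweighted least squares (IRLS) routine rather than a statistics package.

```python
# Sketch of Poisson regression for annual event counts: log(mu) = b0 + b1*year,
# fitted by IRLS. Counts are hypothetical, for illustration only.
import math

def poisson_trend(years, counts, iters=25):
    """Fit log(mu) = b0 + b1*x by IRLS; returns (b0, b1)."""
    x = [float(t) for t in years]
    b0, b1 = math.log(sum(counts) / len(counts)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # Working response and weights for the canonical log link.
        z = [(b0 + b1 * xi) + (yi - mi) / mi
             for xi, yi, mi in zip(x, counts, mu)]
        w = mu
        # Weighted least squares with design matrix [1, x].
        sw = sum(w)
        swx = sum(wi * xi for wi, xi in zip(w, x))
        swxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        swz = sum(wi * zi for wi, zi in zip(w, z))
        swxz = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
        det = sw * swxx - swx * swx
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sw * swxz - swx * swz) / det
    return b0, b1

years = list(range(9))  # 1994..2002 coded as 0..8
counts = [1150, 1180, 1215, 1260, 1290, 1330, 1360, 1400, 1457]  # hypothetical
b0, b1 = poisson_trend(years, counts)
print(f"estimated annual growth: {100 * (math.exp(b1) - 1):.1f}%")
```

The fitted slope b1 is on the log scale, so exp(b1) - 1 gives the implied annual percentage change in admissions, which is how a significant trend such as the p=0.012 increase would be summarized.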
Abstract:
Sherlock Holmes faces his greatest challenge since his fight to the death with Professor James Moriarty at the Reichenbach Falls. Who owns Sherlock Holmes, the world’s greatest detective? Is it the estate of Sir Arthur Conan Doyle? Or the mysterious socialite Andrea Plunket? Or does Sherlock Holmes belong to the public? This is the question currently being debated in copyright litigation in the United States courts, raising larger questions about copyright law and the public domain, the ownership of literary characters, and the role of sequels, adaptations, and mash-ups.
Abstract:
The phase relations have been investigated experimentally at 200 and 500 MPa as a function of water activity for one of the least evolved (Indian Batt Rhyolite) and of a more evolved rhyolite composition (Cougar Point Tuff XV) from the 12·8-8·1 Ma Bruneau-Jarbidge eruptive center of the Yellowstone hotspot. Particular priority was given to accurate determination of the water content of the quenched glasses using infrared spectroscopic techniques. Comparison of the composition of natural and experimentally synthesized phases confirms that high temperatures (>900°C) and extremely low melt water contents (<1·5 wt % H₂O) are required to reproduce the natural mineral assemblages. In melts containing 0·5-1·5 wt % H₂O, the liquidus phase is clinopyroxene (excluding Fe-Ti oxides, which are strongly dependent on fO₂), and the liquidus temperature of the more evolved Cougar Point Tuff sample (BJR; 940-1000°C) is at least 30°C lower than that of the Indian Batt Rhyolite lava sample (IBR2; 970-1030°C). For the composition BJR, the comparison of the compositions of the natural and experimental glasses indicates a pre-eruptive temperature of at least 900°C. The composition of clinopyroxene and pigeonite pairs can be reproduced only for water contents below 1·5 wt % H₂O at 900°C, or lower water contents if the temperature is higher. For the composition IBR2, a minimum temperature of 920°C is necessary to reproduce the main phases at 200 and 500 MPa. At 200 MPa, the pre-eruptive water content of the melt is constrained in the range 0·7-1·3 wt % at 950°C and 0·3-1·0 wt % at 1000°C. At 500 MPa, the pre-eruptive temperatures are slightly higher (by 30-50°C) for the same ranges of water concentration. The experimental results are used to explore possible proxies to constrain the depth of magma storage. The crystallization sequence of tectosilicates is strongly dependent on pressure between 200 and 500 MPa. 
In addition, the normative Qtz-Ab-Or contents of glasses quenched from melts coexisting with quartz, sanidine and plagioclase depend on pressure and melt water content, assuming that the normative Qtz and Ab/Or content of such melts is mainly dependent on pressure and water activity, respectively. The combination of results from the phase equilibria and from the composition of glasses indicates that the depth of magma storage for the IBR2 and BJR compositions may be in the range 300-400 MPa (13 km) and 200-300 MPa (10 km), respectively.
Abstract:
The highly complex structure of the human brain is strongly shaped by genetic influences. Subcortical brain regions form circuits with cortical areas to coordinate movement, learning, memory and motivation, and altered circuits can lead to abnormal behaviour and disease. To investigate how common genetic variants affect the structure of these brain regions, here we conduct genome-wide association studies of the volumes of seven subcortical regions and the intracranial volume derived from magnetic resonance images of 30,717 individuals from 50 cohorts. We identify five novel genetic variants influencing the volumes of the putamen and caudate nucleus. We also find stronger evidence for three loci with previously established influences on hippocampal volume and intracranial volume. These variants show specific volumetric effects on brain structures rather than global effects across structures. The strongest effects were found for the putamen, where a novel intergenic locus with replicable influence on volume (rs945270; P = 1.08 × 10⁻³³; 0.52% variance explained) showed evidence of altering the expression of the KTN1 gene in both brain and blood tissue. Variants influencing putamen volume clustered near developmental genes that regulate apoptosis, axon guidance and vesicle transport. Identification of these genetic variants provides insight into the causes of variability in human brain development, and may help to determine mechanisms of neuropsychiatric dysfunction.
Abstract:
The arcuate fasciculus (AF), a white matter tract linking temporal and inferior frontal language cortices, can be disrupted in stroke patients suffering from aphasia. Using diffusion tensor imaging (DTI) tractography it is possible to track AF connections to neural regions associated with either phonological or semantic linguistic processing. The aim of the current study is to investigate the relationship between integrity of white matter microstructure and specific linguistic deficits.
Abstract:
Identifying genetic variants influencing human brain structures may reveal new biological mechanisms underlying cognition and neuropsychiatric illness. The volume of the hippocampus is a biomarker of incipient Alzheimer's disease and is reduced in schizophrenia, major depression and mesial temporal lobe epilepsy. Whereas many brain imaging phenotypes are highly heritable, identifying and replicating genetic influences has been difficult, as small effects and the high costs of magnetic resonance imaging (MRI) have led to underpowered studies. Here we report genome-wide association meta-analyses and replication for mean bilateral hippocampal, total brain and intracranial volumes from a large multinational consortium. The intergenic variant rs7294919 was associated with hippocampal volume (12q24.22; N = 21,151; P = 6.70 × 10⁻¹⁶) and the expression levels of the positional candidate gene TESC in brain tissue. Additionally, rs10784502, located within HMGA2, was associated with intracranial volume (12q14.3; N = 15,782; P = 1.12 × 10⁻¹²). We also identified a suggestive association with total brain volume at rs10494373 within DDR2 (1q23.3; N = 6,500; P = 5.81 × 10⁻⁷).
Abstract:
The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.
Abstract:
Big Datasets are endemic, but they are often notoriously difficult to analyse because of their size, heterogeneity, history and quality. The purpose of this paper is to open a discourse on the use of modern experimental design methods to analyse Big Data in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has wide generality and advantageous inferential and computational properties. In particular, the principled experimental design approach is shown to provide a flexible framework for analysis that, for certain classes of objectives and utility functions, delivers near equivalent answers compared with analyses of the full dataset under a controlled error rate. It can also provide a formalised method for iterative parameter estimation, model checking, identification of data gaps and evaluation of data quality. Finally, it has the potential to add value to other Big Data sampling algorithms, in particular divide-and-conquer strategies, by determining efficient sub-samples.
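The subsampling idea in the final sentences can be illustrated for the simplest case, a straight-line regression, where classical D-optimal design concentrates observations at the extremes of the covariate. Everything below is an illustrative assumption, not material from the paper: the simulated dataset, the model y = 2 + 3x + noise, and the selection rule (take the smallest and largest x values) are all hypothetical.

```python
# Sketch of design-based subsampling: a 100-point subsample chosen at the
# extremes of x (D-optimal style for a straight line) nearly reproduces the
# slope estimated from the full 100,000-point dataset. Data are simulated.
import random

def ols(xs, ys):
    """Ordinary least squares for y = b0 + b1*x; returns (b0, b1)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b1 = sxy / sxx
    return my - b1 * mx, b1

random.seed(1)
N = 100_000
xs = [random.uniform(0, 10) for _ in range(N)]
ys = [2.0 + 3.0 * x + random.gauss(0, 1) for x in xs]

# Fit on the full dataset.
b0_full, b1_full = ols(xs, ys)

# D-optimal-style subsample: the 50 smallest and 50 largest x values.
order = sorted(range(N), key=xs.__getitem__)
idx = order[:50] + order[-50:]
b0_sub, b1_sub = ols([xs[i] for i in idx], [ys[i] for i in idx])

print(f"full-data slope: {b1_full:.3f}, subsample slope: {b1_sub:.3f}")
```

With 0.1% of the data the extreme-point subsample recovers the slope to within sampling error, which is the sense in which a principled design can deliver "near equivalent answers compared with analyses of the full dataset"; for richer models the same logic selects points according to the chosen optimality criterion rather than simply at the extremes.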