Abstract:
Although key to understanding individual variation in task-related brain activation, the genetic contribution to these individual differences remains largely unknown. Here we report voxel-by-voxel genetic model fitting in a large sample of 319 healthy, young adult, human identical and fraternal twins (mean ± SD age, 23.6 ± 1.8 years) who performed an n-back working memory task during functional magnetic resonance imaging (fMRI) at a high magnetic field (4 tesla). Patterns of task-related brain response (BOLD signal difference of 2-back minus 0-back) were significantly heritable, with the highest estimates (40-65%) in the inferior, middle, and superior frontal gyri, left supplementary motor area, precentral and postcentral gyri, middle cingulate cortex, superior medial gyrus, angular gyrus, superior parietal lobule (including the precuneus), and superior occipital gyri. Furthermore, high test-retest reliability for a subsample of 40 twins indicates that nongenetic variance in the fMRI brain response is largely due to unique environmental influences rather than measurement error. Individual variations in activation of the working memory network are therefore significantly influenced by genetic factors. By establishing the heritability of cognitive brain function in a large sample that affords good statistical power, and by using voxel-by-voxel analyses, this study provides the necessary evidence for task-related brain activation to be considered an endophenotype for psychiatric or neurological disorders, and represents a substantial new contribution to the field of neuroimaging genetics. These genetic brain maps should facilitate discovery of gene variants influencing cognitive brain function through genome-wide association studies, potentially opening up new avenues in the treatment of brain disorders.
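The study fits formal genetic models voxel by voxel; as a rough illustration of where such heritability estimates come from, the sketch below applies Falconer's classical approximation h² = 2(r_MZ − r_DZ) to per-voxel twin-pair correlations. This is a simplification: the study's structural-equation (ACE) modelling, and the use of intraclass rather than Pearson correlations, are not reproduced here, and all names are illustrative.

    import numpy as np

    def falconer_h2(mz_pairs, dz_pairs):
        # mz_pairs, dz_pairs: arrays of shape (n_pairs, 2, n_voxels) holding the
        # BOLD contrast (2-back minus 0-back) for each twin of each pair.
        def pair_corr(pairs):
            a, b = pairs[:, 0, :], pairs[:, 1, :]
            a = a - a.mean(axis=0)
            b = b - b.mean(axis=0)
            return (a * b).mean(axis=0) / (a.std(axis=0) * b.std(axis=0))
        # Falconer's approximation: twice the excess of the MZ over the DZ
        # pair correlation, clipped to the admissible [0, 1] range.
        return np.clip(2.0 * (pair_corr(mz_pairs) - pair_corr(dz_pairs)), 0.0, 1.0)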
Abstract:
Modern non-invasive brain imaging technologies, such as diffusion-weighted magnetic resonance imaging (DWI), enable the mapping of neural fiber tracts in the white matter, providing a basis to reconstruct a detailed map of brain structural connectivity networks. Brain connectivity networks differ from random networks in their topology, which can be measured using small-worldness, modularity, and high-degree nodes (hubs). Still, little is known about how individual differences in structural brain network properties relate to age, sex, or genetic differences. Recently, some groups have reported brain network biomarkers that enable differentiation among individuals, pairs of individuals, and groups of individuals. In addition to studying new topological features, here we provide a unifying general method to investigate topological brain networks and connectivity differences at each of these levels of the data hierarchy (individuals, pairs, and groups), while appropriately controlling false discovery rate (FDR) errors. We apply our new method to a large dataset of high-quality brain connectivity networks obtained from High Angular Resolution Diffusion Imaging (HARDI) tractography in 303 young adult twins, siblings, and unrelated people. Our proposed approach can accurately classify brain connectivity networks based on sex (93% accuracy) and kinship (88.5% accuracy). We find statistically significant differences associated with sex and kinship both in the brain connectivity networks and in derived topological metrics, such as the clustering coefficient and the communicability matrix.
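For reference, the two derived metrics named at the end of the abstract have standard definitions: the per-node clustering coefficient measures how many of a node's neighbour pairs are themselves connected, and the communicability matrix (Estrada & Hatano) is the matrix exponential of the adjacency matrix. A minimal sketch of computing both from a symmetric connectivity matrix, using standard library routines rather than the authors' own pipeline:

    import numpy as np
    import networkx as nx
    from scipy.linalg import expm

    def topology_features(adj):
        # adj: symmetric (n_nodes x n_nodes) binary connectivity matrix.
        G = nx.from_numpy_array(adj)
        # Per-node clustering coefficient: fraction of each node's neighbour
        # pairs that are themselves connected (nodes are ordered 0..n-1).
        clustering = np.array(list(nx.clustering(G).values()))
        # Communicability matrix: exp(A) sums walks of all lengths between
        # every pair of nodes, down-weighting length-k walks by 1/k!.
        communicability = expm(adj.astype(float))
        return clustering, communicability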
Abstract:
Brain connectivity analyses are increasingly popular for investigating brain organization. Path length, one of the most widely used connectivity measures, is generally defined as the number of nodes traversed to connect one node in a graph to another. Despite its name, path length is purely topological and does not take into account the physical length of the connections. The physical distance of the trajectory may also be highly relevant, but it is typically overlooked in connectivity analyses. Here we combined genotyping, anatomical MRI and HARDI, with whole-brain tractography, to understand how genes influence cortical connections. We defined a new measure, based on Dijkstra's algorithm, to compute path lengths for tracts connecting pairs of cortical regions. We compiled these measures into matrices whose elements represent the physical distance traveled along tracts. We then analyzed a large cohort of healthy twins and showed that our path length measure is reliable, heritable, and influenced, even in young adults, by the Alzheimer's risk gene CLU.
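A minimal sketch of the kind of computation the abstract describes: running Dijkstra's algorithm over a region-by-region graph whose edge weights are physical tract lengths in millimetres, so that the resulting matrix holds distance travelled along tracts rather than hop counts. The edge-weight convention is an assumption; the paper's exact construction may differ.

    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import dijkstra

    def physical_path_lengths(tract_mm):
        # tract_mm[i, j]: measured streamline length (mm) of the direct tract
        # between cortical regions i and j; 0 where no tract was reconstructed.
        graph = csr_matrix(tract_mm)
        # Dijkstra over millimetre edge weights yields, for every region pair,
        # the shortest physical distance along reconstructed tracts.
        return dijkstra(graph, directed=False)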
Abstract:
Genetic analysis of diffusion tensor images (DTI) shows great promise in revealing specific genetic variants that affect brain integrity and connectivity. Most genetic studies of DTI analyze voxel-based diffusivity indices in the image space (such as 3D maps of fractional anisotropy) and overlook tract geometry. Here we propose an automated workflow to cluster fibers using a white matter probabilistic atlas and perform genetic analysis on the shape characteristics of fiber tracts. We apply our approach to a large study of 4-Tesla high angular resolution diffusion imaging (HARDI) data from 198 healthy, young adult twins (age: 20-30 years). Illustrative results show heritability for the shapes of several major tracts, visualized as color-coded maps.
Abstract:
Automatic labeling of white matter fibres in diffusion-weighted brain MRI is vital for comparing brain integrity and connectivity across populations, but it is challenging. Whole-brain tractography generates a vast set of fibres throughout the brain, but it is hard to cluster them into anatomically meaningful tracts, due to wide individual variations in the trajectory and shape of white matter pathways. We propose a novel automatic tract labeling algorithm that fuses information from tractography and multiple hand-labeled fibre tract atlases. As streamline tractography can generate a large number of false-positive fibres, we developed a top-down approach to extract tracts consistent with known anatomy, based on a distance metric to multiple hand-labeled atlases. Clustering results from different atlases were fused using a multi-stage fusion scheme. Our "label fusion" method reliably extracted the major tracts from 105-gradient HARDI scans of 100 young normal adults.
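One plausible reading of the distance-based labeling and fusion steps is sketched below: each fibre is assigned the label of its closest tract in each atlas, using the symmetric mean closest-point distance between streamlines, and the per-atlas labels are then fused by majority vote. The published multi-stage fusion scheme is more elaborate; this sketch, and all names in it, are illustrative.

    import numpy as np

    def label_fibre(fibre, atlases):
        # fibre: (n_points, 3) streamline coordinates.
        # atlases: list of dicts mapping tract label -> list of atlas streamlines.
        def mean_closest_point(a, b):
            # Symmetric mean closest-point distance between two streamlines.
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
            return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

        votes = []
        for atlas in atlases:
            # Per-atlas label: tract whose exemplars are closest to the fibre.
            dists = {lab: min(mean_closest_point(fibre, t) for t in tracts)
                     for lab, tracts in atlas.items()}
            votes.append(min(dists, key=dists.get))
        # Fuse per-atlas labels by majority vote.
        return max(set(votes), key=votes.count)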
Abstract:
Combining datasets across independent studies can boost statistical power by increasing the number of observations, and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analyses of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time, to understand the variability of the estimates. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability.
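The two meta-analytical weightings named in the abstract correspond, in their standard fixed-effect form, to pooling per-cohort heritability estimates with different weights; the consortium's exact estimators may differ in detail:

\[
\hat h^2_{\text{meta}} \;=\; \frac{\sum_i w_i \,\hat h^2_i}{\sum_i w_i},
\qquad
w_i = n_i \ \text{(sample-size weighted)}
\quad\text{or}\quad
w_i = 1/\mathrm{SE}_i^2 \ \text{(standard-error weighted)},
\]

where \(\hat h^2_i\), \(n_i\) and \(\mathrm{SE}_i\) are the heritability estimate, sample size and standard error in cohort \(i\).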
Abstract:
We demonstrate a geometrically inspired technique for computing Evans functions for operators obtained by linearising about travelling waves. Using the examples of the F-KPP equation and a Keller–Segel model of bacterial chemotaxis, we produce an Evans function which is computable through several orders of magnitude in the spectral parameter, and we show how such a function can naturally be extended into the continuous spectrum. In both examples, we use this function to numerically verify the absence of eigenvalues in a large region of the right half of the spectral plane. We also include a new proof of spectral stability, in the appropriate weighted space, of travelling waves of speed c ≥ √(2δ) in the F-KPP equation.
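For orientation, the standard construction behind such computations is as follows (the paper's specific weighted spaces and normalisations may differ): for a travelling wave \(\hat u(\xi)\), \(\xi = x - ct\), of a reaction-diffusion equation \(u_t = \delta u_{xx} + f(u)\), the linearised eigenvalue problem is written as a first-order system, and the Evans function is a Wronskian-type determinant of its decaying solutions:

\[
\delta p'' + c\,p' + f'(\hat u)\,p = \lambda p
\;\;\Longleftrightarrow\;\;
Y' = A(\xi;\lambda)\,Y,
\qquad
D(\lambda) = \det\bigl(Y^-(\xi;\lambda)\;\big|\;Y^+(\xi;\lambda)\bigr),
\]

where \(Y^-\) and \(Y^+\) decay as \(\xi \to -\infty\) and \(\xi \to +\infty\) respectively. \(D(\lambda)\) is analytic to the right of the essential spectrum and vanishes exactly at eigenvalues of the linearised operator.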
Abstract:
This paper presents a novel vision-based underwater robotic system for the identification and control of crown-of-thorns starfish (COTS) in coral reef environments. COTS have been identified as one of the most significant threats to Australia's Great Barrier Reef. These starfish eat coral, impacting large areas of reef and the marine ecosystem that depends on it. Evidence suggests that land-based nutrient runoff has accelerated recent outbreaks of COTS, requiring extensive use of divers to manually inject biological agents into the starfish in an attempt to control population numbers. Facilitating this control program using robotics is the goal of our research. In this paper we introduce a vision-based COTS detection and tracking system based on a Random Forest Classifier (RFC) trained on images from underwater footage. To track COTS with a moving camera, we embed the RFC in a particle filter detector and tracker, where the predicted class probability of the RFC is used as an observation probability to weight the particles, and we use sparse optical flow estimation for the prediction step of the filter. The system is experimentally evaluated in a realistic laboratory setup, using a robotic arm that moves a camera at different speeds and heights over a range of real-size images of COTS in a reef environment.
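A minimal sketch of one predict/update/resample cycle of the tracker as the abstract describes it: particles are propagated by a sparse optical-flow motion model and re-weighted by the Random Forest's predicted class probability. The callables rfc_prob and flow stand in for the trained classifier and the flow estimator; they are illustrative, not the authors' API.

    import numpy as np

    def particle_filter_step(particles, frame, prev_frame, rfc_prob, flow):
        # particles: (N, 2) candidate COTS locations in pixel coordinates.
        # Predict: shift each particle by the local sparse optical-flow
        # estimate, plus diffusion noise.
        particles = particles + flow(prev_frame, frame, particles)
        particles = particles + np.random.normal(scale=5.0, size=particles.shape)
        # Update: the RFC's predicted class probability at each particle
        # serves directly as the observation likelihood.
        weights = np.array([rfc_prob(frame, p) for p in particles]) + 1e-12
        weights = weights / weights.sum()
        # Resample: multinomial resampling concentrates particles on the target.
        idx = np.random.choice(len(particles), size=len(particles), p=weights)
        return particles[idx]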
Abstract:
Purpose: The purpose of this paper is to explore the contribution of global business services to improved productivity and economic growth of the world economy, which has gone largely unnoticed in service research.
Design/methodology/approach: The authors draw on macroeconomic data and industry reports, and link them to the non-ownership concept in service research and theories of the firm.
Findings: Business services explain a large share of the growth of the global service economy. The fast growth of business services coincides with shifts from domestic production towards global outsourcing of services. A new wave of global business services is traded across borders and has emerged as an important driver of growth in the world's service sector.
Research limitations/implications: This paper advances the understanding of non-ownership services in an increasingly global and specialized post-industrial economy. The paper makes a conceptual contribution supported by descriptive data, but without empirical testing.
Originality/value: The authors integrate the non-ownership concept and three related economic theories of the firm to explain the role of global business services in driving business performance and the international transformation of service economies.
Abstract:
Nutrition plays an important role in the development of all organisms, and in particular that of farmed aquatic species, where costs associated with feed can often exceed 60% of total production costs. Crustacean species, in addition, have the added metabolic requirement of regular moulting to allow normal growth, and this requires large amounts of energy in the form of sugars (glucose). The current study explored the capacity of the giant freshwater prawn (GFP), Macrobrachium rosenbergii, to produce endogenous cellulose-degrading enzymes capable of extracting nutrients (simple sugars) from plant sources in the formulated feeds used in the prawn aquaculture industry. We identified in the target organism a putative cellulase cDNA fragment, 1576 base pairs in length and of non-microbial origin, whose protein model exhibited a TM-score of 0.916 against a cellulase described from another crustacean species. The functional role of cellulase enzymes is to hydrolyse cellulose to glucose, and the fragment identified in GFP was highly expressed in the hepatopancreas, the site of primary food digestion and absorption in crustaceans. Hepatopancreatic tissue from Macrobrachium rosenbergii also showed active digestion of cellulose to glucose in an endoglucanase assay. Cellulase genes are present in the genomes of many invertebrate taxa and play an active role in the conversion of cellulose to available energy. Identification and characterization of endogenous cellulase genes in the giant freshwater prawn can assist development of the culture industry, because the findings confirm that potentially greater levels of low-cost plant material could be included in artificial formulated diets in the future without necessarily compromising individual growth performance. Ultimately, this development may contribute to more efficient, cost-effective production systems for freshwater prawn culture stocks that meet the animal's basic nutritional requirements while supporting good individual growth rates.
Abstract:
Invasive non-native plants have negatively impacted biodiversity and ecosystem functions world-wide. Because of the large number of species, their wide distributions and varying degrees of impact, we need a more effective method for prioritizing control strategies for cost-effective investment across heterogeneous landscapes. Here, we develop a prioritization framework that synthesizes scientific data, elicits knowledge from experts and stakeholders to identify control strategies, and appraises the cost-effectiveness of strategies. Our objective was to identify the most cost-effective strategies for reducing the total area dominated by high-impact non-native plants in the Lake Eyre Basin (LEB). We use a case study of the ~120 million ha LEB, which comprises some of the most distinctive Australian landscapes, including Uluru-Kata Tjuta National Park. More than 240 non-native plant species are recorded in the LEB, with many predicted to spread, but there are insufficient resources to control all species. LEB experts identified 12 strategies to control, contain or eradicate non-native species over the next 50 years. The total cost of the proposed strategies was estimated at AU$1.7 billion, an average of AU$34 million annually. Implementation of these strategies is estimated to reduce non-native plant dominance by 17 million ha: a 32% reduction in the likely area dominated by non-native plants within 50 years. The three most cost-effective strategies were controlling Parkinsonia aculeata, Ziziphus mauritiana and Prosopis spp. These three strategies combined were estimated to cost only 0.01% of the total cost of all the strategies, but would provide 20% of the total benefits. Over 50 years, cost-effective spending of AU$2.3 million could eradicate all non-native plant species from the only threatened ecological community within the LEB, the Great Artesian Basin discharge springs. Synthesis and applications: Our framework, based on a case study of the ~120 million ha Lake Eyre Basin in Australia, provides a rationale for financially efficient investment in non-native plant management and reveals combinations of strategies that are optimal for different budgets. It also highlights knowledge gaps and incidental findings that could improve effective management of non-native plants, for example addressing the reliability of species distribution data and the prevalence of information sharing across states and regions.
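The appraisal described above reduces, at its core, to ranking strategies by expected benefit per dollar; the three top strategies deliver roughly 20% of the total benefit for 0.01% of the total cost. A minimal, hypothetical sketch of that ranking step (the full framework also handles feasibility and expert uncertainty, which are omitted here, and the figures in the usage example are made up):

    def rank_by_cost_effectiveness(strategies):
        # strategies: list of (name, cost_aud, benefit_ha) tuples, where
        # benefit_ha is the expected reduction in area (ha) dominated by
        # non-native plants if the strategy is fully implemented.
        return sorted(strategies, key=lambda s: s[2] / s[1], reverse=True)

    # Illustrative, made-up figures only:
    ranked = rank_by_cost_effectiveness([
        ("Parkinsonia aculeata control", 60_000, 900_000),
        ("Basin-wide buffel grass control", 400_000_000, 2_000_000),
    ])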
Abstract:
Different human activities, such as combustion of fossil fuels, biomass burning, and industrial and agricultural activities, emit a large amount of particulates into the atmosphere. As a consequence, the air we inhale contains a significant amount of suspended particles, including organic and inorganic solids and liquids as well as various microorganisms, which are responsible for a number of pulmonary diseases. Developing a numerical model for transport and deposition of foreign particles in a realistic lung geometry is very challenging due to the complex geometrical structure of the human lung. In this study, we numerically investigated airborne particle transport and deposition on the human lung surface. To obtain appropriate results for particle transport and deposition, we generated a realistic lung geometry from a CT scan obtained from a local hospital. For a more accurate approach, we also created a mucus layer inside the geometry, adjacent to the lung surface, and assigned appropriate mucus-layer properties to the wall surface. The Lagrangian particle tracking technique is employed in the ANSYS FLUENT solver to simulate steady-state inspiratory flow. Various injection techniques were used to release the foreign particles through the inlet of the geometry. To investigate the effects of particle size on deposition, numerical calculations were carried out for particles ranging in size from 1 micron to 10 microns. The numerical results show that the particle deposition pattern depends strongly on the particle's initial position, and that in the realistic geometry most of the particles deposit on the rough wall surface of the lung rather than in the carinal region.
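For context, Lagrangian discrete-phase solvers of the kind used here integrate a force balance for each tracked particle; in its standard drag-plus-gravity form (as implemented, for example, in FLUENT's discrete phase model) it reads:

\[
\frac{d\mathbf{u}_p}{dt} \;=\; F_D\,(\mathbf{u} - \mathbf{u}_p) \;+\; \frac{\mathbf{g}\,(\rho_p - \rho)}{\rho_p},
\qquad
F_D = \frac{18\mu}{\rho_p d_p^2}\,\frac{C_D\,\mathrm{Re}_p}{24},
\]

where \(\mathbf{u}\) is the local air velocity; \(\mathbf{u}_p\), \(\rho_p\) and \(d_p\) are the particle velocity, density and diameter; \(\mu\) and \(\rho\) are the air viscosity and density; \(C_D\) is the drag coefficient; and \(\mathrm{Re}_p\) is the particle Reynolds number. Which additional forces (e.g. Brownian or Saffman lift, relevant near 1 micron) were enabled in this study is not stated in the abstract.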
Abstract:
In a three-day trial in April 2008, the United States District Court for the Southern District of New York considered whether the Harry Potter Lexicon infringed the intellectual property rights of J.K. Rowling and Warner Brothers. The case has attracted great media attention. As John Crace, a reporter for The Guardian, observed: “On one side: global-celebrity author J.K. Rowling. On the other: an amateur fan site devoted to the world's favourite boy wizard. At stake: the soul of Harry Potter.” J.K. Rowling is the author of the seven-book Harry Potter series, which tells the story of a young wizard, Harry Potter, and his battles with Voldemort, the Lord of Darkness. As the court papers noted, “The Harry Potter Books are a modern day publishing phenomenon and success story.” Warner Brothers sought and obtained the film rights to the series. The entertainment company has thus far produced five films; a sixth is due in November 2008, and the final instalment is planned. The Harry Potter Lexicon is a reference guide created by Steven Vander Ark, a former grade school teacher. He has organised a large volume of material on the Harry Potter books and the Harry Potter films on a website in an alphabetical listing from A to Z. The founder of RDR Books, Roger Rapoport, approached Vander Ark to publish the Harry Potter Lexicon in book form. Vander Ark agreed to this request, and provided the publisher with a condensed version of the website. After RDR Books announced its intention to publish the reference book, J.K. Rowling and Warner Brothers brought a legal action in the United States District Court for the Southern District of New York, alleging that the publishers of the Harry Potter Lexicon were in breach of various intellectual property rights. A spokesperson for Warner Brothers and J.K. Rowling observed: “A fan’s affectionate enthusiasm should not obscure acts of plagiarism. The publishers knew what they were doing. The problem remains that the Lexicon takes an enormous amount of Ms. Rowling’s work and adds virtually no original commentary of its own. As we’ve said in court, it takes too much and adds too little. Authors have a duty to prevent the exploitation of their works by people who contribute nothing original, creative or interpretive.” The litigation involves the intersection of copyright law, trade mark law, and consumer protection law. It has a wider significance because it deals with the protection of authorial rights; the use of literary indexes, supplements and reference guides; and the clash between character merchandising and fan fiction.
Abstract:
The importance of developing effective disaster management strategies has grown significantly as the world continues to be confronted with unprecedented disastrous events. Factors such as climate instability and recent urbanization, along with rapid population growth in many cities around the world, have exacerbated the risks of potential disasters, leaving large numbers of people and much infrastructure exposed to new forms of threat from natural disasters such as flooding, cyclones, and earthquakes. With disasters on the rise, effective recovery planning of the built environment is becoming imperative: it is not only closely related to the well-being and essential functioning of society, but it also requires significant financial commitment. In the built environment context, post-disaster reconstruction focuses essentially on the repair and reconstruction of physical infrastructure. The reconstruction and rehabilitation efforts are generally performed through collaborative partnerships that involve multiple organisations, enabling the restoration of the interdependencies that exist between infrastructure systems such as energy, water (including wastewater), transport, and telecommunication systems. These interdependencies are major determinants of the vulnerabilities and risks encountered by critical infrastructure, and therefore have significant implications for post-disaster recovery. When disrupted by natural disasters, such interdependencies have the potential to propagate failures between critical infrastructures at various levels, and thus can have dire consequences for reconstruction activities. This paper outlines the results of a pilot study on how elements of infrastructure interdependency can impede the post-disaster recovery effort. In unstructured interviews, seven respondents argued that during post-disaster recovery, critical infrastructures are mutually dependent on each other's uninterrupted availability, both physically and through a host of information and communication technologies. Major disruption to their physical and cyber interdependencies could lead to cascading failures, which could delay the recovery effort. The existing interrelationships between critical infrastructures therefore require that the entire interconnected network be considered when managing reconstruction activities during the post-disaster recovery period.
Abstract:
It has been 10 years since the seminal paper by Morrison and colleagues reporting the association of alleles of the vitamin D receptor with bone density [1], a paper which arguably kick-started the study of osteoporosis genetics. Since that report there have been literally thousands of osteoporosis genetics studies published, and large numbers of genes have been reported to be associated with the condition [2]. Although some of these reported associations are undoubtedly true, this snowstorm of papers and abstracts has clouded the field to such an extent that it is very difficult to be certain of the veracity of most genetic associations reported hitherto. The field needs to take stock and reconsider the best way forward, taking into account the biology of skeletal development and technological and statistical advances in human genetics, before more effort and money are wasted on continuing a process whose primary achievement could be said to be a massive paper mountain. I propose in this review that the primary reasons for the paucity of success in osteoporosis genetics have been:
• the absence of a major gene effect on bone mineral density (BMD), the most commonly studied bone phenotype;
• failure to consider issues such as genetic heterogeneity, gene–environment interaction, and gene–gene interaction;
• small sample sizes and over-optimistic data interpretation; and
• incomplete assessment of the genetic variation in the candidate genes studied.