193 results for Cholesterol content
in Queensland University of Technology - ePrints Archive
Abstract:
Purpose. To establish a simple and rapid analytical method, based on direct insertion/electron ionization-mass spectrometry (DI/EI-MS), for measuring free cholesterol in tears from humans and rabbits. Methods. A stable-isotope dilution protocol employing DI/EI-MS in selected ion monitoring mode was developed and validated, and used to quantify the free cholesterol content of human and rabbit tear extracts. Tears were collected from adult humans (n = 15) and rabbits (n = 10), and lipids were extracted. Results. Screening, full-scan (m/z 40-600) DI/EI-MS analysis of crude tear extracts showed that the diagnostic ions in the mass range m/z 350 to 400 derived from free cholesterol, with no contribution from cholesterol esters. DI/EI-MS data acquired using selected ion monitoring (SIM) were analyzed for the abundance ratios of diagnostic ions to their stable isotope-labeled analogues arising from the D6-cholesterol internal standard. Standard curves of good linearity were produced, with an on-probe limit of detection of 3 ng (at 3:1 signal to noise) and a limit of quantification of 8 ng (at 10:1 signal to noise). The concentration of free cholesterol in human tears was 15 ± 6 μg/g, which was higher than in rabbit tears (10 ± 5 μg/g). Conclusions. A stable-isotope dilution DI/EI-SIM method for quantifying free cholesterol without prior chromatographic separation was established. Application of this method showed that humans have higher free cholesterol levels in their tears than rabbits, in agreement with previous reports. This paper provides a rapid and reliable method for measuring free cholesterol in small-volume clinical samples. © 2013 The Association for Research in Vision and Ophthalmology, Inc.
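As a rough illustration of the stable-isotope dilution calculation described in this abstract, the sketch below fits a linear standard curve to calibration points and inverts it to convert a measured analyte/internal-standard abundance ratio into an on-probe cholesterol mass. All numbers (calibration masses, ratios, sample ratio) are invented placeholders, not data from the paper.

```python
import numpy as np

# Hypothetical calibration series: known on-probe cholesterol masses (ng) and
# the measured abundance ratio of a cholesterol diagnostic ion to the matching
# D6-cholesterol internal-standard ion. Values are illustrative only.
standard_ng = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
abundance_ratio = np.array([0.11, 0.21, 0.52, 1.04, 2.10])

# Linear standard curve: ratio = slope * mass + intercept
slope, intercept = np.polyfit(standard_ng, abundance_ratio, 1)

def cholesterol_ng(sample_ratio: float) -> float:
    """Invert the standard curve to estimate on-probe free cholesterol (ng)."""
    return (sample_ratio - intercept) / slope

# Example: a tear-extract SIM measurement giving an analyte/IS ratio of 0.33
print(f"Estimated free cholesterol: {cholesterol_ng(0.33):.1f} ng on probe")
```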
Abstract:
Electrospray ionisation tandem mass spectrometry has allowed the unambiguous identification and quantification of individual lens phospholipids in humans and six animal models. Using this approach, ca. 100 unique phospholipids have been characterised. Parallel analysis of the same lens extracts by a novel direct-insertion electron-ionisation technique found the cholesterol content of human lenses to be significantly higher (ca. six-fold) than that of lenses from the other animals. The most abundant phospholipids in all the lenses examined were choline-containing phospholipids. In rat, mouse, sheep, cow, pig and chicken, these were present largely as phosphatidylcholines; in contrast, 66% of the total phospholipid in Homo sapiens was sphingomyelin, the most abundant species being dihydrosphingomyelins, in particular SM(d18:0/16:0) and SM(d18:0/24:1). The abundant glycerophospholipids within human lenses were predominantly phosphatidylethanolamines and phosphatidylserines, with surprisingly high concentrations of ether-linked alkyl chains identified in both classes. This study is the first to identify the phospholipid class (head-group) and assign the constituent fatty acid(s) for each lipid molecule, and to quantify individual lens phospholipids using internal standards. These data clearly indicate marked differences in the membrane lipid composition of the human lens compared with commonly used animal models, and thus predict significant variation in the membrane properties of human lens fibre cells compared with those of other animals. © 2008 Elsevier B.V. All rights reserved.
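To make the class-composition figures concrete (e.g. sphingomyelin as 66% of total human lens phospholipid), here is a minimal sketch of how per-species quantities, once obtained against internal standards, roll up into class percentages. The species names and pmol values are hypothetical placeholders, not the paper's data.

```python
from collections import defaultdict

# Hypothetical quantified species (pmol per lens), keyed by (class, species).
# Real values would come from ESI-MS/MS peak areas scaled by internal standards.
quantified = {
    ("SM", "SM(d18:0/16:0)"): 420.0,
    ("SM", "SM(d18:0/24:1)"): 310.0,
    ("PC", "PC(16:0/18:1)"): 150.0,
    ("PE", "PE(O-18:1/18:1)"): 90.0,
    ("PS", "PS(18:0/18:1)"): 60.0,
}

# Sum species within each head-group class, then express as mol% of the total.
totals = defaultdict(float)
for (lipid_class, _species), pmol in quantified.items():
    totals[lipid_class] += pmol

grand_total = sum(totals.values())
for lipid_class, pmol in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{lipid_class}: {100 * pmol / grand_total:.1f} mol% of phospholipid")
```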
Abstract:
In studies of media industries, too much attention has been paid to providers and firms, and too little to consumers and markets. But with user-created content, the question first posed more than a generation ago by the uses and gratifications approach and taken up by semiotics and the active audience tradition (‘what do audiences do with media?’) has resurfaced with renewed force. What’s new is that where this question (of what the media industries and audiences did with each other) was once posed in individualist and functionalist terms, it can now, with the advent of social networks built on Web 2.0 affordances, be re-posed at the level of systems and populations as well.
Abstract:
Cholesterol-lowering treatment with statins is an important and costly issue; however, its role in stroke has not been well documented. The aim of the present study was to review the literature and current practice regarding cholesterol-lowering treatment for stroke patients. A literature review was conducted on lipids in stroke and their management with both statins and diet, including the cost-effectiveness of medical nutrition therapy. Qualifying criteria and prescription procedures of the Pharmaceutical Benefits Scheme (PBS) were also reviewed, and data on lipid levels and statin prescriptions were analysed for 468 patients admitted to a stroke unit. The literature shows that management with both medication and diet can be effective, especially when combined; however, 60% of patients with an ischaemic event had fasting total cholesterol measures ≥4 mmol/L (n = 231), with only 52% of these prescribed statins on discharge (n = 120). Hypercholesterolaemia is an underdiagnosed and undertreated risk factor within the stroke population. It appears that the PBS has not kept pace with advances in the evidence on statin use in the stroke population, and review is needed. Such a review should address the qualifying criteria for the stroke population and recommendations on referral to dietitians for dietary advice. Cholesterol-lowering treatment for both stroke patients and the wider population is an area that needs awareness raising and review by the PBS, medical practitioners and dietitians. The role of dietary and pharmacological treatments needs to be clearly defined, including adjunct therapy, and the cost-effectiveness of medical nutrition therapy realised.
Abstract:
Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be retrieved within seconds. As more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to determine what information each search engine contains, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of the traditional information retrieval field of collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collections or on estimates of their size, and collection descriptions are often represented as term-occurrence statistics. An automatic ontology-learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined for important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as a set of terms, as commonly occurs in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and a reduction in synonymy. The ontology-based method was compared with ReDDE (the Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation; the comparison used the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo, as well as several specialist search engines such as PubMed and that of the U.S. Department of Agriculture. In conclusion, this research shows that the ontology-based method mitigates the need for collection size estimation.
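A highly simplified sketch of the subject-based representation idea follows: sampled documents from each engine are mapped through a (here, toy) term-to-subject ontology, and engines are ranked by how well their subject profiles cover a query's subjects. The ontology, engine names, and scoring are illustrative assumptions; the thesis's actual taxonomy learning, rule mining, and R-value evaluation against ReDDE are not reproduced.

```python
from collections import Counter

# Toy term-to-subject "ontology"; the real method learns a multilevel taxonomy
# covering hundreds of subjects and mines classification rules from it.
TERM_TO_SUBJECT = {
    "cholesterol": "Biochemistry",
    "statin": "Medicine",
    "ontology": "Information Science",
    "taxonomy": "Information Science",
    "wheat": "Agriculture",
}

def subject_profile(sampled_docs: list[str]) -> Counter:
    """Represent one search engine by the subject distribution of sampled docs."""
    profile = Counter()
    for doc in sampled_docs:
        for term in doc.lower().split():
            subject = TERM_TO_SUBJECT.get(term)
            if subject:
                profile[subject] += 1
    return profile

def rank_engines(query: str, engines: dict[str, Counter]) -> list[str]:
    """Rank engines by how strongly their profiles cover the query's subjects."""
    query_subjects = {TERM_TO_SUBJECT[t] for t in query.lower().split()
                      if t in TERM_TO_SUBJECT}
    return sorted(engines,
                  key=lambda name: sum(engines[name][s] for s in query_subjects),
                  reverse=True)

# Example: two hypothetical engines described by small document samples.
engines = {
    "medsearch": subject_profile(["statin trial", "cholesterol screening"]),
    "agsearch": subject_profile(["wheat yields", "taxonomy of crops"]),
}
print(rank_engines("cholesterol statin", engines))  # medsearch ranks first
```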
Abstract:
There exists a general consensus in the science education literature around the goal of enhancing students' and teachers' views of nature of science (NOS). An emerging area of research in science education explores NOS and argumentation, and the aim of this study was to explore the effectiveness of a science content course incorporating explicit NOS and argumentation instruction on preservice primary teachers' views of NOS. A constructivist perspective guided the study, and the research strategy employed was case study research. Five preservice primary teachers were selected for intensive investigation in the study, which incorporated explicit NOS and argumentation instruction and utilised scientific and socioscientific contexts for argumentation to provide opportunities for participants to apply their NOS understandings to their arguments. Four primary sources of data were used to provide evidence for the interpretations, recommendations, and implications that emerged from the study: questionnaires and surveys, interviews, audio- and video-taped class sessions, and written artefacts. Data analysis involved the formation of various assertions that informed the major findings of the study, and a variety of validity and ethical protocols were observed during the analysis to ensure the findings and interpretations emerging from the data were valid. Results indicated that the science content course was effective in changing four of the five participants' views of NOS. All of the participants expressed predominantly limited views of the majority of the examined NOS aspects at the commencement of the study. Many positive changes were evident at the end of the study, with four of the five participants expressing partially informed and/or informed views of the majority of the examined NOS aspects. A critical analysis of the effectiveness of the various course components designed to facilitate the development of participants' views of NOS led to the identification of three factors that mediated the development of participants' NOS views: (a) contextual factors (including context of argumentation, and mode of argumentation), (b) task-specific factors (including argumentation scaffolds, epistemological probes, and consideration of alternative data and explanations), and (c) personal factors (including perceived previous knowledge about NOS, appreciation of the importance and utility value of NOS, and durability and persistence of pre-existing beliefs). Consideration of these factors informs recommendations for future studies that seek to incorporate explicit NOS and argumentation instruction as a context for learning about NOS.
Abstract:
Archaeology provides a framework of analysis and interpretation that is useful for disentangling the textual layers of a contemporary lived-in urban space. The producers and readers of texts may include those who planned and developed the site and those who now live, visit and work there. Some of the social encounters and content sharing between these people may be artificially produced or manufactured in the hope that certain social situations will occur; others may be serendipitous. Given archaeology's original focus on places that are no longer inhabited, it is often only the remaining artefacts and features of the built environment that form the basis for interpreting the social relationships of past people. Our analysis, however, is framed within a contemporary notion of archaeological artefacts in an urban setting. Unlike an excavation, where the past is revealed by digging into the landscape, the application of landscape archaeology within a present-day urban context is necessarily more experiential and visual, based on recording and analysing the physical traces of social encounters and relationships between residents and visitors. These physical traces are present within the creative content and the built and natural elements of the environment. This chapter explores notions of social encounters and content sharing in an urban village by analysing three different types of texts: the design of the built environment; content produced by residents through a geospatial web application; and print and online media produced in digital storytelling workshops.
Abstract:
An exploratory case study, which seeks to better understand the problem of low participation rates of women in Information Communication Technology (ICT), is currently being conducted in Queensland, Australia. Contextualised within the multimedia and games production sectors of the Digital Content Industry (DCI), the emphasis is on women employed as interactive content creators rather than as users of the technologies. Initial findings provide rich descriptive insights into the perceptions and experiences of female DCI professionals. Influences on participation have emerged from the data, such as existing gender ratios, gender and occupational stereotypes, access into the industry, and future parental responsibilities. Bandura's (1999) Social Cognitive Theory (SCT) is used as a “scaffold” (Walsham, 1995:76) to guide data analysis and assist analytic generalisation of the case study findings. We propose that the lens of human agency, and theories such as SCT, assist in explaining how influences are manifested and how they affect women's agency and ultimately participation in the DCI. The Sphere of Influence conceptual model (Geneve et al., 2008), which emerges from the data and underpinning theory, is proposed as a heuristic framework for further exploring influences on women's participation in the DCI context.