802 results for High content analysis
Abstract:
Aflatoxin B1 (AFB1), ochratoxin A (OTA) and fumonisin B1 (FB1) are important mycotoxins in terms of human exposure via food, their toxicity and the regulatory limits that exist for them worldwide. Mixtures of toxins are frequently present in foods; however, because of the difficulty of determining their combined toxicity, legal exposure limits are set for single compounds, based on long-standing toxicological techniques. High content analysis (HCA) may be a useful tool to determine the total toxicity of complex mixtures of mycotoxins. Endpoints including cell number (CN), nuclear intensity (NI), nuclear area (NA), plasma membrane permeability (PMP), mitochondrial membrane potential (MMP) and mitochondrial mass (MM) were compared to the conventional 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) and neutral red (NR) endpoints in MDBK cells. Individual concentrations of each mycotoxin (OTA 3 mg/ml, FB1 8 mg/ml and AFB1 1.28 mg/ml) revealed no cytotoxicity with MTT or NR, but HCA showed significant cytotoxic effects of up to 41.6% (p < 0.001) and 10.1% (p < 0.05) for OTA and AFB1, respectively. For the tertiary mixture (OTA 3 mg/ml, FB1 8 mg/ml and AFB1 1.28 mg/ml), HCA detected up to 37.3% and 49.8% more cytotoxicity than MTT and NR, respectively, while binary combinations of OTA (3 mg/ml) and FB1 (8 mg/ml) revealed synergistic interactions on the HCA MMP, MM and NI endpoints that were not detected by MTT or NR. HCA is a novel and sensitive tool that could substantially help determine future regulatory limits for single and combined toxins present in food, ensuring that legislation is based on the true risks of human exposure.
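As a purely illustrative sketch (not the study's own analysis pipeline), cytotoxicity percentages of the kind quoted above can be obtained by normalising an endpoint readout to the vehicle control and testing the difference for significance; the replicate values below are invented:

import numpy as np
from scipy.stats import ttest_ind

def percent_cytotoxicity(treated, control):
    # Percentage reduction of an endpoint (e.g. cell number or MTT signal)
    # relative to the vehicle control.
    return 100.0 * (1.0 - np.mean(treated) / np.mean(control))

# Hypothetical replicate readouts for one endpoint (arbitrary units).
control = np.array([1.00, 0.98, 1.03, 1.01])
treated = np.array([0.61, 0.58, 0.60, 0.57])

effect = percent_cytotoxicity(treated, control)
t_stat, p_value = ttest_ind(treated, control)
print(f"cytotoxicity {effect:.1f}% (p = {p_value:.4f})")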
Abstract:
Persistent organic pollutants (POPs) are toxic substances that are highly resistant to environmental degradation, can bioaccumulate and have long-range atmospheric transport potential. Most studies focus on single-compound effects; however, as humans are exposed to several POPs simultaneously, investigating the effects of exposure to real-life POP mixtures on human health is necessary. A defined mixture of POPs was used, in which the concentration of each compound reflected its contribution to the levels seen in Scandinavian human serum (total mix). Several sub-mixtures representing different classes of POP were also constructed: the perfluorinated (PFC) mixture contained six perfluorinated compounds; the brominated (Br) mixture contained seven brominated compounds; and the chlorinated (Cl) mixture contained polychlorinated biphenyls as well as p,p'-dichlorodiphenyldichloroethylene, hexachlorobenzene, three chlordanes, three hexachlorocyclohexanes and dieldrin. Human hepatocarcinoma (HepG2) cells were exposed to the seven mixtures for 2 h and 48 h and analysed on a CellInsight™ NXT High Content Screening platform. Multiple cytotoxic endpoints were investigated: cell number, nuclear intensity and area, mitochondrial mass, mitochondrial membrane potential (MMP) and reactive oxygen species (ROS). Both the Br and Cl mixtures induced ROS production but did not lead to apoptosis. The PFC mixture induced ROS production and likely induced apoptosis accompanied by the dissipation of MMP. Synergistic effects on ROS induction were evident when cells were exposed to the PFC+Br mixture. No significant effects were detected for the Br+Cl, PFC+Cl or total mixtures, which contain the same concentrations of chlorinated compounds as the Cl mixture plus additional compounds, highlighting the need for further exploration of POP mixtures in risk assessment.
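The synergy claim above implies comparing the observed mixture response with an expected additive response. The abstract does not state which additivity model was used, so the sketch below uses Bliss independence purely as one common choice, with invented numbers:

def bliss_expected(f_a, f_b):
    # Expected fractional effect of a binary mixture under Bliss independence.
    return f_a + f_b - f_a * f_b

# Hypothetical fractional ROS-induction responses (0 = no effect, 1 = maximal).
f_pfc, f_br = 0.20, 0.15
expected = bliss_expected(f_pfc, f_br)   # 0.32
observed = 0.55                          # hypothetical PFC+Br response
verdict = "greater than additive (synergy)" if observed > expected else "within additive range"
print(f"expected {expected:.2f}, observed {observed:.2f}: {verdict}")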
Abstract:
Bone marrow mesenchymal stem cells (MSCs) promote nerve growth and functional recovery in animal models of spinal cord injury (SCI) to varying degrees. The authors have tested high-content screening to examine the effects of MSC-conditioned medium (MSC-CM) on neurite outgrowth from the human neuroblastoma cell line SH-SY5Y and from explants of chick dorsal root ganglia (DRG). These analyses were compared with previously published methods that involved hand-tracing individual neurites. Both methods demonstrated that MSC-CM promoted neurite outgrowth: each showed that the proportion of SH-SY5Y cells with neurites increased by ~200% in MSC-CM within 48 h and that the number of neurites per SH-SY5Y cell was significantly increased in MSC-CM compared with control medium. For high-content screening, the analysis was performed within minutes, testing multiple samples of MSC-CM and in each case measuring >15,000 SH-SY5Y cells. In contrast, manual measurement of neurite outgrowth from >200 SH-SY5Y cells in a single sample of MSC-CM took at least 1 h. High-content analysis provided additional measures of increased neurite branching in MSC-CM compared with control medium. MSC-CM was also found to stimulate neurite outgrowth in DRG explants using either method, although the high-content analysis was less well optimized for measuring neurite outgrowth from DRG explants than from SH-SY5Y cells.
Abstract:
Drug-induced liver injury is one of the most frequent reasons for removing a drug from the market. In recent years there has been pressure to develop more cost-efficient, faster and easier ways to investigate drug-induced toxicity, in order to recognize hepatotoxic drugs in the earlier phases of drug development. A High Content Screening (HCS) instrument is an automated microscope equipped with image-analysis software. It makes image analysis faster and, by always analyzing the images in the same way, reduces the risk of human error. Because less drug and analysis time are needed and multiple parameters can be analyzed from the same cells, the method should be more sensitive, effective and cheaper than conventional cytotoxicity assays. Liver cells are rich in mitochondria, and many drugs direct their toxicity to hepatocyte mitochondria. Mitochondria produce the majority of the cell's ATP through oxidative phosphorylation, maintain biochemical homeostasis in the cell and participate in cell death. A mitochondrion is divided into two compartments by the inner and outer mitochondrial membranes, and oxidative phosphorylation takes place at the inner mitochondrial membrane. Cytochrome c, a protein of the respiratory chain, activates caspase cascades when released, which leads to apoptosis. The aim of this study was to implement, optimize and compare mitochondrial-toxicity HCS assays in live and fixed cells in two cellular models: the human HepG2 hepatoma cell line and rat primary hepatocytes. Three hepato- and mitochondria-toxic drugs (staurosporine, rotenone and tolcapone) were used. Cells were treated with the drugs and incubated with the fluorescent probes, and the images were then analyzed using a Cellomics ArrayScan VTI reader. Finally, the results obtained with the optimized methods were compared to each other and to the results of conventional cytotoxicity assays, ATP and LDH measurements. After optimization, the live-cell method and rat primary hepatocytes were selected for the experiments. Staurosporine was the most toxic of the three drugs and damaged the cells most quickly. Rotenone was less toxic, but its results were more reproducible, so it would serve as a good positive control in screening. Tolcapone was the least toxic. So far, the conventional cytotoxicity assays performed better than the HCS methods; more optimization is needed to make the HCS method more sensitive, which was not possible in this study owing to time constraints.
Abstract:
The vertical distribution of cloud cover has a significant impact on a large number of meteorological and climatic processes, so cloud top altitude and cloud geometrical thickness are essential parameters. Previous studies established the possibility of retrieving those parameters from multi-angular oxygen A-band measurements. Here we study and compare the expected performance of future instruments. The 3MI (Multi-angle, Multi-channel and Multi-polarization Imager) instrument developed by EUMETSAT, which is an extension of the POLDER/PARASOL instrument, and MSPI (Multi-angle Spectro-Polarimetric Imager), developed by NASA's Jet Propulsion Laboratory, will measure total and polarized light reflected by the Earth's atmosphere–surface system in several spectral bands (from UV to SWIR) and several viewing geometries. These instruments should provide opportunities to observe the links between cloud structures and the anisotropy of the solar radiation reflected into space. Specific algorithms will need to be developed to take advantage of the new capabilities of these instruments. Prior to this effort, however, we need to understand, through a theoretical Shannon information content analysis, the limits and advantages of these new instruments for retrieving liquid and ice cloud properties and, in particular in this study, the amount of information the A-band channel carries on cloud top altitude (CTOP) and cloud geometrical thickness (CGT). We compare the information content of the 3MI A-band in two configurations and that of MSPI. Quantitative information content estimates show that the retrieval of CTOP with high accuracy is possible in almost all cases investigated. The retrieval of CGT appears less easy but possible for optically thick clouds above a black surface, at least when CGT > 1–2 km.
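The theoretical Shannon information content analysis mentioned above is conventionally carried out in the optimal-estimation framework (e.g. Rodgers, 2000). The expressions below give that general textbook form, with the retrieved state here being (CTOP, CGT); they are not equations quoted from this study:

\hat{S} = \left( K^{T} S_{\epsilon}^{-1} K + S_{a}^{-1} \right)^{-1}

H = \tfrac{1}{2} \log_{2} \frac{\lvert S_{a} \rvert}{\lvert \hat{S} \rvert}

where K is the Jacobian of the forward model with respect to the retrieved cloud parameters, S_{\epsilon} the measurement-error covariance, S_{a} the a priori covariance, and \hat{S} the posterior covariance; H (in bits) quantifies how much the measurements shrink the prior uncertainty on CTOP and CGT.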
Abstract:
BACKGROUND: High intercoder reliability (ICR) is required in qualitative content analysis for assuring quality when more than one coder is involved in data analysis. The literature lacks standardized procedures for ICR assessment in qualitative content analysis. OBJECTIVE: To illustrate how ICR assessment can be used to improve codings in qualitative content analysis. METHODS: Key steps of the procedure are presented, drawing on data from a qualitative study on patients' perspectives on low back pain. RESULTS: First, a coding scheme was developed using a comprehensive inductive and deductive approach. Second, 10 transcripts were coded independently by two researchers, and ICR was calculated. The resulting kappa value of .67 can be regarded as satisfactory to solid. Moreover, varying agreement rates helped to identify problems in the coding scheme; low agreement rates, for instance, indicated that the respective codes were defined too broadly and needed clarification. In a third step, the results of the analysis were used to improve the coding scheme, leading to consistent and high-quality results. DISCUSSION: The quantitative approach of ICR assessment is a viable instrument for quality assurance in qualitative content analysis. Kappa values and close inspection of agreement rates help to estimate and increase the quality of codings. This approach facilitates good coding practice and enhances the credibility of the analysis, especially when large samples are interviewed, different coders are involved, and quantitative results are presented.
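The reported value of .67 is Cohen's kappa, which corrects the raw proportion of agreement for the agreement expected by chance:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of coding decisions on which the two coders agree and p_e the proportion expected by chance from their marginal code frequencies. As a purely illustrative example (these are not the study's agreement rates), p_o = 0.80 and p_e = 0.40 give \kappa = 0.40 / 0.60 \approx 0.67.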
Abstract:
The purpose of this study was to investigate a selection of children's historical nonfiction literature for evidence of coherence. Although research has been conducted on the coherence of textbook material and its influence on comprehension, there has been limited study of coherence in children's nonfiction literature. Generally, textual coherence has been seen as critical to the comprehensibility of content-area textbooks because it concerns the unity of connections among ideas and information. Disciplinary coherence concerns the extent to which authors of historical text show readers how historians think and write. Since young readers are apprentices in learning historical content and the conventions of historical thinking, evidence of disciplinary coherence is significant in nonfiction literature for young readers. The sample for the study contained 32 books published between 1989 and 2000, ranging in length from fewer than 90 pages to more than 150 pages. Content analysis was the quantitative research technique used to measure 84 variables of textual and disciplinary coherence in three passages of each book, as proportions of the total number of words in each book. Reliability analyses and an examination of 750 correlations showed the extent to which variables were related in the books. Three important findings emerged from the study that should be considered in the selection and use of children's historical nonfiction literature in classrooms. First, characteristics of coherence are significantly related to one another in high-quality nonfiction literature. Second, shorter books have a higher proportion of textual coherence than longer books as measured in three passages. Third, the presence of the author is related to characteristics of coherence throughout the books. The findings show that nonfiction literature offers students content that researchers have found textbooks lack. Both younger and older students have the opportunity to learn the conventions of historical thinking as they learn content through nonfiction literature. Further, the children's literature represented in the Orbis Pictus list shows students that authors select, interpret, and question information, and offer other interpretations. The implications of the study for teaching history, teacher preparation in content and literacy, school practices, children's librarians, and publishers of children's nonfiction are discussed.
Abstract:
In this dissertation, the cytogenetic characteristics of bone marrow cells from 41 multiple myeloma patients were investigated. These cytogenetic data were correlated with total DNA content as measured by flow cytometry. Both the cytogenetic information and the DNA content were then correlated with clinical data to determine whether the diagnosis and prognosis of multiple myeloma could be improved. One hundred percent of the patients demonstrated abnormal chromosome numbers per metaphase. The average chromosome number per metaphase ranged from 42 to 49.9, with a mean of 44.99. Hypodiploidy ranged from 0-100% and hyperdiploidy from 0-53%. Detailed cytogenetic analyses were very difficult to perform because of the paucity of mitotic figures and the poor chromosome morphology; thus, detailed chromosome banding analysis on these patients was impossible. Thirty-seven percent of the patients had normal total DNA content, whereas 63% had abnormal amounts of DNA (one patient with less than normal and 25 patients with greater than normal amounts of DNA). Several clinical parameters were used in the statistical analyses: tumor burden, patient status at biopsy, patient response status, past therapy, type of treatment and percent plasma cells. Statistically significant correlations were found only among these clinical parameters: pretreatment tumor burden versus patient response, patient biopsy status versus patient response, and past therapy versus patient response. No correlations were found between percent hypodiploid, diploid, hyperdiploid or DNA content and patient response status, nor were any found among patients with: (a) normal plasma cells, low pretreatment tumor mass burden and more than 50% of the analyzed metaphases with 46 chromosomes; (b) normal amounts of DNA, low pretreatment tumor mass burden and more than 50% of the metaphases with 46 chromosomes; (c) normal amounts of DNA and normal quantities of plasma cells; (d) abnormal amounts of DNA, abnormal amounts of plasma cells, high pretreatment tumor mass burden and fewer than 50% of the metaphases with 46 chromosomes. Technical drawbacks of both the cytogenetic and the DNA content analyses in these multiple myeloma patients are discussed, along with the lack of correlation between DNA content and chromosome number. Refined chromosome banding analysis awaits technical improvements before we can understand which chromosomal material (if any) makes up the "extra" DNA in these patients. None of the correlations tested can be used as diagnostic or prognostic aids for multiple myeloma.
Abstract:
This dissertation is a study of customer relationship management theory and practice. Customer Relationship Management (CRM) is a business strategy whereby companies build strong relationships with existing and prospective customers with the goal of increasing organizational profitability. It is also a learning process that involves managing change in processes, people, and technology. CRM implementation and its ramifications are not yet completely understood, as evidenced by the high number of failed CRM implementations in organizations and the resulting disappointments. The goal of this dissertation is to study emerging issues and trends in CRM, including the effect of computer software and the accompanying new management processes on organizations, and the dynamics of aligning marketing, sales, services, and all other functions responsible for delivering customers a satisfying experience. In order to understand CRM better, a content analysis of more than one hundred articles and documents from academic and industry sources was undertaken, using a new methodological twist on the traditional method. An Internet domain (http://crm.fiu.edu) was created for this research, and an initial one hundred plus abstracts of articles and documents were uploaded to it to form a knowledge database. Once the database was formed, a search engine was developed to enable searching the abstracts with relevant CRM keywords and so reveal emergent dominant CRM topics. The ultimate aim of this website is to serve as an information hub for CRM research, as well as a search engine where interested parties can enter CRM-relevant keywords or phrases to access abstracts, and submit abstracts to enrich the knowledge hub. Research questions were investigated and answered by content-analyzing the interpretation and discussion of dominant CRM topics and then amalgamating the findings. This was supported by comparisons within and across individual, paired, and sets-of-three occurrences of CRM keywords in the article abstracts. Results show that both academia and industry lack the holistic thinking and discussion of CRM required to understand how the people, process, and technology dimensions of CRM affect each other and, in turn, successful implementation. Industry has to get its head around CRM and understand holistically how these important dimensions affect each other. Only then will organizational learning occur and, over time, result in superior processes leading to strong, profitable customer relationships and a hard-to-imitate competitive advantage.
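A minimal sketch of the kind of individual, paired and sets-of-three keyword co-occurrence counting described above; the keyword list and abstracts are invented placeholders, not the dissertation's data behind http://crm.fiu.edu:

from collections import Counter
from itertools import combinations

keywords = ["people", "process", "technology", "implementation", "loyalty"]  # placeholder CRM keywords
abstracts = [
    "CRM implementation succeeds when people, process and technology are aligned.",
    "Customer loyalty gains depend on CRM technology adoption and process change.",
]

singles, pairs, triples = Counter(), Counter(), Counter()
for text in abstracts:
    present = sorted(k for k in keywords if k in text.lower())
    singles.update(present)                     # individual keyword occurrences
    pairs.update(combinations(present, 2))      # paired co-occurrences
    triples.update(combinations(present, 3))    # sets-of-three co-occurrences

print(singles.most_common(3), pairs.most_common(3), triples.most_common(1))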
Abstract:
This study examines the triple bottom line of sustainability in the context of both profit-oriented and non-profit organizations. Sustainability is a compound result of the interaction between economic, environmental, and social dimensions; it cannot be achieved without balance among all three, which has implications for measuring sustainability and prioritizing goals. This study demonstrates a method for measuring organizational sustainability achievement across these three dimensions. Content analysis of the annual reports of corporations from the United States, Continental Europe (and Scandinavia), and Asia reveals that the economic dimension remains preeminent, and corporations still have a long way to go to reach comprehensive sustainability by maintaining a balance between the three dimensions. The analysis also shows a high level of isomorphism in the sustainability practices of corporations, suggesting that even the most sustainable corporations take a somewhat passive role in prioritizing sustainability goals. A list of 25 terms for each dimension of sustainability (economic, environmental, and social) was developed, which corporations can use to develop and communicate their sustainability practices most effectively to the largest number of their stakeholders. In contrast, botanical gardens demonstrate more balance among the three dimensions of sustainability.
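A minimal sketch of scoring an annual report against per-dimension term lists, the kind of dictionary-based content analysis the 25-term lists above are intended to support; the three-term lists and the sample sentence are invented placeholders:

import re
from collections import Counter

# Placeholder term lists; the study's actual 25-term lists are not reproduced here.
dimension_terms = {
    "economic": {"revenue", "profit", "growth"},
    "environmental": {"emissions", "energy", "recycling"},
    "social": {"employees", "community", "diversity"},
}

def dimension_shares(report_text):
    # Share of dictionary hits falling in each sustainability dimension.
    words = Counter(re.findall(r"[a-z]+", report_text.lower()))
    hits = {dim: sum(words[t] for t in terms) for dim, terms in dimension_terms.items()}
    total = sum(hits.values()) or 1
    return {dim: round(count / total, 2) for dim, count in hits.items()}

print(dimension_shares("Revenue growth was strong, emissions fell, and employees joined community programmes."))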
Abstract:
Search engines have forever changed the way people access and discover knowledge, allowing information about almost any subject to be quickly and easily retrieved within seconds. As increasingly more material becomes available electronically, the influence of search engines on our lives will continue to grow. This presents the problem of how to find out what information is contained in each search engine, what bias a search engine may have, and how to select the best search engine for a particular information need. This research introduces a new method, search engine content analysis, to solve this problem. Search engine content analysis is a new development of a traditional information retrieval field called collection selection, which deals with general information repositories. Current research in collection selection relies on full access to the collections or on estimates of their size, and collection descriptions are often represented as term-occurrence statistics. An automatic ontology-learning method is developed for search engine content analysis, which trains an ontology with world knowledge of hundreds of different subjects in a multilevel taxonomy. This ontology is then mined to find important classification rules, and these rules are used to perform an extensive analysis of the content of the largest general-purpose Internet search engines in use today. Instead of representing collections as a set of terms, as commonly occurs in collection selection, they are represented as a set of subjects, leading to a more robust representation of information and a reduction in synonymy. The ontology-based method was compared with ReDDE (Relevant Document Distribution Estimation method for resource selection), the current state-of-the-art collection selection method, which relies on collection size estimation, using the standard R-value metric, with encouraging results. The method was also used to analyse the content of the most popular search engines in use today, including Google and Yahoo, as well as several specialist search engines such as PubMed and that of the U.S. Department of Agriculture. In conclusion, this research shows that the ontology-based method mitigates the need for collection size estimation.
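A minimal sketch of the core idea described above: representing a search engine as a distribution over subjects rather than terms, built from a sample of its results. The keyword-rule classifier is a toy stand-in for the trained ontology and its mined classification rules, and the ReDDE baseline and R-value metric are not reproduced here:

from collections import Counter

def classify_subject(snippet):
    # Toy rules standing in for the ontology-based classifier.
    rules = {"gene": "biomedicine", "crop": "agriculture", "stock": "finance"}
    for keyword, subject in rules.items():
        if keyword in snippet.lower():
            return subject
    return "general"

def subject_profile(sampled_results):
    # Represent a search engine by the subject distribution of sampled result snippets.
    counts = Counter(classify_subject(s) for s in sampled_results)
    total = sum(counts.values())
    return {subject: round(n / total, 2) for subject, n in counts.items()}

print(subject_profile([
    "Gene expression profiling in tumours",
    "Crop rotation and soil nitrogen",
    "Stock market volatility forecasts",
]))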
Abstract:
In a resource-constrained business world, strategic choices must be made about process improvement and service delivery. There are calls for more agile forms of enterprise, and much effort is being directed at moving organizations from a complex landscape of disparate application systems to an integrated and flexible enterprise that accesses complex systems landscapes through a service-oriented architecture (SOA). This paper describes the analysis of strategies to detect supporting business services, which can then be delivered in a variety of ways: as web services, new application services or outsourced services. The focus of this paper is on strategy analysis to identify those strategies that are common to lines of business and thus can be supported through shared services. A case study of a state government is used to demonstrate the analytical method and the detection of shared strategies.
Abstract:
Business Process Management (BPM) is a top priority in organisations and is rapidly proliferating as an emerging discipline in practice. However, current studies show a lack of appropriately skilled BPM professionals in the field and a dearth of opportunities to develop BPM expertise. This paper analyses the gap between the BPM-related education available in Australia and the BPM capabilities required in practice. BPM courses offered by Australian universities and training institutions have been critically analysed and mapped against leading BPM capability frameworks to determine how well current BPM education and training offerings in Australia actually address the core capabilities required of BPM professionals. The outcomes reported here can be used by Australian universities and training institutions to better align and position their training materials with the required BPM capabilities. They could also benefit individuals looking for a systematic and in-depth understanding of BPM capabilities and training.