60 results for Primary Market Research: Its Role in Feasibility Studies
Abstract:
This paper investigates the applications of capture–recapture methods to human populations. Capture–recapture methods are commonly used in estimating the size of wildlife populations but can also be used in epidemiology and the social sciences, for estimating the prevalence of a particular disease or the size of the homeless population in a certain area. Here we focus on estimating the prevalence of infectious diseases. Several estimators of population size are considered: the Lincoln–Petersen estimator and its modified version, the Chapman estimator; Chao's lower bound estimator; the Zelterman estimator; McKendrick's moment estimator; and the maximum likelihood estimator. To evaluate these estimators, they are applied to real three-source capture–recapture data. By conditioning on each of the sources of the three-source data, we have been able to compare the estimators with the true values that they are estimating. The Chapman and Chao estimators were compared in terms of their relative bias. A variance formula derived through conditioning is suggested for Chao's estimator, and normal 95% confidence intervals are calculated for this and the Chapman estimator. We then compare the coverage of the respective confidence intervals. Furthermore, a simulation study is included to compare Chao's and Chapman's estimators. Results indicate that Chao's estimator is less biased than Chapman's estimator unless the two sources are independent. Chao's estimator also has the smaller mean squared error. Finally, the implications and limitations of the above methods are discussed, with suggestions for further development.
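For orientation, a minimal Python sketch of the simpler closed-form estimators named above, using their standard textbook definitions rather than anything specific to this paper: n1, n2 and m are the two source totals and their overlap for Lincoln–Petersen and Chapman, while Chao's lower bound and Zelterman's estimator are shown in their truncated-Poisson form, based on the counts of cases identified exactly once (f1) and exactly twice (f2). The two pairs rest on different sampling models, so their inputs are not interchangeable, and the numbers in the usage lines are purely hypothetical.

```python
import math

def lincoln_petersen(n1, n2, m):
    """Two-source estimate: n1, n2 = cases found by each source, m = found by both."""
    return n1 * n2 / m

def chapman(n1, n2, m):
    """Chapman's bias-corrected modification of the Lincoln-Petersen estimator."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

def chao_lower_bound(n_obs, f1, f2):
    """Chao's lower bound in its truncated-Poisson form:
    n_obs = cases observed at least once, f1 = observed exactly once, f2 = exactly twice."""
    return n_obs + f1 ** 2 / (2 * f2)

def zelterman(n_obs, f1, f2):
    """Zelterman's truncated-Poisson estimator, using the same frequency counts as Chao."""
    return n_obs / (1 - math.exp(-2 * f2 / f1))

# Purely hypothetical counts, shown only to illustrate the input formats:
print(round(chapman(n1=250, n2=180, m=60), 1))               # 743.8
print(round(chao_lower_bound(n_obs=420, f1=300, f2=75), 1))  # 1020.0
```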
Abstract:
Over the years, the MCF7 human breast cancer cell line has provided a model system for the study of cellular and molecular mechanisms in oestrogen regulation of cell proliferation and in progression to oestrogen- and antioestrogen-independent growth. Global gene expression profiling has shown that oestrogen action in MCF7 cells involves the coordinated regulation of hundreds of genes across a wide range of functional groupings, and that more genes are downregulated than upregulated. Adaptation to long-term oestrogen deprivation, which results in loss of oestrogen-responsive growth, involves alterations to gene expression patterns not only at early time points (0-4 weeks) but continuing through to later times (20-55 weeks), and even involves alterations to the patterns of oestrogen-regulated gene expression. Only 48% of the genes that were regulated ≥ 2-fold by oestradiol in oestrogen-responsive cells retained this responsiveness after long-term oestrogen deprivation, while other genes developed de novo oestrogen regulation. Long-term exposure to fulvestrant, which led to loss of growth inhibition by the antioestrogen, resulted in some very large changes in gene expression, of up to 10,000-fold. Comparison of the gene profiles produced by environmental chemicals with oestrogenic properties showed that each ligand gave its own unique expression profile, which suggests that environmental oestrogens entering the human breast may give rise to a more complex web of interference in cell function than simply mimicking oestrogen action at inappropriate times. © 2009 Elsevier Ltd. All rights reserved.
Abstract:
Alpha-, beta- and gamma-melanocyte stimulating hormones (MSHs) are peptides derived from the ACTH precursor, pro-opiomelanocortin. All three peptides have been highly conserved throughout evolution, but their exact biological function in mammals is still largely obscure. In recent years, there has been a surge of interest in alpha-MSH and its role in the regulation of feeding. Gamma-MSH, by contrast, has been shown to be involved in the regulation of adrenal steroidogenesis and also has effects on the cardiovascular and renal systems. This review provides an overview of the role that gamma-MSH peptides play in the regulation of adrenal steroidogenesis. © 2005 Elsevier Inc. All rights reserved.
Abstract:
Europe's commitment to language learning has resulted in higher percentages of pupils studying foreign languages during primary education. In England, recent policy decisions to expand foreign language learning at primary level by 2010 have major implications for transition to secondary school. This paper presents findings on transition issues from case studies of a DfES-funded project evaluating 19 local authority Pathfinders piloting the introduction of foreign language learning at primary level. Research on transition in other countries sets these findings in context. Finally, the paper investigates the challenges England faces for transition in the light of this expansion and discusses future implications.
Abstract:
Reports that heat processing of foods induces the formation of acrylamide heightened interest in the chemistry, biochemistry, and safety of this compound. Acrylamide-induced neurotoxicity, reproductive toxicity, genotoxicity, and carcinogenicity are potential human health risks based on animal studies. Because exposure of humans to acrylamide can come from both external sources and the diet, there exists a need to develop a better understanding of its formation and distribution in food and its role in human health. To contribute to this effort, experts from eight countries have presented data on the chemistry, analysis, metabolism, pharmacology, and toxicology of acrylamide. Specifically covered are the following aspects: exposure from the environment and the diet; biomarkers of exposure; risk assessment; epidemiology; mechanism of formation in food; biological alkylation of amino acids, peptides, proteins, and DNA by acrylamide and its epoxide metabolite glycidamide; neurotoxicity, reproductive toxicity, and carcinogenicity; protection against adverse effects; and possible approaches to reducing levels in food. Cross-fertilization of ideas among several disciplines in which an interest in acrylamide has developed, including food science, pharmacology, toxicology, and medicine, will provide a better understanding of the chemistry and biology of acrylamide in food, and can lead to the development of food processes to decrease the acrylamide content of the diet.
Abstract:
The inaugural meeting of the International Scientific Association for Probiotics and Prebiotics (ISAPP) was held May 3 to May 5, 2002 in London, Ontario, Canada. A group of 63 academic and industrial scientists from around the world convened to discuss current issues in the science of probiotics and prebiotics. ISAPP is a non-profit organization composed of international scientists whose intent is to strongly support and improve the levels of scientific integrity and due diligence associated with the study, use, and application of probiotics and prebiotics. In addition, ISAPP values its role in facilitating communication with the public and healthcare providers and among scientists in related fields on all topics pertinent to probiotics and prebiotics. It is anticipated that such efforts will lead to the development of approaches and products that are optimally designed for the improvement of human and animal health and well-being. This article is a summary of the discussions, conclusions, and recommendations made by 8 working groups convened during the first ISAPP workshop, focusing on the topics of definitions, intestinal flora, extra-intestinal sites, immune function, intestinal disease, cancer, genetics and genomics, and second-generation prebiotics. Humans have evolved in symbiosis with an estimated 10¹⁴ resident microorganisms. However, as medicine has widely defined and explored the perpetrators of disease, including those of microbial origin, it has paid relatively little attention to the microbial cells that constitute the most abundant life forms associated with our body. Microbial metabolism in humans and animals constitutes an intense biochemical activity in the body, with profound repercussions for health and disease. As understanding of the human genome constantly expands, an important opportunity will arise to better determine the relationship between microbial populations within the body and host factors (including gender, genetic background, and nutrition) and the concomitant implications for health and improved quality of life. Combined human and microbial genetic studies will determine how such interactions can affect human health and longevity, which communication systems are used, and how they can be influenced to benefit the host. Probiotics are defined as live microorganisms which, when administered in adequate amounts, confer a health benefit on the host [1]. The probiotic concept dates back over 100 years, but only in recent times have the scientific knowledge and tools become available to properly evaluate their effects on normal health and well-being, and their potential in preventing and treating disease. A similar situation exists for prebiotics, defined by this group as non-digestible substances that provide a beneficial physiological effect on the host by selectively stimulating the favorable growth or activity of a limited number of indigenous bacteria. Prebiotics function in a complementary, and possibly synergistic, manner to probiotics. Numerous studies are providing insights into the growth and metabolic influence of these microbial nutrients on health. Today, the science behind the function of probiotics and prebiotics still requires more stringent deciphering, both scientifically and mechanistically. The explosion of publications and interest in probiotics and prebiotics has resulted in a body of collective research that points toward great promise.
However, this research is spread across such a diversity of organisms, delivery vehicles (foods, pills, and supplements), and potential health targets that general conclusions cannot easily be made. Nevertheless, this situation is rapidly changing on a number of important fronts. With progress over the past decade on the genetics of lactic acid bacteria and the recent [2, 3] and pending [4] release of complete genome sequences for major probiotic species, the field is now armed with detailed information and sophisticated microbiological and bioinformatic tools. Similarly, advances in biotechnology could yield new probiotics and prebiotics designed for enhanced or expanded functionality. The incorporation of genetic tools within a multidisciplinary scientific platform is expected to reveal the contributions of commensals, probiotics, and prebiotics to general health and well-being and explicitly identify the mechanisms and corresponding host responses that provide the basis for their positive roles and associated claims. In terms of human suffering, the need for effective new approaches to prevent and treat disease is paramount. The need exists not only to alleviate the significant mortality and morbidity caused by intestinal diseases worldwide (especially diarrheal diseases in children), but also for infections at non-intestinal sites. This is especially worthy of pursuit in developing nations, where mortality is too often the outcome of food- and water-borne infection. Inasmuch as probiotics and prebiotics are able to influence the populations or activities of commensal microflora, there is evidence that they can also play a role in mitigating some diseases [5, 6]. Preliminary support is emerging that probiotics and prebiotics may be useful as interventions in conditions including inflammatory bowel disease, irritable bowel syndrome, allergy, cancer (especially colorectal cancer, 75% of which is associated with diet), vaginal and urinary tract infections in women, kidney stone disease, mineral absorption, and infections caused by Helicobacter pylori. Some metabolites of microbes in the gut may also impact systemic conditions ranging from coronary heart disease to cognitive function, suggesting the possibility that exogenously applied microbes in the form of probiotics, or alteration of gut microecology with prebiotics, may be useful interventions even in these apparently disparate conditions. Beyond these direct intervention targets, probiotic cultures can also serve in expanded roles as live vehicles to deliver biologic agents (vaccines, enzymes, and proteins) to targeted locations within the body. The economic impact of these disease conditions in terms of diagnosis, treatment, doctor and hospital visits, and time off work exceeds several hundred billion dollars. The quality-of-life impact is also of major concern. Probiotics and prebiotics offer plausible opportunities to reduce the morbidity associated with these conditions. The following addresses issues that emerged from the 8 workshops (Definitions, Intestinal Flora, Extra-Intestinal Sites, Immune Function, Intestinal Disease, Cancer, Genomics, and Second Generation Prebiotics), reflecting the current scientific state of probiotics and prebiotics.
This is not a comprehensive review; rather, it emphasizes pivotal knowledge gaps, and recommendations are made as to the underlying scientific and multidisciplinary studies that will be required to advance our understanding of the roles and impact of prebiotics, probiotics, and the commensal microflora on health and disease management.
Abstract:
The International Conference series on Disability, Virtual Reality and Associated Technologies (ICDVRAT) held its sixth biennial conference this year, celebrating ten years of research and development in this field. A total of 220 papers have been presented at the first six conferences, addressing the potential, development, exploration and examination of how these technologies can be applied in disability research and practice. The research community is broad and multi-disciplined, comprising a variety of scientific and medical researchers, rehabilitation therapists, educators and practitioners. The technologies, their applications and target user populations are likewise broad, ranging from sensors positioned on real-world objects to fully immersive interactive simulated environments. A common factor is the desire to identify what the technologies have to offer and how they can provide added value to existing methods of assessment, rehabilitation and support for individuals with disabilities. This paper presents a brief review of the first decade of research and development in the ICDVRAT community, defining the technologies, applications and target user populations served.
Abstract:
The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of television studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines while arguing for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example around which the issue has most crystallised is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context.
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.
Abstract:
This paper examines the regional investment practices of institutional investors in the commercial real estate office market in 1998 and 2003 in England and Wales. Consistent with previous studies in the US, the findings show that investors concentrate their holdings in a few (urban) areas and that this concentration has become more pronounced as investors have rationalised their portfolio holdings. The findings also indicate that office investment does not fully correlate with the UK urban hierarchy, as measured by population, but is focused on urban areas with high service-sector employment. Finally, the pre-eminence of the City of London and West End office markets as the key focus of institutional investment is confirmed.
Abstract:
Van der Heijden's ENDGAME STUDY DATABASE IV, HHDBIV, is the definitive collection of 76,132 chess studies. The zugzwang position, or zug, one in which the side to move would prefer not to have the move, is a frequent theme in the literature of chess studies. In this third data-mining of HHDBIV, we report on the occurrence of sub-7-man zugs there, as discovered by the use of CQL and Nalimov endgame tables (EGTs). We also mine those Zugzwang Studies in which, more significantly, a zug appears in both its White-to-move (wtm) and Black-to-move (btm) forms. We provide some illustrative and extreme examples of zugzwangs in studies.
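As a concrete illustration of the zug concept only (the mining described above used CQL queries and Nalimov EGTs, not the stand-ins below), here is a minimal Python sketch that tests whether the side to move in a sub-7-man position would prefer to pass, probing Syzygy tables through the python-chess library; the tablebase directory and the example position are placeholders.

```python
import chess
import chess.syzygy

def is_zugzwang(board: chess.Board, tb: chess.syzygy.Tablebase) -> bool:
    """Return True if the side to move would be better off passing (a 'zug').

    Compares the WDL value with the obligation to move against the (negated)
    WDL value of the same position with the other side to move.
    """
    wdl_to_move = tb.probe_wdl(board)      # from the side-to-move's point of view
    passed = board.copy(stack=False)
    passed.turn = not passed.turn          # same position, obligation to move handed over
    passed.ep_square = None                # an en-passant right cannot survive a 'pass'
    if passed.was_into_check():            # the original mover was in check: a pass is meaningless
        return False
    wdl_if_passed = -tb.probe_wdl(passed)  # negate: the probe reports the new mover's view
    return wdl_if_passed > wdl_to_move

# Usage sketch -- assumes Syzygy table files have been downloaded to a local directory:
# with chess.syzygy.open_tablebase("/path/to/syzygy") as tb:
#     board = chess.Board("8/8/8/4k3/8/8/4P3/4K3 w - - 0 1")  # any sub-7-man position
#     print(is_zugzwang(board, tb))
```

A mutual zug of the kind highlighted above would be a position for which this test returns True both as given and with the move passed to the other side.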
Abstract:
This paper considers the role of social capital and trust in the aspirations for higher education of a group of socially disadvantaged girls. Drawing on data from a longitudinal, ethnographic case study of an underperforming secondary school, the paper considers current conceptualisations of social capital and its role in educational ambitions. The paper concludes by tentatively suggesting that whilst social capital is extremely helpful in explaining differences within groups, trust appears to be a pre-requisite for the investment and generation of social capital, as opposed to the other way around. The paper also suggests that young people are not necessarily dependent on their families for their social capital but are able to generate capital in their own right.
Abstract:
The effects and influence of the Building Research Establishment's Environmental Assessment Method (BREEAM) on construction professionals are examined. Most discussions of building assessment methods focus on either the formal tool or the finished product. In contrast, BREEAM is analysed here as a social technology using Michel Foucault's theory of governmentality. Interview data are used to explore the effect of BREEAM on visibilities, knowledge, techniques and professional identities. The analysis highlights a number of features of the BREEAM assessment process which generally go unremarked: professional and public understandings of the method, the deployment of different types of knowledge and their implications for the authority and legitimacy of the tool, and the effect of BREEAM on standard practice. The analysis finds that BREEAM's primary effect is through its impact on standard practices. Other effects include the use of the assessment method to defend design decisions, its role in both operationalizing and obscuring the concept of green buildings, and the effect of tensions between project and method requirements on the authority of the tool. A reflection on assessment methods as neo-liberal tools and their adequacy for the promotion of sustainable construction suggests several limitations, in the form of lock-in, that hinder variation and wider systemic change.