943 results for "Definition in terminology"
Abstract:
In Hobbesian terminology, ‘unwritten laws’ are natural laws enforced within a polity by a non-sovereign judge, without any prior public promulgation. This article discusses the idea in the light of successive Hobbesian accounts of ‘law’ and ‘obligation’. Between De Cive and Leviathan, Hobbes dropped the idea that natural law is, strictly speaking, law, but he continued to believe that unwritten laws must form part of any legal system. He was unable to explain how such a law could claim legal status. His loyalty to the notion, in spite of all the trouble it caused, is a sign of his belief that moral knowledge is readily accessible to all.
Abstract:
Under current climate conditions, the North Atlantic Oscillation (NAO) is the leading mode of atmospheric circulation variability over the North Atlantic region. Although the pattern is present during the entire year, it is most important during winter, when it explains a large part of the variability of the large-scale pressure field and is therefore a major determinant of weather conditions over the North Atlantic basin and over Western Europe. This study reviews recent literature on the basic understanding of the NAO, its variability on different time scales and the physical mechanisms that drive it. In particular, observed NAO variations and long-term trends are placed in a long-term perspective by considering paleo-proxy evidence, and a representative number of recently released NAO reconstructions are discussed. While the reconstructions agree reasonably well with observations during the overlapping instrumental period, uncertainty among the different reconstructions is rather high for the pre-instrumental period, leading to partially incoherent results, that is, periods where the NAO reconstructions do not agree even in sign. Finally, we highlight the future need for a broader definition of the NAO, an assessment of the stability of the teleconnection centers over time, and analysis of the relations to other relevant variables such as temperature and precipitation, as well as of the relevant processes involved.
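As an illustration of how an NAO index is often defined in practice (a station-based sketch, not necessarily the definition used in the review; the station choice, reference period and normalisation are assumptions), the index can be computed as the difference of standardised sea-level pressure anomalies between a southern station, such as Ponta Delgada or Lisbon, and a northern station in Iceland, such as Stykkishólmur:

```python
import numpy as np

def station_nao_index(slp_south, slp_north, ref_slice=slice(None)):
    """Station-based NAO index: difference of standardised winter SLP series.

    slp_south, slp_north : 1-D arrays of (e.g. DJFM-mean) sea-level pressure,
        one value per year, for a southern (Azores/Lisbon) and a northern
        (Iceland) station. The reference period used for standardisation is
        an assumption of this sketch.
    """
    slp_south = np.asarray(slp_south, dtype=float)
    slp_north = np.asarray(slp_north, dtype=float)

    def standardise(x):
        ref = x[ref_slice]
        return (x - ref.mean()) / ref.std(ddof=1)

    # Positive index: anomalously strong Azores high and deep Icelandic low.
    return standardise(slp_south) - standardise(slp_north)

# Example with synthetic winter-mean pressures (hPa), one value per year:
south = [1021.3, 1018.7, 1023.1, 1019.5, 1022.0]
north = [996.2, 1003.8, 993.5, 1001.1, 995.0]
print(station_nao_index(south, north))
```

Other common definitions use the leading empirical orthogonal function of the sea-level pressure field rather than two fixed stations, which is part of why a broader, carefully specified definition of the NAO matters.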
Abstract:
It is now established that native language affects one's perception of the world. However, it is unknown whether this effect is merely driven by conscious, language-based evaluation of the environment or whether it reflects fundamental differences in perceptual processing between individuals speaking different languages. Using brain potentials, we demonstrate that the existence in Greek of 2 color terms—ghalazio and ble—distinguishing light and dark blue leads to greater and faster perceptual discrimination of these colors in native speakers of Greek than in native speakers of English. The visual mismatch negativity, an index of automatic and preattentive change detection, was similar for blue and green deviant stimuli during a color oddball detection task in English participants, but it was significantly larger for blue than green deviant stimuli in native speakers of Greek. These findings establish an implicit effect of language-specific terminology on human color perception.
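The visual mismatch negativity reported here is, in general terms, obtained from a difference wave: the average response to deviant stimuli minus the average response to standard stimuli. The sketch below shows only that generic computation; the epoch window, electrodes, filtering and array shapes are assumptions, not details taken from the study.

```python
import numpy as np

def difference_wave(deviant_epochs, standard_epochs):
    """Average ERP to deviants minus average ERP to standards.

    Both inputs: arrays of shape (n_trials, n_samples) for one channel
    (e.g. an occipital electrode), baseline-corrected. A more negative
    difference wave in the vMMN latency range indicates stronger
    pre-attentive change detection.
    """
    deviant_epochs = np.asarray(deviant_epochs, dtype=float)
    standard_epochs = np.asarray(standard_epochs, dtype=float)
    return deviant_epochs.mean(axis=0) - standard_epochs.mean(axis=0)

# Example with synthetic epochs: 40 deviant and 160 standard trials,
# 300 samples each (e.g. -100 to 500 ms at 500 Hz).
rng = np.random.default_rng(0)
dev = rng.normal(0.0, 1.0, size=(40, 300)) - 0.5   # slightly more negative
std = rng.normal(0.0, 1.0, size=(160, 300))
mmn = difference_wave(dev, std)
print(mmn.shape, round(mmn.mean(), 3))
```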
Abstract:
The validity of the linguistic relativity principle continues to stimulate vigorous debate and research. The debate has recently shifted from the behavioural investigation arena to a more biologically grounded field, in which tangible physiological evidence for language effects on perception can be obtained. Using brain potentials in a colour oddball detection task with Greek and English speakers, a recent study suggests that language effects may exist at early stages of perceptual integration [Thierry, G., Athanasopoulos, P., Wiggett, A., Dering, B., & Kuipers, J. (2009). Unconscious effects of language-specific terminology on pre-attentive colour perception. Proceedings of the National Academy of Sciences, 106, 4567–4570]. In this paper, we test whether in Greek speakers exposure to a new cultural environment (UK) with contrasting colour terminology from their native language affects early perceptual processing as indexed by an electrophysiological correlate of visual detection of colour luminance. We also report semantic mapping of native colour terms and colour similarity judgements. Results reveal convergence of linguistic descriptions, cognitive processing, and early perception of colour in bilinguals. This result demonstrates for the first time substantial plasticity in early, pre-attentive colour perception and has important implications for the mechanisms that are involved in perceptual changes during the processes of language learning and acculturation.
Abstract:
Each human body plays host to a microbial population which is both numerically vast (around 10^14 microbial cells) and phenomenally diverse (over 1,000 species). The majority of the microbial species in the gut have not been cultured, but the application of culture-independent approaches to high-throughput diversity and functionality analysis has allowed characterisation of the diverse microbial phylotypes present in health and disease. Studies in monozygotic twins, showing that these retain highly similar microbiota decades after birth and initial colonisation, strongly indicate that the diversity of the microbiome is host-specific and affected by the genotype. Microbial diversity in the human body is reflected in both richness and evenness (a brief illustrative calculation of these two measures is sketched after this abstract). Diversity increases steeply from birth, reaching its highest point in early adulthood before declining in older age. However, in healthy subjects there appears to be a core of microbial phylotypes which remains relatively stable over time. Studies of individuals from diverse geographies suggest that clusters of intestinal bacterial groups tend to occur together, constituting ‘enterotypes’. Variation in the intestinal microbiota is therefore stratified rather than continuous, and there may be a limited number of host/microbial states which respond differently to environmental influences. Exploration of enterotypes and functional groups may provide biomarkers for disease and insights into the potential for new treatments based on manipulation of the microbiome. In health, the microbiota interact with host defences and exist in harmonious homeostasis, which can be disturbed by invading organisms or by the ‘carpet bombing’ of antibiotics. In a proportion of individuals with infections, the disease will resolve without the need for antibiotics, and microbial homeostasis with the host’s defences is restored. The administration of probiotics (live microorganisms which, when administered in adequate amounts, confer a health benefit on the host) represents an artificial way to enhance or stimulate these natural processes. The study of innate mechanisms of antimicrobial defence on the skin, including the production of numerous antimicrobial peptides (AMPs), has shown an important role for skin commensal organisms. These organisms may produce AMPs themselves, and may also amplify the innate immune responses to pathogens by activating signalling pathways and processing host-produced AMPs. Research continues into how to enhance and manipulate the role of commensal organisms on the skin. The challenges of skin infection (including diseases caused by multiply resistant organisms) and infestations remain considerable. The potential to re-colonise the skin to replace or reduce pathogens, and the relationship between the microbiota elsewhere in the body and skin diseases, are among a growing list of research targets. Lactobacillus species are among the best known ‘beneficial’ bacterial members of the human microbiota. Of the approximately 120 species known, about 15 occur in the human vagina. These organisms have multiple properties, including the production of lactic acid, hydrogen peroxide and bacteriocins, which render the vagina inhospitable to potential pathogens. Depletion of the normal Lactobacillus population and overgrowth of vaginal anaerobes, accompanied by the loss of normal vaginal acidity, can lead to bacterial vaginosis – the commonest cause of abnormal vaginal discharge in women.
Some vaginal anaerobes are associated with the formation of vaginal biofilms which act as a reservoir of organisms that persists after standard antibiotic therapy of bacterial vaginosis and may help to account for the characteristically high relapse rate of the condition. Administration of Lactobacillus species, both vaginally and orally, has shown beneficial effects in the treatment of bacterial vaginosis, and such treatments have an excellent overall safety record. Candida albicans is a frequent coloniser of human skin and mucosal membranes, and is a normal part of the microbiota in the mouth, gut and vagina. Nevertheless, Candida albicans is the most common fungal pathogen worldwide and is a leading cause of serious and often fatal nosocomial infections. What turns this organism from a commensal into a pathogen is a combination of increasing virulence in the organism and predisposing host factors that compromise immunity. There has been considerable research into the use of probiotic Lactobacillus spp. in vaginal candidiasis. Studies in reconstituted human epithelium and monolayer cell cultures have shown that L. rhamnosus GG can protect mucosa from damage caused by Candida albicans and enhance the immune responses of mucosal surfaces. Such findings offer the promise that the use of such probiotic bacteria could provide new options for antifungal therapy. Studies of changes in the human intestinal microbiota in health and disease are complicated by its size and diversity. The Alimentary Pharmabiotic Centre in Cork (Republic of Ireland) has the mission to ‘mine microbes for mankind’, and its work illustrates the potential benefits of understanding the gut microbiota. Work undertaken at the centre includes: mapping changes in the microbiota with age; studies of the interaction between the microbiota and the gut; potential interactions between the gut microbiota and the central nervous system; the potential for probiotics to act as anti-infectives, including through the production of bacteriocins; and the characterisation of interactions between gut microbiota and bile acids, which have important roles as signalling molecules and in immunity. An important disease entity in which the role of the gut microbiota appears to be central is irritable bowel syndrome (IBS). IBS patients show evidence of immune activation, impaired gut barrier function and abnormal gut microbiota. Studies with probiotics have shown that these organisms can exert anti-inflammatory effects in inflammatory bowel disease and may strengthen the gut barrier in diarrhoea-predominant IBS. Formal randomised trials of probiotics in IBS show mixed results, with limited benefit for some patients but not all. Studies confirm that administered probiotics can survive and temporarily colonise the gut. They can also increase the numbers of other lactic acid bacilli in the gut and reduce the numbers of pathogens. However, consuming live organisms is not the only way to influence the gut microbiota. Dietary prebiotics are selectively fermented ingredients that can change the composition and/or activity of the gastrointestinal microbiota in beneficial ways. Dietary components that reach the colon and are available to influence the microbiota include poorly digestible carbohydrates, such as non-starch polysaccharides, resistant starch, non-digestible oligosaccharides (NDOs) and polyphenols.
Mixtures of probiotic and prebiotic ingredients that can selectively stimulate the growth or activity of health-promoting bacteria have been termed ‘synbiotics’. All of these approaches can influence gut microbial ecology, mainly to increase bifidobacteria and lactobacilli, but metagenomic approaches may reveal wider effects. Characterising how these changes produce physiological benefits may enable broader use of these tactics in health and disease in the future. The current status of probiotic products commercially available worldwide is less than ideal. Prevalent problems include misidentification of ingredient organisms and poor viability of probiotic microorganisms, leading to inadequate shelf life. On occasion these problems mean that some commercially available products cannot be considered to meet the definition of a probiotic product. Given the potential benefits of manipulating the human microbiota, there is a clear need for improved regulation of probiotics. The potential importance of the human microbiota cannot be overstated. ‘We feed our microbes, they talk to us and we benefit. We just have to understand and then exploit this.’ (Willem de Vos).
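As a brief, generic illustration of the richness and evenness measures mentioned in the abstract above (not the analysis pipeline of any study summarised here), Shannon diversity and Pielou evenness can be computed from phylotype counts as follows; the read counts are invented:

```python
import math

def shannon_diversity(counts):
    """Shannon index H = -sum(p_i * ln p_i) over observed phylotypes."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def pielou_evenness(counts):
    """Evenness J = H / ln(S), where S is richness (number of phylotypes)."""
    richness = sum(1 for c in counts if c > 0)
    if richness < 2:
        return 0.0
    return shannon_diversity(counts) / math.log(richness)

# Hypothetical 16S read counts for five phylotypes in one sample:
sample = [500, 300, 120, 60, 20]
print("richness:", sum(1 for c in sample if c > 0))
print("Shannon H:", round(shannon_diversity(sample), 3))
print("evenness J:", round(pielou_evenness(sample), 3))
```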
Abstract:
Purpose – The Bodleian Binders Book contains nearly 150 pages of seventeenth-century library records, revealing information about the binders used by the library and the thousands of bindings they produced. The purpose of this paper is to explore a pilot project to survey and record the bindings information contained in the Binders Book. Design/methodology/approach – A sample of seven pages (91 works, 65 identifiable bindings) was used to develop a methodology for surveying and recording the bindings listed in the manuscript. To create a successful product that would be useful to bindings researchers, the project addressed questions of bindings terminology and the role of the library in the knowledge creation process, within the context that text encoding is changing the landscape of library functions. Text encoding formats were examined, and a basic TEI (Text Encoding Initiative) transcription was produced. This facilitates the tagging of names and titles and the display of transcriptions alongside text images. Findings – Encoding was found not only to make the manuscript content more accessible, but also to allow for the construction of new knowledge: characteristic Oxford binding traits were revealed and bindings were matched to binders. Plans for added functionality were formed. Originality/value – This research presents a “big picture” analysis of Oxford bindings as a result of text encoding, and the foundation for qualitative and statistical analysis. It exemplifies the benefits of interdisciplinary methods – in this case from Digital Humanities – to enhance access to and interpretation of specialist materials and the library's provenance record.
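To illustrate what tagging of names and titles in a TEI transcription can look like, here is a deliberately minimal sketch; persName, title and measure are standard TEI elements, but the structure and the sample entry are illustrative assumptions rather than the project's actual encoding.

```python
import xml.etree.ElementTree as ET

def encode_entry(binder, title, price):
    """Build a minimal TEI-style fragment tagging a binder's name and a book title.

    This is a simplification for illustration only; a real TEI document
    needs the full header and the project's own element choices.
    """
    entry = ET.Element("p")                      # one manuscript entry
    pers = ET.SubElement(entry, "persName")      # the binder's name
    pers.set("role", "binder")
    pers.text = binder
    work = ET.SubElement(entry, "title")         # the bound work
    work.text = title
    cost = ET.SubElement(entry, "measure")       # the recorded payment
    cost.set("type", "price")
    cost.text = price
    return ET.tostring(entry, encoding="unicode")

# Hypothetical entry, not taken from the Binders Book itself:
print(encode_entry("Edward Miles", "Opera omnia", "2s 6d"))
```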
Abstract:
This paper demonstrates the impracticality of a comprehensive mathematical definition of the term ‘drought’ which formalises the general qualitative definition that drought is ‘a deficit of water relative to normal conditions’. Starting from the local water balance, it is shown that a universal description of drought requires reference to water supply, demand and management. The influence of human intervention through water management is shown to be intrinsic to the definition of drought in the universal sense and can only be eliminated in the case of purely meteorological drought. The state of ‘drought’ is shown to be predicated on the existence of climatological norms for a multitude of process-specific terms. In general these norms are either difficult to obtain or even non-existent in the non-stationary context of climate change. Such climatological considerations, in conjunction with the difficulty of quantifying human influence, lead to the conclusion that we cannot reasonably expect the existence of any workable generalised objective definition of drought.
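To make the starting point concrete, a schematic local water balance with an explicit human-management term, and a deficit defined relative to a climatological norm, can be written as below; the symbols are illustrative assumptions rather than the paper's own notation.

```latex
\frac{dS}{dt} \;=\; P \;-\; E \;-\; Q \;-\; H,
\qquad
D(t) \;=\; \max\bigl\{\,0,\; \overline{W}(t) - W(t)\,\bigr\}
```

Here S is local water storage, P precipitation, E evapotranspiration, Q runoff and H net human abstraction or management; W(t) is whichever water variable the drought is defined on and \(\overline{W}(t)\) its climatological norm. The deficit D(t) is only well defined when such a norm exists for every relevant term, which is exactly what becomes problematic under non-stationary climate change.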
Abstract:
Flood simulation models and hazard maps are only as good as the underlying data against which they are calibrated and tested. However, extreme flood events are by definition rare, so observational data on flood inundation extent are limited in both quality and quantity. The relative importance of these observational uncertainties has increased now that computing power and accurate lidar scans make it possible to run high-resolution 2D models to simulate floods in urban areas, yet the value of these simulations is limited by the uncertainty in the true extent of the flood. This paper addresses that challenge by analyzing a point dataset of maximum water extent from a flood event on the River Eden at Carlisle, United Kingdom, in January 2005. The observation dataset is based on a collection of wrack and water marks from two postevent surveys. A smoothing algorithm for identifying, quantifying, and reducing localized inconsistencies in the dataset is proposed and evaluated, showing positive results. The proposed smoothing algorithm can be applied to improve flood inundation modeling assessment and the determination of risk zones on the floodplain.
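The paper's own smoothing algorithm is not reproduced here. As a generic sketch of the underlying idea (identify marks whose surveyed level is locally inconsistent with nearby marks and reduce the inconsistency), the following compares each wrack or water mark with the median level of its near neighbours; the search radius, tolerance and blending rule are assumptions for illustration.

```python
import numpy as np

def smooth_water_marks(xy, levels, radius=100.0, tol=0.3, blend=0.5):
    """Flag and reduce locally inconsistent surveyed water levels.

    xy     : (n, 2) array of mark coordinates (m).
    levels : (n,) array of surveyed maximum water levels (m).
    A mark whose level differs from the median of neighbours within
    `radius` by more than `tol` is pulled towards that median by `blend`.
    """
    xy = np.asarray(xy, dtype=float)
    levels = np.asarray(levels, dtype=float)
    smoothed = levels.copy()
    flagged = np.zeros(len(levels), dtype=bool)
    for i in range(len(levels)):
        dist = np.hypot(*(xy - xy[i]).T)
        neighbours = (dist > 0) & (dist <= radius)
        if not neighbours.any():
            continue                      # isolated mark: leave untouched
        local = np.median(levels[neighbours])
        if abs(levels[i] - local) > tol:
            flagged[i] = True
            smoothed[i] = (1 - blend) * levels[i] + blend * local
    return smoothed, flagged

# Hypothetical marks: the third level is inconsistent with its neighbours.
xy = [(0, 0), (40, 10), (60, -20), (90, 15), (500, 500)]
levels = [12.1, 12.2, 13.4, 12.15, 9.8]
print(smooth_water_marks(xy, levels))
```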
Abstract:
This chapter explores the distinctive qualities of the Matt Smith era of Doctor Who, focusing on how dramatic emphases are connected with emphases on visual style, and how this depends on the programme's production methods and technologies. Doctor Who was first made in the 1960s era of live, studio-based, multi-camera television with monochrome pictures. As technical innovations such as colour filming, stereo sound, CGI, post-production effects technology and now High Definition (HD) cameras have been routinely introduced into the programme, they have given Doctor Who’s creators new ways of making visually distinctive narratives. Indeed, it has been argued that since the 1980s television drama has become increasingly like cinema in its production methods and aesthetic aims. Viewers’ ability to watch the programme on high-specification TV sets, and to record and repeat episodes using digital media, also encourages attention to visual style in television as much as in cinema. The chapter evaluates how these new circumstances affect what Doctor Who has become and engages with arguments that visual style has been allowed to override characterisation and story in the current Doctor Who. The chapter refers to specific episodes, and frames the analysis with reference to earlier years in Doctor Who’s long history. For example, visual spectacle using green-screen and CGI can function as a set-piece (at the opening or ending of an episode) but can also work ‘invisibly’ to render a setting realistically. Shooting on location using HD cameras provides a rich and detailed image texture, but also highlights mistakes, especially problems of lighting. The reduction of Doctor Who’s budget has led Steven Moffat’s episodes to rely less on visual extravagance, connecting back both to Russell T. Davies’s concern to show off the BBC’s investment in the series and to British traditions of gritty and intimate social drama. Pressures to capitalise on Doctor Who as a branded product are the final aspect of the chapter’s analysis, where the role of Moffat as ‘showrunner’ links him to an American (not British) style of television production in which the preservation of format and brand values gives him unusual power over the look of the series.
Abstract:
Purpose – The purpose of this paper is to investigate whether CSR is balanced between firm and wider society interests. Design/methodology/approach – A qualitative, interpretive hermeneutic approach is used to analyse a variety of publicly published secondary sources on the CSR of Tesco, Sainsburys, Morrisons and the Co-operative in the UK grocery multiple sector (2005-2010). Findings – CSR strategic outcomes currently favour the interests of the firm more than those of society. A multilayered framework in the form of Social Responsibility of the Corporation (SRC) is designed and offered in support of balancing the business-society relationship more evenly. Research limitations/implications – This study is limited to firms originating from within the UK grocery multiple sector. Asda could not be included in the study as it has not published CSR reports annually in the UK since becoming part of the Walmart group. Practical implications – A framework for a multi-level standardised definition of CSR in the form of SRC is offered. The inclusion of employees and members of the public on CSR/SRC boards is recommended to foster wider collaboration. The SRC framework promotes standardisation at the global level while respecting diversity and firm heterogeneity at the firm level. The findings may further contribute to the GRI, UN Global Compact and WEF dialogues. Social implications – Recommendations are made to extend CSR board diversity for improved dialogue with communities. The SRC framework may be applied at global, national, industry and firm levels, and can be applied internationally or locally. Future studies may offer quantitative attributes for balancing CSR/SRC. Originality/value – A globally unique and universally applicable framework for evaluating CSR activities is proposed. Future studies may extend the authors' framework to other industries, national environments or globally in the pursuit of balance between firm and society. Furthermore, firms may also adopt the framework to support CSR activities.
Abstract:
The IEEE 754 standard for floating-point arithmetic is widely used in computing. It is based on real arithmetic and is made total by adding both a positive and a negative infinity, a negative zero, and many Not-a-Number (NaN) states. The IEEE infinities are said to have the behaviour of limits. Transreal arithmetic is total. It also has a positive and a negative infinity but no negative zero, and it has a single, unordered number, nullity. We elucidate the transreal tangent and extend real limits to transreal limits. Arguing from this firm foundation, we maintain that there are three category errors in the IEEE 754 standard. Firstly, the claim that IEEE infinities are limits of real arithmetic confuses limiting processes with arithmetic. Secondly, a defence of IEEE negative zero confuses the limit of a function with the value of a function. Thirdly, the definition of IEEE NaNs confuses undefined with unordered. Furthermore, we prove that the tangent function, with the infinities given by geometrical construction, has a period of an entire rotation, not half a rotation as is commonly understood. This illustrates a category error, confusing the limit with the value of a function, in an important area of applied mathematics: trigonometry. We briefly consider the wider implications of this category error. Another paper proposes transreal arithmetic as a basis for floating-point arithmetic; here we take the profound step of proposing transreal arithmetic as a replacement for real arithmetic to remove the possibility of certain category errors in mathematics. Thus we propose both theoretical and practical advantages of transmathematics. In particular we argue that implementing transreal analysis in trans-floating-point arithmetic would extend the coverage, accuracy and reliability of almost all computer programs that exploit real analysis, essentially all programs in science and engineering and many in finance, medicine and other socially beneficial applications.
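The IEEE 754 behaviours at issue (a signed zero, infinities that act like limiting values, and unordered NaNs) can be observed directly in any IEEE-conforming environment. The short Python sketch below merely demonstrates them; it takes no position on the transreal alternative proposed in the paper.

```python
import math

# Signed zero: -0.0 compares equal to 0.0 but carries a distinct sign,
# which sign-sensitive operations can observe.
neg_zero = -0.0
print(neg_zero == 0.0)                              # True
print(math.copysign(1.0, neg_zero))                 # -1.0
print(math.atan2(0.0, -0.0), math.atan2(0.0, 0.0))  # pi vs 0.0

# Infinities behave like limiting values in arithmetic.
inf = math.inf
print(1.0 / inf, inf + 1.0, inf * -2.0)             # 0.0 inf -inf

# NaN is unordered: every comparison involving NaN, even with itself, is False.
nan = math.nan
print(nan == nan, nan < 1.0, nan > 1.0)             # False False False
print(math.isnan(inf - inf))                        # True: indeterminate form gives NaN

# tan at "pi/2" is huge but finite, because pi/2 itself is not exactly
# representable in binary floating point.
print(math.tan(math.pi / 2))                        # about 1.6e16
```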
Abstract:
There are three key components for developing a metadata system: a container structure laying out the key semantic issues of interest and their relationships; an extensible controlled vocabulary providing possible content; and tools to create and manipulate that content. While metadata systems must allow users to enter their own information, the use of a controlled vocabulary both imposes consistency of definition and ensures comparability of the objects described. Here we describe the controlled vocabulary (CV) and metadata creation tool built by the METAFOR project for use in describing the climate models, simulations and experiments of the fifth Coupled Model Intercomparison Project (CMIP5). The CV and resulting tool chain introduced here are designed for extensibility and reuse and should find applicability in many more projects.
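As a toy illustration of the three components named in the abstract (container structure, controlled vocabulary, and tooling), and emphatically not of the actual METAFOR/CMIP5 schema, a metadata record might be checked against a CV as follows; the field names and vocabulary entries are invented for the example.

```python
from dataclasses import dataclass

# Extensible controlled vocabulary: terms users may select, which communities
# can extend but not silently bypass. Entries here are invented.
CONTROLLED_VOCAB = {
    "experiment": {"historical", "piControl", "abrupt4xCO2"},
    "realm": {"atmosphere", "ocean", "land", "seaIce"},
}

@dataclass
class SimulationRecord:
    """Container structure: the semantic fields of interest and their values."""
    model_name: str
    experiment: str
    realm: str
    free_text_notes: str = ""   # users can still enter their own information

def validate(record: SimulationRecord):
    """Tooling: check that CV-governed fields use CV terms, for comparability."""
    errors = []
    for field_name in ("experiment", "realm"):
        value = getattr(record, field_name)
        if value not in CONTROLLED_VOCAB[field_name]:
            errors.append(f"{field_name}={value!r} is not in the controlled vocabulary")
    return errors

rec = SimulationRecord("ExampleESM-1", "historical", "atmospher")
print(validate(rec))   # flags the misspelled realm term
```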
Abstract:
Some amendments are proposed to a recent redefinition of the mental model concept in system dynamics. First, externalised, or articulated, mental models should not be called cognitive maps; this term has a well-established alternative meaning. Second, there can be mental models of entities that do not yet exist beyond an individual's mind; the modelling of planned or desired systems is possible and recommended. Third, saying that mental models maintain social systems connects with some exciting research opportunities for system dynamics; however, it is probably an accidental distraction from the intended meaning of the redefinition. These minor criticisms apart, the new definition of mental model of a dynamic system is welcomed as a useful contribution to both research and practice.
Abstract:
A framework for understanding the complexity of cancer development was established by Hanahan and Weinberg in their definition of the hallmarks of cancer. In this review, we consider the evidence that parabens can enable development in human breast epithelial cells of four of the six basic hallmarks, one of the two emerging hallmarks and one of the two enabling characteristics. Hallmark 1: parabens have been measured as present in 99% of human breast tissue samples, possess oestrogenic activity and can stimulate sustained proliferation of human breast cancer cells at concentrations measurable in the breast. Hallmark 2: parabens can inhibit the suppression of breast cancer cell growth by hydroxytamoxifen, and through binding to the oestrogen-related receptor gamma (ERRγ) may prevent its deactivation by growth inhibitors. Hallmark 3: in the 10 nM to 1 μM range, parabens give a dose-dependent evasion of apoptosis in high-risk donor breast epithelial cells. Hallmark 4: long-term exposure (>20 weeks) to parabens leads to increased migratory and invasive activity in human breast cancer cells, properties which are linked to the metastatic process. Emerging hallmark: methylparaben has been shown in human breast epithelial cells to increase mTOR, a key regulator of energy metabolism. Enabling characteristic: parabens can cause DNA damage at high concentrations in the short term, but more work is needed to investigate long-term low doses of mixtures. The ability of parabens to enable multiple cancer hallmarks in human breast epithelial cells provides grounds for regulatory review of the implications of the presence of parabens in human breast tissue.