Abstract:
Statutory monitoring of the fauna of the ‘mudflats and sandflats not covered by seawater at low tide’ biotope complex on St Martin’s Flats, a part of the Isles of Scilly Complex Special Area of Conservation, was undertaken in 2000, 2004 and 2009. The targets set by Natural England for “characteristic biotopes” were that “composite species, abundance and diversity should not deviate significantly from an established baseline, subject to natural change”. The three specified biotopes could not be distinguished, and instead three assemblages were subjectively defined based on sediment surface features. There were statistically significant natural changes in diversity and species composition between years, especially in the association initially characterized by the razor-clam Ensis, and possible reasons for this are discussed. It is suggested that setting fixed local limits on natural variability is almost always impractical. Two possible approaches to distinguishing between natural and anthropogenic changes are suggested: a change in ecological condition as indicated by AMBI scores, and a significant change in average taxonomic distinctness (Δ+) compared with expectation. The determination of species biomasses as well as abundances might also open more possibilities for assessment. The practice of setting objectives for a marine SAC feature that include the range and number of biotopes cannot be supported, in view of the difficulty in ascribing assemblages to recognised biotopes. A more realistic definition of species assemblages might best be gained from examination of the species that consistently make a substantial contribution to the Bray-Curtis similarity among samples collected from specific sites.
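For illustration, a minimal Python sketch of the Bray-Curtis similarity computation referred to above follows; the species list and abundance values are hypothetical, not taken from the study.

```python
import numpy as np

def bray_curtis_similarity(x, y):
    """Bray-Curtis similarity (1 - dissimilarity) between two
    abundance vectors over the same ordered species list."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    denom = (x + y).sum()
    if denom == 0:
        return 0.0
    return 1.0 - np.abs(x - y).sum() / denom

# Hypothetical abundance counts for two sediment samples
# (same species order in both vectors, e.g. Ensis first).
sample_a = [12, 3, 7, 0, 1]
sample_b = [8, 5, 2, 1, 0]
print(f"Bray-Curtis similarity: {bray_curtis_similarity(sample_a, sample_b):.3f}")
```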
Abstract:
A method for the simulation of acoustical bores, useful in the context of sound synthesis by physical modeling of woodwind instruments, is presented. As with previously developed methods, such as digital waveguide modeling (DWM) [Smith, Comput. Music J. 16, pp. 74-91 (1992)] and the multiconvolution algorithm (MCA) [Martinez et al., J. Acoust. Soc. Am. 84, pp. 1620-1627 (1988)], the approach is based on a one-dimensional model of wave propagation in the bore. Both the DWM method and the MCA explicitly compute the transmission and reflection of wave variables that represent actual traveling pressure waves. The method presented here, the wave digital modeling (WDM) method, avoids the typical limitations associated with these methods by using a more general definition of the wave variables. An efficient and spatially modular discrete-time model is constructed from the digital representations of elemental bore units such as cylindrical sections, conical sections, and toneholes. Frequency-dependent phenomena, such as boundary losses, are approximated with digital filters. The stability of a simulation of a complete acoustic bore is investigated empirically. Simulation results for a full clarinet show very good agreement with classical transmission-line theory.
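For readers unfamiliar with the traveling-wave formulation that DWM, the MCA and WDM share, the following is a minimal digital-waveguide-style sketch of a single lossless cylindrical bore section in Python; the delay length, reflection coefficients and impulse excitation are illustrative assumptions, not the WDM method described in the abstract.

```python
import numpy as np

# Two delay lines carry the right- and left-going pressure waves of one
# lossless cylindrical section, with frequency-independent reflection at
# both ends. The WDM method generalizes the wave variables and replaces
# these constant reflections with digital filters for boundary losses.
fs = 44100                      # sample rate (Hz), assumed
length = 50                     # delay-line length in samples (sets pitch)
r_mouth, r_bell = 0.9, -0.95    # reflection coefficients, assumed values

right = np.zeros(length)        # right-going wave (mouth -> bell)
left = np.zeros(length)        # left-going wave  (bell -> mouth)
out = []

for n in range(2000):
    excitation = 1.0 if n == 0 else 0.0   # impulse excitation at the mouth
    bell_in = r_bell * right[-1]          # reflection at the open bell
    mouth_in = r_mouth * left[0] + excitation  # reflection at the mouth
    # Advance both waves by one sample.
    right = np.concatenate(([mouth_in], right[:-1]))
    left = np.concatenate((left[1:], [bell_in]))
    out.append(right[-1] + left[-1])      # pressure at the bell

print("first 10 output samples:", np.round(out[:10], 3))
```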
Abstract:
Quality of life is increasingly being recognized as an important outcome measure that needs to be considered by social workers. However, there does not appear to be a clear consensus about the definition of quality of life. In addition, social workers are likely to experience difficulty choosing and applying an appropriate measurement instrument, because of the many available instruments purporting to assess quality of life. This paper discusses the definition of health-related quality of life and explains the main measurement properties that must be appraised when considering whether or not an instrument is appropriate. The paper will assist social workers to make an informed choice about measures of health-related quality of life.
Abstract:
Increasing emphasis is being placed on the evaluation of health-related quality of life. However, there is no consensus on the definition of this concept and, as a result, there is a plethora of existing measurement instruments. Head-to-head comparisons of the psychometric properties of existing instruments are necessary to facilitate evidence-based decisions about which instrument should be chosen for routine use. Therefore, an individualised instrument (the modified Patient Generated Index), a generic instrument (the Short Form 36) and a disease-specific instrument (the Quality of Life after Myocardial Infarction questionnaire) were administered to patients with ischaemic heart disease (n=117), and the evidence for the validity, reliability and sensitivity of each instrument was examined and compared. The modified Patient Generated Index compared favourably with the other instruments, but none of the instruments examined provided sound evidence for sensitivity to change. Therefore, any recommendation for the use of the individualised approach in the routine collection of health-related quality of life data in clinical practice must be conditional upon the submission of further evidence to support the sensitivity of such instruments.
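The abstract does not state which sensitivity-to-change statistic was examined; one commonly used option is the standardized response mean, sketched below with hypothetical scores.

```python
import numpy as np

def standardized_response_mean(baseline, followup):
    """SRM = mean of change scores / SD of change scores.
    One common index of an instrument's sensitivity to change;
    not necessarily the statistic used in the study above."""
    change = np.asarray(followup, float) - np.asarray(baseline, float)
    return change.mean() / change.std(ddof=1)

# Hypothetical questionnaire scores for five patients at two time points
baseline = [45, 52, 38, 60, 41]
followup = [50, 55, 37, 66, 48]
print(f"SRM: {standardized_response_mean(baseline, followup):.2f}")
```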
Abstract:
This project investigates the English-language life writing of diasporic Iranian Jewish women. It examines how these women have differentially imagined their diasporic lives and travels, and how they have in turn been imagined and accepted or rejected by their audiences. In the first chapter, I use “home” as a lens for understanding three distinct life writing texts, showing how the authors write about what it means to have a home and to be at home in contrasting and even contradictory ways. I show how, despite potential hegemonic readings that perpetuate unequal relationships and a normative definition of the ideal home, the texts are open to multiple contestatory readings that create spaces for new formulations and understandings. In the second chapter, I look more closely at the intersections between trauma stories and the life writing of Iranian Jewish women, and I argue that readers use life writing texts about trauma to support an egocentric reconstruction of American democracy and dominance. I also show how a critical frame for understanding trauma can yield interpretations that highlight, rather than ignore, relationships of power and privilege. In the final chapter of the thesis, I present a case study of two online reading groups, and I show that communal reading environments, though they participate in dominant discourses, are also spaces where resistance and subversion can develop.
Abstract:
This article examines Pierre Bourdieu's sociology of the economy and his more recent politically engaged interventions on 'globalisation'. Many scholars regard these as not being in the same academic league as his classic studies on taste, academia, state elites and so on, and instead dismiss them as a private matter or even as the spleen of Pierre Bourdieu, the individual. This paper questions this disjunction between the 'academic' and 'politically engaged' sides of Pierre Bourdieu's work. First, it argues that his most recent interventions against neo-liberal globalisation were the logical result of a particular definition of intellectual practice that he had outlined earlier in his sociology of the intellectual field. It then demonstrates that Bourdieu's economic sociology and critique of contemporary capitalism not only do not contradict his earlier research, but provide valuable and original insights into the current transformation of the political economy of the advanced capitalist countries. The paper concludes by suggesting how the theoretical foundation of Bourdieu's analysis of contemporary capitalism might be strengthened by relating it to, and making it compatible with, alternative approaches in the tradition of critical political economy.
Abstract:
Background and purpose: Radiotherapy is widely used to palliate local symptoms in non-small-cell lung cancer. Using conventional X-ray simulation, it is often difficult to localize the extent of the tumour accurately. We report a randomized, double-blind trial comparing target localization with conventional and virtual simulation.
Methods: Eighty-six patients underwent both conventional and virtual simulation. The conventional simulator films were compared with digitally reconstructed radiographs (DRRs) produced from the computed tomography (CT) data. The treatment fields defined by the clinicians using each modality were compared in terms of field area, position and the implications for target coverage.
Results: Comparing fields defined by each study arm, there was a major mismatch in coverage between fields in 66.2% of cases, and a complete match in only 5.2% of cases. In 82.4% of cases, conventional simulator fields were larger (mean 24.5 ± 5.1% (95% confidence interval)) than CT-localized fields, potentially contributing to a mean target under-coverage of 16.4 ± 3.5% and normal tissue over-coverage of 25.4 ± 4.2%.
Conclusions: CT localization and virtual simulation allow more accurate definition of the target volume. This could enable a reduction in geographical misses, while also reducing treatment-related toxicity.
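As an illustration of the coverage arithmetic reported above, the following sketch computes under- and over-coverage percentages for two simplified rectangular fields; the coordinates are hypothetical and the study's actual field shapes and definitions are not reproduced.

```python
def rect_area(r):
    """Area of an axis-aligned field (x0, y0, x1, y1) in cm^2."""
    return max(0.0, r[2] - r[0]) * max(0.0, r[3] - r[1])

def overlap(a, b):
    """Intersection rectangle of two axis-aligned fields."""
    return (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))

# Hypothetical conventional-simulator and CT-localized fields (cm)
conventional = (0.0, 0.0, 10.0, 12.0)
ct_localized = (1.0, 1.5, 9.5, 13.0)

shared = rect_area(overlap(conventional, ct_localized))
target = rect_area(ct_localized)   # CT-localized field taken as the target
under_coverage = 100 * (1 - shared / target)
over_coverage = 100 * (rect_area(conventional) - shared) / target

print(f"target under-coverage: {under_coverage:.1f}%")
print(f"normal-tissue over-coverage: {over_coverage:.1f}%")
```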
Abstract:
Genome-scale metabolic models promise important insights into cell function. However, the definition of pathways and functional network modules within these models, and in the biochemical literature in general, is often based on intuitive reasoning. Although mathematical methods have been proposed to identify modules, which are defined as groups of reactions with correlated fluxes, there is a need for experimental verification. We show here that multivariate statistical analysis of the NMR-derived intra- and extracellular metabolite profiles of single-gene deletion mutants in specific metabolic pathways in the yeast Saccharomyces cerevisiae identified outliers whose profiles were markedly different from those of the other mutants in their respective pathways. Application of flux coupling analysis to a metabolic model of this yeast showed that the deleted gene in an outlying mutant encoded an enzyme that was not part of the same functional network module as the other enzymes in the pathway. We suggest that metabolomic methods such as this one, which do not require any knowledge of how a gene deletion might perturb the metabolic network, provide an empirical means of validating and ultimately refining the predicted network structure.
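In the same spirit as the multivariate outlier screening described above (though not the study's actual analysis), the following sketch flags outlying mutants from PCA scores computed on randomly generated stand-in metabolite profiles.

```python
import numpy as np

# Rows = deletion mutants, columns = metabolite intensities.
# The data are random stand-ins for NMR-derived profiles.
rng = np.random.default_rng(0)
profiles = rng.normal(size=(20, 50))
profiles[7] += 3.0                     # plant one artificial outlier

X = profiles - profiles.mean(axis=0)   # mean-centre the profiles
# PCA via SVD; keep the first two principal-component scores.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * s[:2]

# Flag mutants whose score-space distance from the centroid is extreme.
dist = np.linalg.norm(scores, axis=1)
cutoff = dist.mean() + 2 * dist.std()
print("outlier mutants:", np.flatnonzero(dist > cutoff))
```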
Abstract:
This article aims to reassess F. Scott Fitzgerald’s classic The Great Gatsby (1925), taking into consideration the myth-critical hypotheses of philosopher René Girard. Specifically, this essay will analyse the concepts of mimetic desire, resentment and reprisal violence as emotional components of myth, paying close attention to how the reinterpreted mythical pattern of the novel influences the depiction of such emotions as social traits of corruption. Finally, this article will challenge interpretations that have regarded Gatsby as a successful scapegoat-figure, examining instead how the mythical meanings and structures of the text stage an emotional crisis of frustrated desire and antagonism that ultimately offers no hope of communal restoration.
Abstract:
This study highlights how heuristic evaluation, a usability evaluation method, can feed into current building design practice to conform to universal design principles. It provides a definition of universal usability that is applicable to an architectural design context. It takes the seven universal design principles as a set of heuristics and applies an iterative sequence of heuristic evaluation in a shopping mall, aiming to achieve a cost-effective evaluation process. The evaluation was composed of three consecutive sessions. First, five evaluators from different professions were interviewed about the construction drawings in terms of universal design principles. Then, each evaluator was asked to perform the predefined task scenarios. In subsequent interviews, the evaluators were asked to re-analyze the construction drawings. The results showed that heuristic evaluation could successfully integrate universal usability into current building design practice in two ways: (i) it promoted an iterative, multi-session evaluation process rather than relying on one evaluator and one evaluation session to find the maximum number of usability problems, and (ii) it highlighted the necessity of an interdisciplinary ad hoc committee, given the heuristic abilities of each profession. A multi-session, interdisciplinary heuristic evaluation method can save both project budget and time, while ensuring a reduced error rate for the universal usage of built environments.
Abstract:
Purpose – The purpose of this paper is to present an analysis of the media representation of business ethics within 62 international newspapers, in order to explore the longitudinal and contextual evolution of business ethics and associated terminology. Levels of coverage and contextual analysis of the content of the articles are used as surrogate measures of the penetration of business ethics concepts into society.
Design/methodology/approach – This paper uses a text mining application based on two samples of data: analysis of 62 national newspapers in 21 countries from 1990 to 2008, and analysis of the content of two samples of articles containing the term business ethics (comprising 100 newspaper articles spread over an 18-year period from a sample of US and UK newspapers).
Findings – The paper demonstrates increased coverage of sustainability topics within the media over the last 18 years, associated with events such as the Rio Summit. Whilst some peaks are associated with business ethics scandals, the overall coverage remains steady. There is little apparent use in the media of concepts such as corporate citizenship. The academic community and company ethical codes appear to adopt a wider definition of business ethics, more akin to that associated with sustainability, than the focus taken by the media, especially in the USA. Coverage demonstrates clear regional bias, and contextual analysis of the articles in the UK and USA also shows interesting parallels and divergences in the media representation of business ethics.
Originality/value – The paper offers a promising avenue for exploring how the evolution of sustainability issues, including business ethics, can be tracked within a societal context.
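A minimal sketch of the longitudinal coverage measure described above, counting the articles per year that mention a term; the records are illustrative stand-ins for a newspaper corpus.

```python
from collections import Counter

# Illustrative stand-ins for newspaper articles with publication years.
articles = [
    {"year": 1992, "text": "Business ethics after the Rio Summit..."},
    {"year": 2002, "text": "Accounting scandal revives business ethics debate"},
    {"year": 2002, "text": "Sustainability reporting gains ground"},
]

term = "business ethics"
# Count matching articles per year as a simple coverage time series.
coverage = Counter(a["year"] for a in articles if term in a["text"].lower())
for year in sorted(coverage):
    print(year, coverage[year])
```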
Abstract:
The following study considers the fragmentation of law which occurred in 1956 with regard to the law on servitude. As States were unwilling to go as far as the Universal Declaration of Human Rights in establishing that "no one shall be held in [...] servitude", the negotiators of the 1956 Supplementary Convention moved to expunge the very term 'servitude' from the text and to replace it with the phrase 'institutions and practices similar to slavery', which could then be abolished 'progressively and as soon as possible'. The negotiation history of the 1956 Convention clearly demonstrates that the Universal Declaration of Human Rights was the elephant in the room, and that it ultimately led to a fragmentation of the law as between general international law, manifest in the 1956 Supplementary Convention, on the one hand and international human rights law on the other. It is for this reason that, for instance, the 2000 UN and 2005 Council of Europe trafficking conventions mention both 'practices similar to slavery' and 'servitude' as types of human exploitation to be suppressed in their definition of 'trafficking in persons'.
Abstract:
The introduction of functional data into the radiotherapy treatment planning process is currently the focus of significant commercial, technical, scientific and clinical development. The potential of such data from positron emission tomography (PET) was recognized at an early stage and was integrated into the radiotherapy treatment planning process through the use of image fusion software. The combination of PET and CT in a single system (PET/CT), forming an inherently fused anatomical and functional dataset, has provided an imaging modality which could be used as the prime tool in the delineation of tumour volumes and the preparation of patient treatment plans, especially when integrated with virtual simulation. PET imaging, typically using 18F-fluorodeoxyglucose (18F-FDG), can provide data on metabolically active tumour volumes. These functional data have the potential to modify treatment volumes and to guide treatment delivery to cells with particular metabolic characteristics. This paper reviews the current status of the integration of PET and PET/CT data into the radiotherapy treatment process. Consideration is given to the requirements of PET/CT data acquisition, with reference to patient positioning aids and the limitations imposed by the PET/CT system. It also reviews the approaches being taken to the definition of functional/tumour volumes and the mechanisms available to measure and include physiological motion in the imaging process. The use of PET data must be based upon a clear understanding of the interpretation and limitations of the functional signal. Protocols for the implementation of this development remain to be defined, and outcome data based upon clinical trials are still awaited.
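As one simple illustration of functional volume definition, the following sketch thresholds a synthetic PET uptake volume at a fixed fraction of its maximum; the 40% level, grid and voxel size are assumptions, and the review above discusses a range of delineation approaches.

```python
import numpy as np

# Synthetic stand-in for a PET standardized-uptake-value (SUV) volume.
rng = np.random.default_rng(1)
suv = rng.gamma(shape=2.0, scale=0.5, size=(32, 32, 16))  # background uptake
suv[12:18, 12:18, 6:10] += 6.0                            # synthetic hot lesion

# Fixed-fraction-of-maximum thresholding, one common delineation heuristic.
threshold = 0.40 * suv.max()
active = suv >= threshold
voxel_volume_ml = 0.4 * 0.4 * 0.4    # assumed voxel size of 0.4 cm per side
print(f"metabolically active volume: {active.sum() * voxel_volume_ml:.1f} ml")
```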
Abstract:
The concept of governance has been widely discussed in both the business and non-business sectors. The debate has also been entered into within the charity sector, which comprises over 169,000 organizations in the UK. The UK-based Charity Commission, which describes itself as existing to ‘promote sound governance and accountability’, has taken a lead in this debate by promoting greater regulation and producing numerous recommendations with regard to the proper governance of charitable organizations. However, the concept of what is meant by governance is unclear and a myriad of ideas are placed under the umbrella of ‘good governance’. This paper explores the major themes that form the basis of much of this discussion, examining both the theoretical underpinnings and empirical investigations relating to this area (looking from the perspective of the key stakeholders in the charity sector). Based on an analysis of the extant literature, this paper presents a broad definition of governance with respect to charities and outlines a future research agenda for those interested in adding to knowledge in this area.
Abstract:
Changes to software requirements occur during initial development and subsequent to delivery, posing a risk to cost and quality while at the same time providing an opportunity to add value. Provision of a generic change source taxonomy will support requirements change risk visibility, and also facilitate richer recording of both pre- and post-delivery change data. In this paper we present a collaborative study to investigate and classify sources of requirements change, drawing comparison between those pertaining to software development and maintenance. We begin by combining evolution, maintenance and software lifecycle research to derive a definition of software maintenance, which provides the foundation for empirical context and comparison. Previously published change ‘causes’ pertaining to development are elicited from the literature, consolidated using expert knowledge and classified using card sorting. A second study incorporating causes of requirements change during software maintenance results in a taxonomy which accounts for the entire evolutionary progress of applications software. We conclude that the distinction between the terms maintenance and development is imprecise, and that changes to requirements in both scenarios arise due to a combination of factors contributing to requirements uncertainty and events that trigger change. The change trigger taxonomy constructs were initially validated using a small set of requirements change data, and deemed sufficient and practical as a means to collect common requirements change statistics across multiple projects.