109 results for sets of words
Abstract:
In 2003 the European Commission started using Impact Assessment (IA) as the main empirical basis for its major policy proposals. The aim was to systematically assess ex ante the economic, social and environmental impacts of EU policy proposals. In parallel, research proliferated in search of theoretical grounds for IAs and in an attempt to evaluate empirically the performance of the first sets of IAs produced by the European Commission. This paper combines conceptual and evaluative studies carried out in the first five years of EU IAs. It concludes that the great discrepancy between rationale and practice calls for a different theoretical focus and greater emphasis on empirically evaluating crucial risk-economics aspects of IAs, such as the value of a statistical life, the price of carbon, and the integration of macroeconomic modelling and scenario analysis.
Abstract:
Hocaoglu MB, Gaffan EA, Ho AK. The Huntington's disease health-related quality of life questionnaire: a disease-specific measure of health-related quality of life. Huntington's disease (HD) is a genetic neurodegenerative disorder characterized by motor, cognitive and psychiatric disturbances, and yet there is no disease-specific patient-reported health-related quality of life outcome measure for patients. Our aim was to develop and validate such an instrument, i.e. the Huntington's Disease health-related Quality of Life questionnaire (HDQoL), to capture the true impact of living with this disease. Semi-structured interviews were conducted with the full spectrum of people living with HD, to form a pool of items, which were then examined in a larger sample prior to data-driven item reduction. We provide the statistical basis for the extraction of three different sets of scales from the HDQoL, and present validation and psychometric data on these scales using a sample of 152 participants living with HD. These new patient-derived scales provide promising patient-reported outcome measures for HD.
Abstract:
Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be either because they represent what the author believes a paper is about rather than what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g., “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work undertaken examines two different methods of producing keywords and compares the outcomes across multiple experimental strands. The primary method takes n-grams of the source document phrases and examines their synonyms, while the secondary method groups outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined, and it is concluded that the more entries a thesaurus has, the better it is likely to perform. Neither the age of the thesaurus nor the size of each entry correlates with performance.
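The primary method described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation: the toy `THESAURUS` entries and the frequency-based scoring rule are assumptions made purely for the example.

```python
from collections import Counter
from itertools import islice

# Toy thesaurus standing in for the freely available thesauri used in the
# paper; the entries below are invented for illustration.
THESAURUS = {
    "data": {"information", "facts"},
    "mining": {"extraction", "excavation"},
    "knowledge": {"information", "understanding"},
    "discovery": {"detection", "finding"},
}

def ngrams(tokens, n):
    """Yield consecutive n-grams from a token list."""
    return zip(*(islice(tokens, i, None) for i in range(n)))

def candidate_themes(text, n=2):
    """Primary-method sketch: take n-grams of the source text, expand each
    word through its synonyms, and count how often each synonym is touched.
    High counts suggest underlying themes of the document."""
    tokens = text.lower().split()
    counts = Counter()
    for gram in ngrams(tokens, n):
        for word in gram:
            for syn in THESAURUS.get(word, ()):
                counts[syn] += 1
    return counts.most_common()

themes = candidate_themes("knowledge discovery rests on data mining and data analysis")
# "information" scores highest because both "data" and "knowledge" map to it.
```
A real run would substitute a full thesaurus and a tokenizer that handles punctuation; the grouping-by-synonym secondary method would cluster the resulting keyphrases by shared synonym sets.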
Abstract:
Remotely sensed, multiannual data sets of shortwave radiative surface fluxes are now available for assimilation into land surface schemes (LSSs) of climate and/or numerical weather prediction models. The RAMI4PILPS suite of virtual experiments assesses the accuracy and consistency of the radiative transfer formulations that provide the magnitudes of absorbed, reflected, and transmitted shortwave radiative fluxes in LSSs. RAMI4PILPS evaluates models under perfectly controlled experimental conditions in order to eliminate uncertainties arising from incomplete or erroneous knowledge of the structural, spectral and illumination-related canopy characteristics typical of model comparisons with in situ observations. More specifically, the shortwave radiation is separated into visible and near-infrared spectral regions, and the quality of the simulated radiative fluxes is evaluated by direct comparison with a 3-D Monte Carlo reference model identified during the third phase of the Radiation transfer Model Intercomparison (RAMI) exercise. The RAMI4PILPS setup thus makes it possible to focus on the numerical accuracy of shortwave radiative transfer formulations and to pinpoint areas where future model improvements should concentrate. The impact of increasing degrees of structural and spectral subgrid variability on the simulated fluxes is documented, and the relevance of any emerging biases with respect to gross primary production estimates and shortwave radiative forcings due to snow and fire events is investigated.
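One basic consistency property implied by this kind of evaluation is flux closure: in each spectral band, the absorbed, reflected and transmitted fluxes a scheme reports should sum to the incident flux. The sketch below illustrates such a check; the flux values are invented for the example and are not RAMI4PILPS reference data.

```python
def flux_closure_error(incident, absorbed, reflected, transmitted):
    """Relative closure error of a simulated shortwave flux budget:
    |incident - (absorbed + reflected + transmitted)| / incident."""
    return abs(incident - (absorbed + reflected + transmitted)) / incident

# Hypothetical fluxes (W m^-2) for the two spectral regions used in the
# RAMI4PILPS setup (visible and near-infrared).
bands = {
    "visible":       dict(incident=450.0, absorbed=382.5, reflected=27.0, transmitted=40.5),
    "near-infrared": dict(incident=350.0, absorbed=189.0, reflected=101.5, transmitted=59.5),
}

errors = {name: flux_closure_error(**f) for name, f in bands.items()}
# Both budgets above close exactly, so both errors are zero.
```
A scheme whose closure error is nonzero is leaking or creating radiative energy numerically, independent of how well it matches the Monte Carlo reference.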
Abstract:
This study compares two sets of measurements of the composition of bulk precipitation and throughfall at a site in southern England with a 20-year gap between them. During this time, SO2 emissions from the UK fell by 82%, NOx emissions by 35% and NH3 emissions by 7%. These reductions were partly reflected in bulk precipitation, with deposition reductions of 56% in SO4, 38% in NO3, 32% in NH4, and 73% in H+. In throughfall under Scots pine, the effects were more dramatic, with an 89% reduction in SO4 deposition and a 98% reduction in H+ deposition. The mean pH under these trees increased from 2.85 to 4.30. Nitrate and ammonium deposition in throughfall increased slightly, however. In the earlier period, the Scots pines were unable to neutralise the high flux of acidity associated with sulphur deposition, even though this was not a highly polluted part of the UK, and deciduous trees (oak and birch) were only able to neutralise it in summer when the leaves were present. In the later period, the sulphur flux had reduced to the point where the acidity could be neutralised by all species; the neutralisation mechanism is thus likely to be largely leaching of base cations and buffering substances from the foliage. The high fluxes are partly due to the fact that these are 60–80 year old trees growing in an open forest structure. The increase in NO3 and NH4 in throughfall in spite of decreased deposition seems likely to be due to a decrease in foliar uptake, perhaps reflecting the increasing nitrogen saturation of the catchment soils. These changes may increase the rate of soil microbial activity as nitrogen increases and acidity declines, with consequent effects on the water quality of the catchment drainage stream.
Abstract:
Reductions in the division of labour are a significant feature of modern developments in work organisation. It has been recognised that a reduced division of labour can have the advantages of job enrichment and lower coordination costs. In this paper it is shown how advantages from a lesser division of labour can stem from the flow of work between different sets of resources where the work rates of individual production stages are subject to uncertainties. Both process and project-based work are considered. Implications for the boundaries of the firm and for innovation processes are noted.
Abstract:
1. Nutrient concentrations (particularly N and P) determine the extent to which water bodies are or may become eutrophic. Direct determination of nutrient content on a wide scale is labour intensive but the main sources of N and P are well known. This paper describes and tests an export coefficient model for prediction of total N and total P from: (i) land use, stock headage and human population; (ii) the export rates of N and P from these sources; and (iii) the river discharge. Such a model might be used to forecast the effects of changes in land use in the future and to hindcast past water quality to establish comparative or baseline states for the monitoring of change. 2. The model has been calibrated against observed data for 1988 and validated against sets of observed data for a sequence of earlier years in ten British catchments varying from uplands through rolling, fertile lowlands to the flat topography of East Anglia. 3. The model predicted total N and total P concentrations with high precision (95% of the variance in observed data explained). It has been used in two forms: the first on a specific catchment basis; the second for a larger natural region which contains the catchment, with the assumption that all catchments within that region will be similar. Both models gave similar results with little loss of precision in the latter case. This implies that it will be possible to describe the overall pattern of nutrient export in the UK with only a fraction of the effort needed to carry out the calculations for each individual water body.
4. Comparison between land use, stock headage, population numbers and nutrient export for the ten catchments in the pre-war year of 1931, and for 1970 and 1988, shows that there has been a substantial loss of rough grazing to fertilized temporary and permanent grasslands, an increase in the hectarage devoted to arable crops, consistent increases in the stocking of cattle and sheep, and a marked movement of humans to these rural catchments. 5. All of these trends have increased the flows of nutrients, with more than a doubling of both total N and total P loads during the period. On average in these rural catchments, stock wastes have been the greatest contributors to both N and P exports, with cultivation the next most important source of N and people of P. Ratios of N to P were high in 1931 and remain little changed, so that, in these catchments, phosphorus continues to be the nutrient most likely to control algal crops in standing waters supplied by the rivers studied.
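The export coefficient calculation described in point 1 can be sketched as follows: the annual load is the sum over sources of coefficient times source size, and the flow-weighted concentration is load divided by discharge. All coefficients, source sizes and the discharge figure below are invented for illustration; they are not the calibrated values from the paper.

```python
# Illustrative export coefficients: kg N per unit per year (invented figures).
EXPORT_COEFF_N = {
    "arable_ha": 20.0,
    "grassland_ha": 6.0,
    "cattle_head": 9.0,
    "sheep_head": 2.0,
    "person": 2.5,
}

def total_load(sources, coefficients):
    """Annual nutrient load (kg) from land use areas, stock headage and
    human population: sum of coefficient x source size."""
    return sum(coefficients[k] * sources[k] for k in sources)

def mean_concentration(load_kg, discharge_m3):
    """Flow-weighted mean concentration in mg/L (1 kg/m^3 = 1e6 mg/m^3)."""
    return load_kg * 1000.0 / discharge_m3

# Hypothetical catchment: areas in ha, stock and people in head.
catchment = {"arable_ha": 3000, "grassland_ha": 5000, "cattle_head": 2000,
             "sheep_head": 8000, "person": 4000}

load_n = total_load(catchment, EXPORT_COEFF_N)   # kg N per year
conc_n = mean_concentration(load_n, 2.0e7)       # mg N per litre at 2e7 m^3/yr
```
Hindcasting, as described above, would amount to re-running `total_load` with the 1931 or 1970 land use, headage and population figures while keeping the calibrated coefficients.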
Abstract:
In 'Avalanche', an object is lowered, players staying in contact throughout. Normally the task is easily accomplished. However, with larger groups counter-intuitive behaviours appear. The paper proposes a formal theory for the underlying causal mechanisms. The aim is not only to provide an explicit, testable hypothesis for the source of the observed modes of behaviour, but also to exemplify the contribution that formal theory building can make to understanding complex social phenomena. Mapping reveals the importance of geometry to the Avalanche game; each player has a pair of balancing loops, one involved in lowering the object, the other ensuring contact. For more players, sets of balancing loops interact and these can allow dominance by reinforcing loops, causing the system to chase upwards towards an ever-increasing goal. However, a series of other effects concerning human physiology and behaviour (HPB) is posited as playing a role. The hypothesis is therefore rigorously tested using simulation. For simplicity a 'One Degree of Freedom' case is examined, allowing all of the effects to be included whilst rendering the analysis more transparent. Formulation and experimentation with the model gives insight into the behaviours. Multi-dimensional rate/level analysis indicates that there is only a narrow region in which the system is able to move downwards. Model runs reproduce the single 'desired' mode of behaviour and all three of the observed 'problematic' ones. Sensitivity analysis gives further insight into the system's modes and their causes. Behaviour is seen to arise only when the geometric effects apply (number of players greater than degrees of freedom of object) in combination with a range of HPB effects. An analogy exists between the co-operative behaviour required here and various examples: conflicting strategic objectives in organizations; Prisoners' Dilemma and integrated bargaining situations.
Additionally, the game may be relatable in more direct algebraic terms to situations involving companies in which the resulting behaviours are mediated by market regulations. Finally, comment is offered on the inadequacy of some forms of theory building and the case is made for formal theory building involving the use of models, analysis and plausible explanations to create deep understanding of social phenomena.
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories. This is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.
Abstract:
The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform data mining and other analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data that is used to populate the second component, and a data warehouse that contains important molecular properties. These properties may be used for data mining studies. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: firstly, how grid data management technologies can be used to access the distributed data warehouses; and secondly, how the grid can be used to transfer analysis programs to the primary repositories — this is an important and challenging aspect of P-found, due to the large data volumes involved and the desire of scientists to maintain control of their own data. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling scientific discovery.
Abstract:
There have been two kinds of study of ancient beliefs in the earlier prehistory of Scandinavia. One considers the impact of ideas which originated further to the south and east, and centres on a cosmology based on the movements of the sun. A second tradition develops out of the ethnography of the circumpolar region and combines archaeological evidence with the beliefs of hunter-gatherers. It postulates the existence of a three-tier cosmology in which people could communicate between different worlds. This paper argues that certain elements that are thought to epitomize the ‘Southern’ system might have been suggested by existing ideas within Scandinavia itself. Both sets of beliefs came to influence one another, but they became increasingly distinct towards the end of the Bronze Age. This paper reconsiders the rock carvings, metalwork and mortuary cairns of that period and the Iron Age in relation to the process of religious change.
Abstract:
It has long been supposed that preference judgments between sets of to-be-considered possibilities are made by means of initially winnowing down the most promising-looking alternatives to form smaller “consideration sets” (Howard, 1963; Wright & Barbour, 1977). In preference choices with more than two options, it is standard to assume that a “consideration set”, based upon some simple criterion, is established to reduce the options available. Inferential judgments, in contrast, have more frequently been investigated in situations in which only two possibilities need to be considered (e.g., which of these two cities is the larger?). Proponents of the “fast and frugal” approach to decision-making suggest that such judgments are also made on the basis of limited, simple criteria. For example, if only one of two cities is recognized and the task is to judge which city has the larger population, the recognition heuristic states that the recognized city should be selected. A multinomial processing tree model is outlined which provides the basis for estimating the extent to which recognition is used as a criterion in establishing a consideration set for inferential judgments between three possible options.
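The role of recognition in forming a consideration set can be sketched as a simple decision rule. This is a deliberately minimal stand-in for illustration, not the multinomial processing tree model itself; the fictional city names and the random tie-breaking step are assumptions made for the example.

```python
import random

def consideration_set(options, recognized):
    """Winnow options down to those that are recognized, following the
    recognition heuristic; if none are recognized, every option stays
    in play and some other cue must decide."""
    considered = [o for o in options if o in recognized]
    return considered if considered else list(options)

def choose(options, recognized, rng=random):
    """Pick from the consideration set. Random choice here stands in for
    whatever further knowledge-based cues a real judge would apply."""
    return rng.choice(consideration_set(options, recognized))

cities = ["Avonmouth", "Barchester", "Casterbridge"]   # fictional names
known = {"Barchester"}
pick = choose(cities, known)   # only the recognized city survives winnowing
```
With three options, the interesting cases the tree model must disentangle are exactly those where one or two options are recognized, since full recognition and zero recognition leave the heuristic nothing to discriminate.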
Abstract:
This conference was an unusual and interesting event. Celebrating 25 years of Construction Management and Economics provides us with an opportunity to reflect on the research that has been reported over the years, to consider where we are now, and to think about the future of academic research in this area. Hence the sub-title of this conference: “past, present and future”. Looking through these papers, some things are clear. First, the range of topics considered interesting has expanded hugely since the journal was first published. Second, the research methods are also more diverse. Third, the involvement of wider groups of stakeholders is evident. There is a danger that this might lead to dilution of the field. But my instinct has always been to argue against the notion that Construction Management and Economics represents a discipline, as such. Granted, there are plenty of university departments around the world that would justify the idea of a discipline. But the vast majority of academic departments that contribute to the life of this journal carry different names. Indeed, the range and breadth of methodological approaches to the research reported in Construction Management and Economics indicates that there are several different academic disciplines being brought to bear on the construction sector. Some papers are based on economics, some on psychology and others on operational research, sociology, law, statistics, information technology, and so on. This is why I maintain that construction management is not an academic discipline, but a field of study to which a range of academic disciplines are applied. This may be why it is so interesting to be involved in this journal. The problems to which the papers are applied develop and grow. But the broad topics of the earliest papers in the journal are still relevant today.
What has changed a lot is our interpretation of the problems that confront the construction sector all over the world, and the methodological approaches to resolving them. There is a constant difficulty in dealing with topics as inherently practical as these. While the demands of the academic world are driven by the need for the rigorous application of sound methods, the demands of the practical world are quite different. It can be difficult to meet the needs of both sets of stakeholders at the same time. However, increasing numbers of postgraduate courses in our area result in larger numbers of practitioners with a deeper appreciation of what research is all about, and how to interpret and apply the lessons from research. It also seems that there are contributions coming not just from construction-related university departments, but also from departments with identifiable methodological traditions of their own. I like to think that our authors can publish in journals beyond the construction-related areas, to disseminate their theoretical insights into other disciplines, and to contribute to the strength of this journal by citing our articles in more mono-disciplinary journals. This would contribute to the future of the journal in a very strong and developmental way. The greatest danger we face is in excessive self-citation, i.e. referring only to sources within the CM&E literature or, worse, referring only to other articles in the same journal. The only way to ensure a strong and influential position for journals and university departments like ours is to be sure that our work is informing other academic disciplines. This is what I would see as the future, our logical next step. 
If, as a community of researchers, we are not producing papers that challenge and inform the fundamentals of research methods and analytical processes, then no matter how practically relevant our output is to the industry, it will remain derivative and secondary, based on the methodological insights of others. The balancing act between methodological rigour and practical relevance is a difficult one, but not, of course, a balance that has to be struck in every single paper.
Abstract:
Fiona Ross and Tim Holloway (co-designers) were commissioned by the renowned newspaper and publishing house Anandabazar Patrika (ABP) to design a new low-contrast typeface in a contemporary style for print and screen use in its publications. Ross and Holloway designed ABP's Bengali house typeface (Linotype Bengali, the first digital Bengali font), which has been in daily use in its newspaper since 1982. The design team was augmented by Neelakash Kshetrimayum, with OpenType production undertaken by John Hudson. This Bengali typeface is the first fully functional OpenType design for the script. It demonstrates innovative features that resolve problems which hitherto hindered the successful execution of low-contrast Bengali text fonts: this connecting script of over 450 characters has deep verticals, spiralling strokes, wide characters, and intersecting ascenders. The new design removes the need for wide interlinear spacing and sets more words to the line than was previously possible. The project therefore combines aesthetic, technical and linguistic skills and is highly visible, in print and online, in the newspapers of the largest newspaper group and publishing house in West Bengal. The design and development of Sarkar has positive implications for other non-Latin script designs, just as the Linotype Bengali typeface formed the blueprint for new non-Latin designs three decades ago. Sarkar was released on 31 August 2012 with the launch of Anandabazar Patrika's new newspaper Ebela.
Abstract:
A version of the Canadian Middle Atmosphere Model that is coupled to an ocean is used to investigate the separate effects of climate change and ozone depletion on the dynamics of the Southern Hemisphere (SH) stratosphere. This is achieved by performing three sets of simulations extending from 1960 to 2099: 1) greenhouse gases (GHGs) fixed at 1960 levels and ozone-depleting substances (ODSs) varying in time, 2) ODSs fixed at 1960 levels and GHGs varying in time, and 3) both GHGs and ODSs varying in time. The response of various dynamical quantities to the GHG and ODS forcings is shown to be additive; that is, trends computed from the sum of the first two simulations are equal to trends from the third. Additivity is shown to hold for the zonal mean zonal wind and temperature, the mass flux into and out of the stratosphere, and the latitudinally averaged wave drag in SH spring and summer, as well as for final warming dates. Ozone depletion and recovery cause seasonal changes in lower-stratosphere mass flux, with reduced polar downwelling in the past followed by increased downwelling in the future in SH spring, and the reverse in SH summer. These seasonal changes are attributed to changes in wave drag caused by ozone-induced changes in the zonal mean zonal winds. Climate change, on the other hand, causes a steady decrease in wave drag during SH spring, which delays the breakdown of the vortex, resulting in increased wave drag in summer.
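The additivity result can be illustrated numerically: if the responses to the two forcings add linearly, the trend diagnosed from a combined run equals the sum of the trends from the single-forcing runs. The time series below are synthetic stand-ins constructed for the example, not model output.

```python
def trend(values):
    """Ordinary least-squares slope of a series against its time index."""
    n = len(values)
    xs = range(n)
    xbar = sum(xs) / n
    ybar = sum(values) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, values))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

years = range(140)                             # 1960-2099, one value per year
ods_run = [0.04 * t for t in years]            # synthetic ODS-only response
ghg_run = [-0.01 * t for t in years]           # synthetic GHG-only response
both_run = [o + g for o, g in zip(ods_run, ghg_run)]   # combined-forcing run

# Additivity check: trend(run 1) + trend(run 2) == trend(run 3).
additive = abs(trend(ods_run) + trend(ghg_run) - trend(both_run)) < 1e-9
```
In practice the model runs contain internal variability on top of the forced response, so the equality holds statistically across the trends rather than exactly as in this noise-free sketch.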