Abstract:
This paper proposes a framework to support Customer Relationship Management (CRM) implementation in nursing homes. The work extends research by Cheng et al. (2005), who conducted in-depth questionnaires to identify critical features (termed value-characteristics): areas identified as adding the most value if implemented. Although Cheng et al. did propose an implementation framework, its summary treatment and inconsistent inclusion of value-characteristics limit its practical use during implementation. In this paper we adapt the original framework to correct these perceived deficiencies. We link the value-characteristics to operational, analytical, strategic and/or collaborative CRM solution types, allowing them to be considered in the context of practical implementation solutions. The outcome of this paper shows that, practically, a 'one solution meets all characteristics' approach to CRM implementation within nursing homes is inappropriate. Our framework, however, supports implementers in identifying how value can be gained when implementing a specific CRM solution within nursing homes, which subsequently supports project management and expectation management.
Abstract:
Atmospheric turbulence causes most weather-related aircraft incidents [1]. Commercial aircraft encounter moderate-or-greater turbulence tens of thousands of times each year worldwide, injuring probably hundreds of passengers (occasionally fatally), costing airlines tens of millions of dollars and causing structural damage to planes [1-3]. Clear-air turbulence is especially difficult to avoid, because it cannot be seen by pilots or detected by satellites or on-board radar [4,5]. Clear-air turbulence is linked to atmospheric jet streams [6,7], which are projected to be strengthened by anthropogenic climate change [8]. However, the response of clear-air turbulence to projected climate change has not previously been studied. Here we show using climate model simulations that clear-air turbulence changes significantly within the transatlantic flight corridor when the concentration of carbon dioxide in the atmosphere is doubled. At cruise altitudes within 50–75° N and 10–60° W in winter, most clear-air turbulence measures show a 10–40% increase in the median strength of turbulence and a 40–170% increase in the frequency of occurrence of moderate-or-greater turbulence. Our results suggest that climate change will lead to bumpier transatlantic flights by the middle of this century. Journey times may lengthen and fuel consumption and emissions may increase. Aviation is partly responsible for changing the climate [9], but our findings show for the first time how climate change could affect aviation.
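The two quoted statistics (change in median strength, change in frequency of moderate-or-greater events) can be illustrated in miniature with synthetic data. Everything below is an assumption for illustration: the lognormal shape of the turbulence diagnostic, the size of the shift under doubled CO2, and the choice of the control 99th percentile as the "moderate" threshold are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical samples of a turbulence diagnostic for a control climate
# and a doubled-CO2 climate; distributions are illustrative only.
control = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
doubled_co2 = rng.lognormal(mean=0.25, sigma=1.0, size=100_000)

# Assumed severity cutoff: the 99th percentile of the control climate.
moderate_threshold = np.quantile(control, 0.99)

median_change = 100 * (np.median(doubled_co2) / np.median(control) - 1)
freq_control = np.mean(control >= moderate_threshold)
freq_doubled = np.mean(doubled_co2 >= moderate_threshold)
freq_change = 100 * (freq_doubled / freq_control - 1)

print(f"median strength change: {median_change:+.0f}%")
print(f"moderate-or-greater frequency change: {freq_change:+.0f}%")
```

The sketch reproduces the qualitative pattern in the abstract: a modest shift in the bulk of the distribution produces a much larger relative increase in the frequency of threshold exceedances, because the change is amplified in the tail.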
Abstract:
We describe here the development and evaluation of an Earth system model suitable for centennial-scale climate prediction. The principal new components added to the physical climate model are the terrestrial and ocean ecosystems and gas-phase tropospheric chemistry, along with their coupled interactions. The individual Earth system components are described briefly and the relevant interactions between the components are explained. Because the multiple interactions could lead to unstable feedbacks, we go through a careful process of model spin-up to ensure that all components are stable and the interactions balanced. This spun-up configuration is evaluated against observed data for the Earth system components and is generally found to perform very satisfactorily. The reason for the evaluation phase is that the model is to be used for the core climate simulations carried out by the Met Office Hadley Centre for the Coupled Model Intercomparison Project (CMIP5), so it is essential that addition of the extra complexity does not detract substantially from its climate performance. Localised changes in some specific meteorological variables can be identified, but the impacts on the overall simulation of present day climate are slight. This model is proving valuable both for climate predictions and for investigating the strengths of biogeochemical feedbacks.
Abstract:
This study investigates the potential contribution of observed changes in lower stratospheric water vapour to stratospheric temperature variations over the past three decades using a comprehensive global climate model (GCM). Three case studies are considered. In the first, the net increase in stratospheric water vapour (SWV) from 1980–2010 (derived from the Boulder frost-point hygrometer record using the gross assumption that this is globally representative) is estimated to have cooled the lower stratosphere by up to ∼0.2 K decade⁻¹ in the global and annual mean; this is ∼40% of the observed cooling trend over this period. In the Arctic winter stratosphere there is a dynamical response to the increase in SWV, with enhanced polar cooling of 0.6 K decade⁻¹ at 50 hPa and warming of 0.5 K decade⁻¹ at 1 hPa. In the second case study, the observed decrease in tropical lower stratospheric water vapour after the year 2000 (imposed in the GCM as a simplified representation of the observed changes derived from satellite data) is estimated to have caused a relative increase in tropical lower stratospheric temperatures by ∼0.3 K at 50 hPa. In the third case study, the wintertime dehydration in the Antarctic stratospheric polar vortex (again using a simplified representation of the changes seen in a satellite dataset) is estimated to cause a relative warming of the Southern Hemisphere polar stratosphere by up to 1 K at 100 hPa from July–October. This is accompanied by a weakening of the westerly winds on the poleward flank of the stratospheric jet by up to 1.5 m s⁻¹ in the GCM. The results show that, if the measurements are representative of global variations, SWV should be considered as important a driver of transient and long-term variations in lower stratospheric temperature over the past 30 years as increases in long-lived greenhouse gases and stratospheric ozone depletion.
Abstract:
Aeolian dust modelling has improved significantly over the last ten years and many institutions now consistently model dust uplift, transport and deposition in general circulation models (GCMs). However, the representation of dust in GCMs is highly variable between modelling communities due to differences in the uplift schemes employed and the representation of the global circulation that subsequently leads to dust deflation. In this study two different uplift schemes are incorporated in the same GCM. This approach enables a clearer comparison of the dust uplift schemes themselves, without the added complexity of several different transport and deposition models. The global annual mean dust aerosol optical depths (at 550 nm) using two different dust uplift schemes were found to be 0.014 and 0.023—both lying within the estimates from the AeroCom project. However, the models also have appreciably different representations of the dust size distribution adjacent to the West African coast and very different deposition at various sites throughout the globe. The different dust uplift schemes were also capable of influencing the modelled circulation, surface air temperature, and precipitation despite the use of prescribed sea surface temperatures. This has important implications for the use of dust models in AMIP-style (Atmospheric Modelling Intercomparison Project) simulations and Earth-system modelling.
Abstract:
We assess the roles of long-lived greenhouse gases and ozone depletion in driving meridional surface pressure gradients in the southern extratropics; these gradients are a defining feature of the Southern Annular Mode. Stratospheric ozone depletion is thought to have caused a strengthening of this mode during summer, with increasing long-lived greenhouse gases playing a secondary role. Using a coupled atmosphere-ocean chemistry-climate model, we show that the direct, radiative effect of increasing greenhouse gases is partly cancelled by the also substantial indirect (chemical and dynamical) feedbacks that greenhouse gases have via their impact on ozone. This sensitivity of the mode to greenhouse gas-induced ozone changes suggests that a consistent implementation of ozone changes due to long-lived greenhouse gases in climate models benefits the simulation of this important aspect of Southern Hemisphere climate.
Abstract:
Understanding how the emergence of the anthropogenic warming signal from the noise of internal variability translates to changes in extreme event occurrence is of crucial societal importance. By utilising simulations of cumulative carbon dioxide (CO2) emissions and temperature changes from eleven Earth system models, we demonstrate that the inherently lower internal variability found at tropical latitudes results in large increases in the frequency of extreme daily temperatures (exceedances of the 99.9th percentile derived from pre-industrial climate simulations) occurring much earlier than for mid-to-high latitude regions. When considering 2010 GDP-PPP per capita, most of the world's poorest people live at low latitudes; conversely, the wealthiest population quintile disproportionately inhabits more variable mid-latitude climates. Consequently, the fraction of the global population in the lowest socio-economic quintile is exposed to substantially more frequent daily temperature extremes after much lower increases in both mean global warming and cumulative CO2 emissions.
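The core mechanism in this abstract, the same mean warming producing far more frequent extremes where internal variability is small, can be sketched with a toy calculation. The Gaussian daily anomalies, the 1 K uniform warming, and the specific standard deviations below are all illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def exceedance_ratio(sigma, warming, n=500_000, q=0.999):
    """Factor by which daily temperatures exceed the pre-industrial
    99.9th percentile more often after a uniform mean warming.
    Gaussian daily anomalies are an illustrative assumption."""
    pre_industrial = rng.normal(0.0, sigma, n)
    threshold = np.quantile(pre_industrial, q)
    warmed = rng.normal(warming, sigma, n)
    return np.mean(warmed > threshold) / (1 - q)

# Same 1 K of mean warming, but tropical day-to-day variability is
# assumed much smaller than mid-latitude variability.
tropics = exceedance_ratio(sigma=1.0, warming=1.0)
midlatitudes = exceedance_ratio(sigma=3.0, warming=1.0)
print(f"tropics: extremes ~{tropics:.0f}x more frequent")
print(f"mid-latitudes: extremes ~{midlatitudes:.0f}x more frequent")
```

Because the fixed warming moves the low-variability tropical distribution many more standard deviations past its own extreme threshold, the exceedance frequency rises much faster there, which is the inequity the abstract highlights.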
Abstract:
Identifying the correct sense of a word in context is crucial for many tasks in natural language processing (machine translation is an example). State-of-the-art methods for Word Sense Disambiguation (WSD) build models using hand-crafted features that usually capture only shallow linguistic information. Complex background knowledge, such as semantic relationships, is typically either not used, or used in a specialised manner, due to the limitations of the feature-based modelling techniques employed. On the other hand, empirical results from the use of Inductive Logic Programming (ILP) systems have repeatedly shown that they can use diverse sources of background knowledge when constructing models. In this paper, we investigate whether this ability of ILP systems could be used to improve the predictive accuracy of models for WSD. Specifically, we examine the use of a general-purpose ILP system as a method to construct a set of features using semantic, syntactic and lexical information. This feature-set is then used by a common modelling technique in the field (a support vector machine) to construct a classifier for predicting the sense of a word. In our investigation we examine one-shot and incremental approaches to feature-set construction applied to monolingual and bilingual WSD tasks. The monolingual tasks use 32 verbs and 85 verbs and nouns (in English) from the SENSEVAL-3 and SemEval-2007 benchmarks, while the bilingual WSD task consists of 7 highly ambiguous verbs in translating from English to Portuguese. The results are encouraging: the ILP-assisted models show substantial improvements over those that simply use shallow features. In addition, incremental feature-set construction appears to identify smaller and better sets of features. Taken together, the results suggest that the use of ILP with diverse sources of background knowledge provides a way to make substantial progress in the field of WSD.
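The pipeline the abstract describes, ILP-induced clauses turned into boolean features for a downstream classifier, can be sketched in a few lines. The clauses, the word "bank", its two senses, and the tiny training set below are all invented for illustration, and a perceptron stands in for the paper's support vector machine.

```python
import numpy as np

# Hypothetical ILP-induced clauses for disambiguating the verb "bank":
# each clause is a predicate over a context, and whether it fires
# (True/False) becomes one boolean feature.
clauses = [
    lambda ctx: "money" in ctx or "loan" in ctx,   # financial-sense cue
    lambda ctx: "river" in ctx or "water" in ctx,  # riverside-sense cue
    lambda ctx: "deposit" in ctx,
]

def featurise(contexts):
    """Map each context (a set of words) to a boolean feature vector."""
    return np.array([[float(c(ctx)) for c in clauses] for ctx in contexts])

# Toy training data: sense 1 = financial, sense 0 = riverside.
train = [({"money", "deposit"}, 1), ({"river", "steep"}, 0),
         ({"loan", "interest"}, 1), ({"water", "fishing"}, 0)]
X = featurise([ctx for ctx, _ in train])
y = np.array([label for _, label in train])

# Simple perceptron as a stand-in for the SVM used in the paper.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(10):
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += (yi - pred) * xi
        b += (yi - pred)

def predict(ctx):
    return 1 if featurise([ctx])[0] @ w + b > 0 else 0

print("'money market' sense:", predict({"money", "market"}))
```

The point of the sketch is the division of labour the paper exploits: the ILP system supplies relational features that encode background knowledge, while a standard propositional learner does the final classification.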
Abstract:
To understand the biology and evolution of ruminants, the cattle genome was sequenced to about sevenfold coverage. The cattle genome contains a minimum of 22,000 genes, with a core set of 14,345 orthologs shared among seven mammalian species, of which 1217 are absent or undetected in noneutherian (marsupial or monotreme) genomes. Cattle-specific evolutionary breakpoint regions in chromosomes have a higher density of segmental duplications, enrichment of repetitive elements, and species-specific variations in genes associated with lactation and immune responsiveness. Genes involved in metabolism are generally highly conserved, although five metabolic genes are deleted or extensively diverged from their human orthologs. The cattle genome sequence thus provides a resource for understanding mammalian evolution and accelerating livestock genetic improvement for milk and meat production.
Abstract:
CMS is a general-purpose experiment, designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics, based on its unprecedented collision energy and luminosity, when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS to explore the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb⁻¹ or less. The analysis tools that have been developed are applied, with the full methodology of an analysis on CMS data, to study in great detail specific benchmark processes against which the performance of CMS is gauged. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z' and supersymmetric particles, B_s production, and processes in heavy ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Besides these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model and also in theories beyond the Standard Model, for integrated luminosities ranging from 1 fb⁻¹ to 30 fb⁻¹.
The Standard Model processes include QCD, B-physics, diffraction, detailed studies of the top quark properties, and electroweak physics topics such as the W and Z⁰ boson properties. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space, covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models for new physics is explored, notably extra dimensions, new high-mass vector boson states, little Higgs models, technicolour and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses, with photons, electrons, muons, jets, missing E_T, B-mesons and taus, and for quarkonia in heavy ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery, and searches for new physics beyond the Standard Model.
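The integrated luminosities quoted above translate directly into expected event counts through N = sigma x L_int, which is why even "a few fb⁻¹ or less" can reveal TeV-scale physics. The cross-section value below is a round illustrative number, not a CMS result.

```python
# Expected raw event yield N = sigma * L_int, with explicit unit conversion.
def expected_events(sigma_pb, lumi_fb):
    """sigma in picobarns, integrated luminosity in inverse femtobarns.
    1 fb^-1 corresponds to 1000 pb^-1, hence the factor of 1e3."""
    return sigma_pb * 1e3 * lumi_fb

# Illustrative ~50 pb production cross-section at the luminosities
# bracketing the studies in this report (1 and 30 fb^-1):
for lumi in (1, 30):
    print(f"{lumi} fb^-1 -> {expected_events(50, lumi):,.0f} events produced")
```

These are raw production counts before branching ratios, trigger, and selection efficiencies, which is why rare decay channels still require the larger luminosity samples discussed in the report.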