906 results for short-term profit optimization
Abstract:
In an economy where cash can be stored costlessly (in nominal terms), the nominal interest rate is bounded below by zero. This paper derives the implications of this nonnegativity constraint for the term structure and shows that it induces a nonlinear and convex relation between short- and long-term interest rates. As a result, the long-term rate responds asymmetrically to changes in the short-term rate, and by less than predicted by a benchmark linear model. In particular, a decrease in the short-term rate leads to a decrease in the long-term rate that is smaller in magnitude than the increase in the long-term rate associated with an increase in the short-term rate of the same size. To the extent that monetary policy acts by affecting long-term rates through the term structure, its power is considerably reduced at low interest rates. The empirical predictions of the model are examined using data from Japan.
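The convexity described in this abstract can be illustrated with a minimal numerical sketch, assuming an expectations-hypothesis setting in which a hypothetical shadow short rate follows a driftless random walk and the observed short rate is floored at zero; these modelling choices and parameter values are illustrative and are not taken from the paper itself.

```python
import numpy as np

def long_rate(short_rate, horizon=40, sigma=0.01, n_paths=100_000, seed=0):
    """Approximate long rate as the average expected future short rate,
    where the observed short rate is max(shadow rate, 0)."""
    rng = np.random.default_rng(seed)
    shocks = rng.normal(0.0, sigma, size=(n_paths, horizon))
    shadow = short_rate + np.cumsum(shocks, axis=1)   # driftless random-walk shadow rate
    return np.maximum(shadow, 0.0).mean()             # zero floor creates the convexity

base = long_rate(0.02)
rise = long_rate(0.03) - base   # long-rate response to a +1 pp move in the short rate
fall = base - long_rate(0.01)   # long-rate response to a -1 pp move
print(f"rise = {rise:.4f}, fall = {fall:.4f}")        # near zero, the fall is the smaller of the two
```

Because the expectation of a floored rate is convex in the current short rate, the rise exceeds the fall for equal-sized moves, reproducing the asymmetry the abstract describes.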
Abstract:
Authorization to use force is a practice by which the Security Council permits United Nations member States, regional arrangements or agencies, or even the United Nations Secretary-General, to resort to military coercion. It is one of the circumstances precluding wrongfulness with respect to the prohibition of the use of force in international relations, a rule laid down in Article 2(4) of the Charter of the United Nations. This practice clearly does not correspond to the letter of the Charter, but it draws its legitimacy from the fact that it allows the Security Council to discharge its principal mission of maintaining international peace and security, given that the system of military coercion provided for by the Charter has proved inapplicable in practice. The practice nonetheless remains marked by ambiguity: it appears sometimes as an intervention by the United Nations, sometimes as unilateral action for the benefit of certain powers capable of conducting large-scale operations. This ambiguity is further exacerbated by the problem of presumed authorization, which some States might infer from acts of the Security Council in order to intervene in various conflicts. In practice, the authorization to use force seems to revive a bellicose tendency that characterized earlier eras. If care is not taken, it may recast, in whole swathes, the legacy of the twentieth-century law against war (jus contra bellum), a body of law that was the fruit of long tribulations in the history of international relations. The gravest danger is that dearly negotiated gains risk being thrown overboard too easily and without delay, in order to serve short-term aims.
Abstract:
Cases of companies caught up in financial or environmental scandals, or in scandals over abusive working conditions imposed on their workforce, have repeatedly made the news over the past twenty years. The proliferation of the behaviours behind these scandals is explained by the less constraining environment offered by the privatization, deregulation and liberalization policies initiated from the 1980s onwards. The development of the notion of corporate social responsibility from the 1980s onwards, in reaction to these excesses, embodies the idea that while a company must certainly make profits and sustain them, it must do so while fostering responsible, ethical and transparent behaviour towards all its stakeholders. In this thesis we analyse the process by which companies faced with dysfunctions and abuses affecting the working conditions of their workforce or their governance may, or may not, be led to question and change their practices. We based our case study on two companies with diametrically opposed trajectories. The first, from the garment-manufacturing sector, whose crisis concerned violations of workers' rights, overcame the crisis by reforming its production model. The second, in the information and communication technology sector, faced a crisis linked to its corporate governance, accumulated dysfunctions over ten years of crises and finally declared bankruptcy in January 2009. Recent theoretical developments in neo-institutionalism shed light on the process by which new norms emerge and spread, highlighting the role of different actors, some of whom define new norms while others mobilize to diffuse them. To increase their effectiveness on a global scale, these actors most often appear to act in networks, which are sometimes in competition with one another. The case study of the garment company allowed us to address the field of working conditions of workers in production chains relocated to countries whose social legislation is absent or ineffective. We analysed the path by which this company came to consider the ethical dimension of its production chain more rigorously. By going through different stages that took the form of an organizational learning process, the company managed to overcome the crisis by reforming its practices. It emerged that this process was not spontaneous and came about through the roles played by two types of actors: first, the relentless mobilization of global justice movements pressing the company to reform its practices; and second, the normative framework and the forum for dialogue between the various stakeholders provided by a private standard-setting body. Fundamentally, it was the risk of losing its accreditation with this body that pushed the company to undertake reforms. The company succeeded in overcoming the crisis, certainly by adopting and complying with the norms defined by this organization, but fundamentally by changing its corporate culture.
The leadership of the CEO and CFO indeed enabled the creation of a corporate culture favouring self-questioning, dialogue and greater consideration of stakeholders, even if local management does at times raise implementation difficulties. Regarding corporate governance, through the study of the factors that led to the decline and bankruptcy of a flagship company of the information and communication technology sector, we highlight the limits of governance norms as a tool of good governance. The legality of the company's accounting and its compliance with governance norms did not prevent the emergence and multiplication of strategic and ethical dysfunctions and abuses. Unable to use the many crises it faced to question itself and engage in deep organizational learning, the company focused obsessively on short-term profitability and the pursuit of a high share price. Management and the board of directors lacked the leadership needed to create a corporate culture combining technological innovation with honest and transparent communication with stakeholders. Whereas the study of the garment company illustrates the case of a firm that was able, through strategic change, to meet the challenges imposed by its environment, the study of the last fifteen years of the information and communication technology company shows the opposite situation. On the basis of these two cases, it appears that while governance favouring ethics and transparency towards stakeholders requires the creation of a corporate culture that values these elements, such a culture must imperatively support, and be combined with, an adequate strategy if the company is to sustain its activities.
Abstract:
Land managers often respond to declining numbers of target species by creating additional areas of habitat. If these habitats are also subject to human disturbance, then their efforts may be wasted. The European Nightjar (Caprimulgus europaeus) is a ground-nesting bird that is listed as a species of European Conservation Concern. It appears to be susceptible to human disturbance during the breeding season. We examined habitat use and reproductive success over 10 years in a breeding population on 1335 ha of managed land in Nottinghamshire, England. The study site was divided into a heavily disturbed section and a less disturbed section of equal habitat availability, forming a natural long-term experiment. The site is open to the public, and visitor numbers approximately doubled during the study. We found that overall Nightjar density was significantly lower and there were significantly fewer breeding pairs in the heavily disturbed habitat compared with the less disturbed habitat. However, average breeding success per pair, in terms of eggs and fledglings produced, was not significantly different between the two sections across years. Our findings suggest that human recreational disturbance may drastically alter settlement patterns and nest site selection of arriving females in some migratory ground-nesting species and may reduce the utility of apparently suitable patches of remnant and created habitat. Land managers should bear this in mind when creating new areas of habitat that will also be accessible to the public. Our study also highlights the value of long-term population monitoring, which can detect trends that short-term studies may miss.
Abstract:
Although many short-term benefits of agile methods have been recognized, we still know very little about their long-term effects. In this panel, we discuss the long-term perspective on agile methods. The panelists include both industry and academic representatives. They will discuss problems and benefits related to long-term lifecycle system management in agile projects. Ideally, the panel's outcome will provide ideas for future research.
Abstract:
The effect of long-term knowledge upon performance in short-term memory tasks was examined for children from 5 to 10 years of age. The emergence of a lexicality effect, in which familiar words were recalled more accurately than unfamiliar words, was found to depend upon the nature of the memory task. Lexicality effects were interpreted as reflecting the use of redintegration, or reconstruction processes, in short-term memory. Redintegration increased with age for tasks requiring spoken item recall and decreased with age when position information but not naming was required. In a second experiment, redintegration was found in a recognition task when some of the foils rhymed with the target. Older children were able to profit from a rhyming foil, whereas younger children were confused by it, suggesting that the older children make use of sublexical phonological information in reconstructing the target. It was proposed that redintegrative processes in their mature form support the reconstruction of detailed phonological knowledge of words.
Abstract:
Keywords: price-earnings ratio; value premium; arbitrage trading rule; UK stock returns; contrarian investment
The price-earnings effect has been thoroughly documented and is the subject of numerous academic studies. However, in existing research it has almost exclusively been calculated on the basis of the previous year's earnings. We show that the power of the effect has until now been seriously underestimated due to taking too short-term a view of earnings. Looking at all UK companies since 1975, using the traditional P/E ratio we find the difference in average annual returns between the value and glamour deciles to be 6%. This is similar to other authors' findings. We are able to almost double the value premium by calculating the P/E ratio using earnings averaged over the previous eight years.
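As a rough illustration of the ratio used in this abstract, the sketch below computes a P/E from earnings averaged over the previous eight years; the data layout, column names and decile sort are hypothetical and only indicate how value and glamour portfolios might be formed.

```python
import pandas as pd

def long_horizon_pe(prices: pd.Series, annual_eps: pd.DataFrame, years: int = 8) -> pd.Series:
    """P/E ratio using earnings per share averaged over the previous `years` annual figures.

    `prices`     : current share price per company (index = company identifier).
    `annual_eps` : one row per year, one column per company (hypothetical layout).
    """
    avg_eps = annual_eps.tail(years).mean()          # mean EPS over the most recent `years` rows
    return prices / avg_eps.where(avg_eps > 0)       # leave P/E undefined for non-positive earnings

# Companies could then be sorted into deciles on this ratio, with the lowest decile
# as the "value" portfolio and the highest as the "glamour" portfolio, e.g.:
# deciles = pd.qcut(long_horizon_pe(prices, annual_eps), 10, labels=False)
```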
Abstract:
This paper follows the report on the “Quality of Urban Design: Study of the Influence of Private Property Decision Maker in Urban Design” (RICS 1996). It focuses on one of the findings in the report, namely that decisions made in development, investment and occupation seemed overly influenced by short-term considerations. In this paper, the authors review the report and examine the concept of short-termism as it affects urban design decisions. The paper concludes that although it is difficult to establish whether or not short-termism exists in many decisions, there are grounds for believing a priori that short-termism might particularly influence property-oriented decisions. The paper ends with some implications for policy at both the economy-wide and local levels.
Abstract:
A first-of-a-kind, extended-term cloud aircraft campaign was conducted to obtain an in-situ statistical characterization of continental boundary-layer clouds needed to investigate cloud processes and refine retrieval algorithms. Coordinated by the Atmospheric Radiation Measurement (ARM) Aerial Facility (AAF), the Routine AAF Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations (RACORO) field campaign operated over the ARM Southern Great Plains (SGP) site from 22 January to 30 June 2009, collecting 260 h of data during 59 research flights. A comprehensive payload aboard the Center for Interdisciplinary Remotely-Piloted Aircraft Studies (CIRPAS) Twin Otter aircraft measured cloud microphysics, solar and thermal radiation, physical aerosol properties, and atmospheric state parameters. Proximity to the SGP's extensive complement of surface measurements provides ancillary data that supports modeling studies and facilitates evaluation of a variety of surface retrieval algorithms. The five-month duration enabled sampling a range of conditions associated with the seasonal transition from winter to summer. Although about two-thirds of the cloud flights occurred in May and June, boundary-layer cloud fields were sampled under a variety of environmental and aerosol conditions, with about 77% of the flights occurring in cumulus and stratocumulus. Preliminary analyses illustrate use of these data to analyze cloud-aerosol relationships, characterize the horizontal variability of cloud radiative impacts, and evaluate surface-based retrievals. We discuss how an extended-term campaign requires a simplified operating paradigm that is different from that used for typical, short-term, intensive aircraft field programs.
Abstract:
Observational evidence is scarce concerning the distribution of plant pathogen population sizes or densities as a function of time-scale or spatial scale. For wild pathosystems we can only get indirect evidence from evolutionary patterns and the consequences of biological invasions. We have little or no evidence bearing on the extermination of hosts by pathogens, or on the successful escape of a host from a pathogen. Evidence from crops over the last couple of centuries suggests that the abundance of particular pathogens in the spectrum affecting a given host can vary hugely on decadal timescales. However, this may be an artefact of domestication and intensive cultivation. Host-pathogen dynamics can be formulated mathematically fairly easily, for example as SIR-type differential-equation or difference-equation models, and this has been the (successful) focus of recent work in crops. “Long-term” is then discussed in terms of the time taken to relax from a perturbation to the asymptotic state. However, both host and pathogen dynamics are driven by environmental factors as well as by their mutual interactions, and host and pathogen co-evolve, and evolve in response to external factors. We have virtually no information about the importance and natural role of higher trophic levels (hyperpathogens) and competitors, but they too could induce long-scale fluctuations in the abundance of pathogens on particular hosts. In wild pathosystems the host distribution cannot be modelled either as a uniform density or as a uniform distribution of fields (which could then be treated as individuals). Patterns of short-term density dependence and the detail of the host distribution are therefore critical to long-term dynamics. Host density distributions are not usually scale-free, but are rarely uniform or clearly structured on a single scale. In a (multiply structured) metapopulation with coevolution and external disturbances it could well be the case that the time required to attain equilibrium (if it exists), based on conditions stable over a specified time-scale, is longer than that time-scale. Alternatively, local equilibria may be reached fairly rapidly following perturbations while the metapopulation equilibrium is attained only very slowly. In either case, meta-stability on various time-scales is more relevant than equilibrium concepts in explaining observed patterns.
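For reference, the SIR-type differential-equation models mentioned in this abstract have the following standard form; the sketch below integrates a basic SIR system with illustrative parameter values and is not the authors' specific formulation.

```python
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    """Right-hand side of the standard SIR model (fractions of the population)."""
    S, I, R = y
    dS = -beta * S * I              # susceptibles become infected at rate beta*S*I
    dI = beta * S * I - gamma * I
    dR = gamma * I                  # infecteds are removed at rate gamma
    return [dS, dI, dR]

t = np.linspace(0.0, 200.0, 1000)                   # time, arbitrary units
y0 = [0.99, 0.01, 0.0]                              # initial S, I, R fractions
S, I, R = odeint(sir, y0, t, args=(0.3, 0.1)).T     # beta=0.3, gamma=0.1 are illustrative
```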
Abstract:
Many studies evaluating model boundary-layer schemes focus either on near-surface parameters or on short-term observational campaigns. This reflects the observational datasets that are widely available for use in model evaluation. In this paper we show how surface and long-term Doppler lidar observations, combined in a way that matches the model representation of the boundary layer as closely as possible, can be used to evaluate the skill of boundary-layer forecasts. We use a 2-year observational dataset from a rural site in the UK to evaluate a climatology of boundary-layer type forecast by the UK Met Office Unified Model. In addition, we demonstrate the use of a binary skill score (the Symmetric Extremal Dependence Index, SEDI) to investigate the dependence of forecast skill on season, horizontal resolution and forecast lead time. A clear diurnal and seasonal cycle can be seen in the climatology of both the model and the observations, the main discrepancies being that the model overpredicts cumulus-capped and decoupled stratocumulus-capped boundary layers and underpredicts well-mixed boundary layers. Using the SEDI skill score, the model is most skillful at predicting surface stability. Its skill in predicting cumulus-capped and stratocumulus-capped stable boundary layers is low but greater than that of a 24 h persistence forecast. In contrast, its prediction of decoupled boundary layers and boundary layers with multiple cloud layers is worse than persistence. This process-based evaluation approach has the potential to be applied to other boundary-layer parameterisation schemes with similar decision structures.
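The binary skill score named in this abstract, the Symmetric Extremal Dependence Index (SEDI), is computed from a 2x2 contingency table of forecast versus observed events. The sketch below uses the definition in terms of the hit rate and false-alarm rate that is common in the forecast-verification literature; the counts shown are invented, not taken from the paper.

```python
import numpy as np

def sedi(hits, misses, false_alarms, correct_negatives):
    """Symmetric Extremal Dependence Index from a 2x2 contingency table."""
    H = hits / (hits + misses)                              # hit rate
    F = false_alarms / (false_alarms + correct_negatives)   # false-alarm rate
    num = np.log(F) - np.log(H) - np.log(1 - F) + np.log(1 - H)
    den = np.log(F) + np.log(H) + np.log(1 - F) + np.log(1 - H)
    return num / den

# Hypothetical counts for one boundary-layer type; values near 1 indicate high skill,
# values near 0 indicate no skill relative to random forecasts.
print(sedi(hits=40, misses=10, false_alarms=20, correct_negatives=130))
```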
Abstract:
Measurements of the ionospheric E region during total solar eclipses in the period 1932–1999 have been used to investigate the fraction, phi, of extreme ultraviolet and soft X-ray radiation that is emitted from the limb corona and chromosphere. The relative apparent sizes of the Moon and the Sun are different for each eclipse, and techniques are presented which correct the measurements and therefore allow direct comparisons between different eclipses. The results show that the fraction of ionising radiation emitted by the limb corona has a clear solar-cycle variation and that the underlying trend shows this fraction has been increasing since 1932. Data from the SOHO spacecraft are used to study the effects of short-term variability, and it is shown that the observed long-term rise in phi has a negligible probability of being a chance occurrence.
Abstract:
This study investigates the effects of a short-term pedagogic intervention on the development of L2 fluency among learners studying English for Academic Purposes (EAP) at a university in the UK. It also examines the interaction between the development of fluency and that of complexity and accuracy. Through a pre-test, post-test design, data were collected over a period of four weeks from learners performing monologic tasks. While the Control Group (CG) focused on developing general speaking and listening skills, the Experimental Group (EG) received awareness-raising activities and fluency strategy training in addition to the general speaking and listening practice prescribed by the syllabus. The data, coded in terms of a range of measures of fluency, accuracy and complexity, were subjected to repeated-measures MANOVA, t-tests and correlations. The results indicate that after the intervention, while the CG achieved some fluency gains, the EG produced statistically significantly more fluent language, demonstrating a faster speech and articulation rate, longer runs and higher phonation-time ratios. The significant correlations obtained between measures of accuracy and learners' pauses in the CG suggest that pausing opportunities may have been linked to accuracy. The findings of the study have significant implications for L2 pedagogy, highlighting the effective impact of instruction on the development of fluency.
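The fluency measures reported here (speech rate, articulation rate, length of runs, phonation-time ratio) are typically derived from pause-annotated speech. The sketch below uses the standard utterance-fluency formulas and a hypothetical data layout; the study's exact coding scheme may differ.

```python
def fluency_measures(runs, pauses):
    """Common utterance-fluency measures from pause-annotated speech.

    `runs`   : list of (syllable_count, duration_in_seconds) for each stretch of speech between pauses.
    `pauses` : list of pause durations in seconds.
    """
    syllables = sum(s for s, _ in runs)
    phonation_time = sum(d for _, d in runs)
    total_time = phonation_time + sum(pauses)
    return {
        "speech_rate": 60 * syllables / total_time,             # syllables per minute, pauses included
        "articulation_rate": 60 * syllables / phonation_time,   # syllables per minute of speaking time
        "phonation_time_ratio": phonation_time / total_time,
        "mean_length_of_run": syllables / max(len(runs), 1),    # mean syllables per pause-bounded run
    }

# Example: three runs separated by two pauses (all numbers invented).
print(fluency_measures(runs=[(12, 4.0), (9, 3.0), (15, 5.0)], pauses=[0.8, 1.2]))
```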
Abstract:
This paper presents a summary of the work done within the European Union's Seventh Framework Programme project ECLIPSE (Evaluating the Climate and Air Quality Impacts of Short-Lived Pollutants). ECLIPSE had a unique systematic concept for designing a realistic and effective mitigation scenario for short-lived climate pollutants (SLCPs; methane, aerosols and ozone, and their precursor species) and quantifying its climate and air quality impacts, and this paper presents the results in the context of this overarching strategy. The first step in ECLIPSE was to create a new emission inventory based on current legislation (CLE) for the recent past and until 2050. Substantial progress compared to previous work was made by including previously unaccounted types of sources such as flaring of gas associated with oil production, and wick lamps. These emission data were used for present-day reference simulations with four advanced Earth system models (ESMs) and six chemistry transport models (CTMs). The model simulations were compared with a variety of ground-based and satellite observational data sets from Asia, Europe and the Arctic. It was found that the models still underestimate the measured seasonality of aerosols in the Arctic but to a lesser extent than in previous studies. Problems likely related to the emissions were identified for northern Russia and India, in particular. To estimate the climate impacts of SLCPs, ECLIPSE followed two paths of research: the first path calculated radiative forcing (RF) values for a large matrix of SLCP species emissions, for different seasons and regions independently. Based on these RF calculations, the Global Temperature change Potential metric for a time horizon of 20 years (GTP20) was calculated for each SLCP emission type. This climate metric was then used in an integrated assessment model to identify all emission mitigation measures with a beneficial air quality and short-term (20-year) climate impact. These measures together defined a SLCP mitigation (MIT) scenario. Compared to CLE, the MIT scenario would reduce global methane (CH4) and black carbon (BC) emissions by about 50 and 80 %, respectively. For CH4, measures on shale gas production, waste management and coal mines were most important. For non-CH4 SLCPs, elimination of high-emitting vehicles and wick lamps, as well as reducing emissions from gas flaring, coal and biomass stoves, agricultural waste, solvents and diesel engines were most important. These measures lead to large reductions in calculated surface concentrations of ozone and particulate matter. We estimate that in the EU, the loss of statistical life expectancy due to air pollution was 7.5 months in 2010, which will be reduced to 5.2 months by 2030 in the CLE scenario. The MIT scenario would reduce this value by another 0.9 to 4.3 months. Substantially larger reductions due to the mitigation are found for China (1.8 months) and India (11–12 months). The climate metrics cannot fully quantify the climate response. Therefore, a second research path was taken. Transient climate ensemble simulations with the four ESMs were run for the CLE and MIT scenarios, to determine the climate impacts of the mitigation. In these simulations, the CLE scenario resulted in a surface temperature increase of 0.70 ± 0.14 K between the years 2006 and 2050. 
For the decade 2041–2050, the warming was reduced by 0.22 ± 0.07 K in the MIT scenario, in almost exact agreement with the response calculated from the emission metrics (a reduced warming of 0.22 ± 0.09 K). The metrics calculations suggest that non-CH4 SLCPs contribute ~22 % to this response and CH4 78 %. This could not be fully confirmed by the transient simulations, which attributed about 90 % of the temperature response to CH4 reductions. Attribution of the observed temperature response to non-CH4 SLCP emission reductions, and to BC specifically, is hampered in the transient simulations by the small forcing and the co-emitted species of the chosen emission basket. Nevertheless, an important conclusion is that our mitigation basket as a whole would lead to clear benefits for both air quality and climate. The climate response from BC reductions in our study is smaller than reported previously, possibly because our study is one of the first to use fully coupled climate models, in which unforced variability and sea-ice responses cause relatively strong temperature fluctuations that may counteract (and thus mask) the impacts of small emission reductions. The temperature responses to the mitigation were generally stronger over the continents than over the oceans, and largest over the Arctic, with a warming reduction of 0.44 (0.39–0.49) K. Our calculations suggest particularly beneficial climate responses in southern Europe, where surface warming was reduced by about 0.3 K and precipitation rates were increased by about 15 (6–21) mm yr⁻¹ (more than 4 % of total precipitation) from spring to autumn. Thus, the mitigation could help to alleviate expected future droughts and water shortages in the Mediterranean area. We also report other important results of the ECLIPSE project.
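The metric-based estimate quoted above rests, schematically, on summing each species' emission change weighted by its 20-year temperature metric. The sketch below shows only that first-order bookkeeping; the species values are placeholders, not ECLIPSE numbers, and it glosses over the pulse-versus-sustained-emission distinction that a proper metric calculation must handle.

```python
def metric_implied_warming_change(emission_changes, temperature_metrics):
    """First-order temperature change implied by emission metrics:
    sum over species of (emission change) x (temperature metric per unit emission)."""
    return sum(emission_changes[s] * temperature_metrics[s] for s in emission_changes)

delta_e  = {"CH4": -150.0, "BC": -4.0}   # hypothetical emission reductions (Tg per year)
metric20 = {"CH4": 1.0e-3, "BC": 1.5e-2} # hypothetical 20-year metrics (K per Tg per year)
print(metric_implied_warming_change(delta_e, metric20))   # negative result = avoided warming
```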
Abstract:
Model simulations of the next few decades are widely used in assessments of climate change impacts and as guidance for adaptation. Their non-linear nature reveals a level of irreducible uncertainty which it is important to understand and quantify, especially for projections of near-term regional climate. Here we use large idealised initial condition ensembles of the FAMOUS global climate model with a 1 %/year compound increase in CO2 levels to quantify the range of future temperatures in model-based projections. These simulations explore the role of both atmospheric and oceanic initial conditions and are the largest such ensembles to date. Short-term simulated trends in global temperature are diverse, and cooling periods are more likely to be followed by larger warming rates. The spatial pattern of near-term temperature change varies considerably, but the proportion of the surface showing a warming is more consistent. In addition, ensemble spread in inter-annual temperature declines as the climate warms, especially in the North Atlantic. Over Europe, atmospheric initial condition uncertainty can, for certain ocean initial conditions, lead to 20 year trends in winter and summer in which every location can exhibit either strong cooling or rapid warming. However, the details of the distribution are highly sensitive to the ocean initial condition chosen and particularly the state of the Atlantic meridional overturning circulation. On longer timescales, the warming signal becomes more clear and consistent amongst different initial condition ensembles. An ensemble using a range of different oceanic initial conditions produces a larger spread in temperature trends than ensembles using a single ocean initial condition for all lead times. This highlights the potential benefits from initialising climate predictions from ocean states informed by observations. These results suggest that climate projections need to be performed with many more ensemble members than at present, using a range of ocean initial conditions, if the uncertainty in near-term regional climate is to be adequately quantified.
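The spread in near-term trends described in this abstract can be summarised by fitting a linear trend to each ensemble member over a fixed window. The sketch below uses synthetic data and an illustrative array layout, not actual FAMOUS output.

```python
import numpy as np

def trend_spread(ensemble_temps, window=20):
    """Per-member linear trend (degrees per year) over the first `window` years.

    `ensemble_temps` has shape (n_members, n_years): annual-mean temperatures per member.
    """
    years = np.arange(window)
    return np.array([np.polyfit(years, member[:window], 1)[0] for member in ensemble_temps])

# Synthetic example: a weak forced warming plus member-dependent internal variability.
rng = np.random.default_rng(1)
fake = 0.02 * np.arange(50) + rng.normal(0.0, 0.15, size=(30, 50))
trends = trend_spread(fake, window=20)
print(trends.min(), trends.max())   # some members can show near-zero or even negative 20-year trends
```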