Abstract:
Globalization is eroding the livelihoods of small farmers, a significant and vulnerable class, particularly in the developing world. The cost-price squeeze stemming from trade liberalization places farmers in a race to the bottom that leads to displacement, poverty, and environmental degradation. Scholars and activists have proposed that alternative trade initiatives offer a unique opportunity to reverse this trend by harnessing the power of markets to reward producers of goods with embedded superior cultural, environmental, and social values. Alternative trade via certification schemes has become a de facto prescription for any location where economic interests must be reconciled with conservation imperatives. Partnerships among commodity-producing farmers, elite manufacturers, and wealthy northern consumers/activists do not necessarily have win-win outcomes. Paradoxically, farmers' partnerships with external agencies have unexpected results: they develop into dependent relationships that become unsustainable in the absence of further transfers of capital. The institutions born of these partnerships are fragile. When these fledgling institutions fail, farmers are left in the same situation they were in before the partnership, with only minor improvements to show after spending considerable amounts of social and financial capital. I hypothesize that these failures are born of a belief in a universal understanding of sustainability. A discursive emphasis on consensus, equity, and mutual benefit hides the fact that what is a matter of choice for consumers is a matter of survival for producers. The growth in consumers' demand for certified products creates a race for farmers to meet these standards. My findings suggest that this race generates economically perverse effects. First, producers enter a certification treadmill. Second, the local need for economic sustainability is ignored. Third, commodity-based alternative trade schemes increase the exposure of communities to global shocks. I conclude by calling for a careful reassessment of sustainable development projects that promote certification schemes. The designers and implementers of these programs must include farmers' agendas in their planning.
Abstract:
The essay "Managing Strategic Change," by K. Michael Haywood, Associate Professor, School of Hotel and Food Administration, University of Guelph, opens with this characterization by the author: "The ability to manage strategic change is critical for hospitality industry executives today. Executives must be capable of creating a vision of the future and implementing its direction. The author gives avenues for that management process." "The effective management of strategic change is the major challenge confronting hospitality executives," says Associate Professor Haywood. "Responding to a rapidly changing business environment and constantly evolving competitive threats and opportunities requires executives who can anticipate and plan for change." According to Haywood, the management of strategic change is a future imperative for hospitality executives, and implementing those changes will be even more difficult. "Survival and growth for many hospitality firms during the next decade will depend on the development of new strategic visions which can provide significant competitive advantages," he says. "Strategies for managing costs and technology will be central to this task," he adds. Haywood identifies two primary types of change hospitality executives should be aware of: first, change that is anticipated, or anticipatory change; second, the more crucial type, strategic change in the face of crisis, or simply stated, reactive change. Haywood describes the distinction between the two. In describing the approach that should be taken in responding to anticipatory change, Haywood says, "If time permits, and change is to be introduced gradually, pilots and trials should be run to assess the impact of the new strategy on the organization.
These trials are used to create pockets of commitment throughout the corporation, build comfort levels with the new approach, and neutralize or win over potential opposition." There are obvious advantages to an approach like the one described above, but there are disadvantages as well; Haywood discusses both. In addressing reactive change, Haywood observes that the process is a time-is-of-the-essence condition, and that strong leadership and a firm hand on employee control are imperative. "Personal leadership, tough-mindedness, the willingness to ruthlessly abandon the familiar and the past, and the use of informal strategic levers are the hallmarks of sterling executive performance in such periods," he says. "All these changes involve substantial technical, financial, and human risks," Haywood cautions. "In order to make them, and still remain competitive, hospitality and travel-related corporations require executives capable of creating a vision of the future, able to sell that vision to their employees, and tough-minded enough to implement strategies to make the vision a reality."
Abstract:
Women in hospitality organizations are moving up the corporate ladder at a pace that significantly outdistances that of their colleagues of a few decades ago, but women managers perceive both overt and covert discriminatory resistance, from open chauvinism to carefully contrived prejudicial treatment constructed to ensure a no-win situation. The authors attempted to determine whether these discriminatory practices against equally well-trained, qualified, and experienced women middle managers in hospitality affect their perception of their career growth compared to that of their male counterparts.
Abstract:
In her piece entitled "Current Status of Collectability of Gaming-Related Credit Dollars," Ruth Lisa Wenof, Graduate Student at Florida International University, initially states: "Credit is an important part of incentives used to lure gamblers to gaming establishments. However, a collection problem exists in casinos retrieving gaming-related credit losses of individuals living in states where gambling is illegal. The author discusses the history of this question, citing recent cases related to Atlantic City." The article is substantially laden with legal cases associated with casinos in New Jersey, Atlantic City to be exact. The piece is specific to the segment of the gaming industry that the title suggests, and as such is written in a decidedly technical style. "Legalized casino gaming, which was approved by the citizens of New Jersey on November 8, 1976, has been used as a unique tool of urban redevelopment for Atlantic City," Wenof says in providing some background on this Jersey Shore municipality. "Since Resorts International opened its casino…revenues from gambling have increased rapidly. Resorts' gross win in 1978 was $134 million," Wenof says. "Since then, the combined gross win of the city's 11 casinos has been just shy of $7.5 billion." The author points out that the competition for casino business is fierce and that credit dollars play an integral role in soliciting such business. "Credit plays a most important part in every casino hotel. This type of gambler is given every incentive to come to a particular hotel," says the author. "Airplanes, limousines, suites, free meals, and beverages all become a package for the person who can sign a marker. The credit department of a casino is similar to that of a bank. A banker who loans money knows that it must be paid back or his bank will fail. This is indeed true of a casino," Wenof warns in outlining the potential problem around which the article is fundamentally designed.
In providing further background on credit essentials and possible pitfalls, Wenof notes: "…on the Casino Control Act the State Commission of Investigation recommended to the legislature that casinos should not be allowed to extend credit at all, by reason of a concern for illicit diversion of revenues, which is popularly called skimming within the industry…" Although skimming is an after-the-fact problem, parenthetic to loan returns, it is an important element of the collective [sic] credit scheme. A central factor to consider, Wenof reveals, is that "a collection problem of prime importance is if a casino can get back gaming-related credit dollars advanced by the casino to a gambler who lives in a state where gambling is illegal." This is a primary focus of the article. Wenof touches on the social/societal implications of gambling, and then continues the discussion by citing a host of legal cases pertaining to debt collection.
Abstract:
The world's largest hotel, casino, and theme park has demonstrated that corporate responsibility to the community and corporate self-interest need not be mutually exclusive. MGM's human resource department established an employment outreach program that hired 1,462 economically disadvantaged persons from the community. This effort was a "win-win" situation for both the community and the corporation, and the hotel received a significant wage credit under the Job Training Partnership Act.
Abstract:
Using the securitization framework to examine the arguments that facilitated the "War on Drugs," this paper highlights a separate war against drug traffickers. Facilitated ideologically by the rhetoric of the "War on Drugs" and by the fear of communist expansion and democratic contraction, the "War on Drug Traffickers" was implemented, requiring its own strategy separate from the "War on Drugs." This is an important distinction because the play on words changes the perception of the issue from one of drug addiction to one of weak institutions and of insurgent/terrorist threats to those institutions. Furthermore, one cannot propose a strategy to win, lose, or retreat from a war that one has been unable to identify properly. And while the all-encompassing "War on Drugs" has motivated tremendous discourse on its failure and possible remedies, the generalizations that result from failing to distinguish between the policies behind drug addiction and the militarized policies behind drug trafficking have discounted the violence perpetrated by the state, the state's rationale for perpetrating that violence, and the state's dependence on foreign actors to perpetrate it. This makes it impossible not only to propose effective strategy but also to persuade states that participate in the "War on Drug Traffickers" to adopt the proposed strategy.
Abstract:
The missile's significance has been central to national security since the Soviet launching of Sputnik, and it became increasingly important throughout the years of the Cold War. Much has been written about missile technology, but little has been written about how the development and deployment of this weapon affected Americans. The missile was developed both to deter war and to win it. Its presence, however, was not always reassuring. Three areas of the United States are studied to evaluate the social implications of the missile during these pivotal years: San Francisco, home of multiple Nike installations; Cape Canaveral, Florida, the nation's primary missile test center; and the Great Plains, the location of the largest ICBM concentration in the country. Interviews were conducted, tours of facilities were taken, and local newspapers were reviewed. In conjunction with national newspapers, magazines, and public opinion polls, this information provided a local social context for missile history. Nationally and locally, Americans both feared and praised the new technology. They were anxious for government funding in their cities and often felt that the danger the missile brought to their communities by making them a Soviet target was justified by the larger cause of national security.
Abstract:
A uniform chronology for foraminifera-based sea surface temperature records has been established for more than 120 sediment cores obtained from the equatorial and eastern Atlantic up to the Arctic Ocean. The chronostratigraphy of the last 30,000 years is mainly based on published δ18O records and 14C ages from accelerator mass spectrometry, converted into calendar-year ages. The high-precision age control provides the database necessary for a uniform reconstruction of the climate interval of the Last Glacial Maximum within the GLAMAP-2000 project.
Abstract:
Recent evidence suggests that the Subtropical Convergence (STC) zone east of New Zealand shifted little from its modern position along Chatham Rise during the last glaciation, and that offshore surface waters north of the STC zone cooled only slightly. However, at nearshore core site P69 (2195 m depth), 115 km off the east coast of North Island and ca 300 km north of the modern STC zone, planktonic foraminiferal species, transfer function data, and stable oxygen and carbon isotope records suggest that surface waters were colder by up to 6°C during the late last glacial period compared to the Holocene, and included a strong upwelling signature. Presently, site P69 is bathed by south-flowing subtropical waters in the East Cape Current. The nearshore western end of Chatham Rise hosts a major bathymetric depression, the Mernoo Saddle, through which some exchange between northern subtropical and southern subantarctic water presently occurs. It is proposed that as a result of much intensified current flows south of the Rise during the last glaciation, a consequence of more compressed subantarctic water masses, lowered sea level, and an expanded and stronger Westerly Wind system, there was accelerated leakage northwards of both Australasian Subantarctic Water and upwelled Antarctic Intermediate Water over Mernoo Saddle in a modified and intensified Southland Current. The expanded cold water masses displaced the south-flowing warm East Cape Current off southeastern North Island, and offshore divergence was accompanied by wind-assisted upwelling of nutrient-rich waters in the vicinity of P69. A comparable kind of inshore cold-water jetting possibly characterised most glacial periods since the latest Miocene, and may account for the occasional occurrence of subantarctic marine fossils in onland late Cenozoic deposits north of the STC zone, rather than invoking wholesale major oscillations of the oceanic STC itself.
Abstract:
We present planktonic foraminiferal census data from 30 new surface sediment samples from the South China Sea, recovered between 630 and 2883 m water depth. These new data, together with 131 earlier published data sets from the western Pacific, are used to calibrate the SIMMAX-28 transfer function for estimating past sea-surface temperatures. This regional SIMMAX method captures the marginal-sea conditions of the South China Sea slightly better than the linear transfer function FP-12E, which is based only on open-ocean data. However, both methods are biased toward the tropical temperature regime because of the very limited data from temperate to subpolar regions. The SIMMAX formula was applied to sediment core 17940 from the northeastern South China Sea, with sedimentation rates of 20-80 cm/ka. Results revealed nearly unchanged summer temperatures around 28°C for the last 30 ky, while winter temperatures varied between 19.5°C in the last glacial maximum and 26°C during the Holocene. During Termination 1A, the winter estimates show a Younger Dryas cooling by 3°C subsequent to a temperature optimum of 24°C during the Bølling-Allerød. Estimates of winter temperature differences between 0 and 100 m water depth document the seasonal variations in the thickness of the mixed layer and provide a new proxy for estimating past changes in the strength of the winter monsoon.
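The modern-analog logic behind SIMMAX-type transfer functions can be sketched in a few lines: compare a fossil faunal census to a library of modern core-top censuses, then average the SSTs of the most similar modern samples. The sketch below is schematic only, under assumed choices (cosine similarity between normalized census vectors, a similarity-weighted mean of the k best analogs); the published SIMMAX-28 procedure involves additional details not reproduced here.

```python
import numpy as np

def modern_analog_sst(sample, library_counts, library_sst, k=10):
    """Schematic modern-analog SST estimate (SIMMAX-like, not the
    published method). `sample` is a fossil faunal census vector;
    `library_counts` are modern core-top censuses with known `library_sst`."""
    s = np.asarray(sample, dtype=float)
    s = s / np.linalg.norm(s)                       # normalize the fossil census
    L = np.asarray(library_counts, dtype=float)
    L = L / np.linalg.norm(L, axis=1, keepdims=True)  # normalize each core-top census
    sims = L @ s                                    # cosine similarity to each analog
    top = np.argsort(sims)[-k:]                     # indices of the k best analogs
    w = sims[top]
    # Similarity-weighted mean of the best analogs' modern SSTs
    return float(np.sum(w * np.asarray(library_sst, dtype=float)[top]) / np.sum(w))
```

In practice the library would hold the combined regional data sets, and dissimilar analogs would be screened out before averaging.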
Abstract:
Software bug analysis is one of the most important activities in software quality. Rapid and correct implementation of the necessary repair matters both to developers, who must restore the software to full working order, and to users, who need to perform their daily tasks. In this context, incorrect classification of bugs can lead to unwanted situations. One of the main attributes assigned to a bug in its initial report is severity, which reflects the urgency of correcting the problem. In this scenario, we identified, in datasets extracted from five open source systems (Apache, Eclipse, Kernel, Mozilla and Open Office), an irregular distribution of bugs with respect to the existing severities, which is an early sign of misclassification. In the datasets analyzed, about 85% of bugs are ranked with normal severity. This classification rate can negatively influence software development: a misclassified bug may be allocated to a developer with too little experience to solve it, so its correction may take longer or even result in an incorrect implementation. Several studies in the literature have disregarded normal bugs, working only with the portion of bugs initially considered severe or non-severe. This work investigated that portion of the data, with the purpose of identifying whether the normal severity reflects real impact and urgency, whether there are bugs initially classified as normal that could be classified with another severity, and whether there are impacts for developers in this context. For this, an automatic classifier was developed, based on three algorithms (Naïve Bayes, MaxEnt and Winnow), to assess whether normal severity is correct for the bugs initially categorized with it.
The algorithms presented accuracy of about 80% and showed that between 21% and 36% of the bugs should have been classified differently (depending on the algorithm), which represents somewhere between 70,000 and 130,000 bugs in the dataset.
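One of the three algorithms named above, Naïve Bayes, can be illustrated with a minimal multinomial text classifier over bug-report summaries. This is a generic sketch, not the thesis's actual implementation; the training summaries, labels, and test sentence are hypothetical.

```python
from collections import Counter
import math

class MultinomialNB:
    """Minimal multinomial Naive Bayes over bag-of-words features."""
    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)           # class frequencies
        self.word_counts = {c: Counter() for c in self.classes}
        self.total = {c: 0 for c in self.classes}
        self.vocab = set()
        for doc, label in zip(docs, labels):
            for w in doc.lower().split():
                self.word_counts[label][w] += 1
                self.total[label] += 1
                self.vocab.add(w)
        self.n_docs = len(docs)
        return self

    def predict(self, doc):
        V = len(self.vocab)
        best, best_score = None, float("-inf")
        for c in self.classes:
            score = math.log(self.prior[c] / self.n_docs)
            for w in doc.lower().split():
                # Laplace smoothing avoids zero probability for unseen words
                score += math.log((self.word_counts[c][w] + 1) / (self.total[c] + V))
            if score > best_score:
                best, best_score = c, score
        return best

# Hypothetical bug-report summaries and severity labels, for illustration only
train = [
    ("crash on startup segfault", "severe"),
    ("data loss when saving file", "severe"),
    ("typo in dialog label", "non-severe"),
    ("minor alignment issue in toolbar", "non-severe"),
]
clf = MultinomialNB().fit([d for d, _ in train], [l for _, l in train])
print(clf.predict("application crash with segfault"))  # severe
```

A real study would train on thousands of reports with TF-IDF or similar weighting; the point here is only the shape of the technique.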
Abstract:
Foraminifera counts and climatic assemblages from the Tore Seamount are used to assess glacial and interglacial changes in temperature and productivity on the Iberian Margin over the last 225 kyr. Chronostratigraphy is based on Globigerinoides ruber and Globigerina bulloides oxygen isotopes and supported by foraminifera and carbonate stadial fluctuations. Foraminifera indicate cooling from late interglacial stage 5 to the beginning of Termination I (TI). Neogloboquadrina pachyderma (s) reflects cold conditions during glacial stages 4-2. In contrast, glacial stage 6 is dominated by warmer N. pachyderma (d) and N. dutertrei and a restricted arctic assemblage. Past sea surface temperatures confirm the general cooling, reaching 4.3°C (SIMMAX.28) during stage 2. Multiple productivity proxies, such as organic carbon, productivity-related foraminifera, and δ13C, constrain the changes observed. A productivity increase occurs after interglacial stage 5, enhanced from late glacial stage 3 to TI. Phytoplankton plumes off Portugal, detected today by satellite, would have accounted for the general productivity increase over the Tore during past glacial stages. On top of this, well-defined peaks of organic carbon and productivity-related foraminifera correspond with Heinrich events 1-4.
Abstract:
Lake Meerfelder Maar (Germany) provides a varved record from the Last Glacial/Interglacial transition to ca 1500 years BP. This study presents results for the Holocene sequence from new cores collected in 2009, based on varve counting, microfacies and micro-XRF analyses. The main goal of combining these analyses is to provide a new approach for interpreting long-term palaeolimnological proxy data and to test climate-proxy stationarity throughout the current interglacial period. Varve counting provides a new independent Holocene chronology (MFM2012), with an estimated counting error of 0.5-1% and supported by 14C dating. Varve structure and thickness and the geochemical composition of the varves give information about the main environmental processes that affect the lake and its catchment, as well as the possible climate variability behind them. Varves are couplets of (i) a spring/summer lamina composed of monospecific diatom blooms and (ii) an autumn/winter sub-layer made of minerogenic material and reworked sediments. The thickness of the varves and sub-layers reflects lake variability and allows seasons, and hence seasonal proxies, to be distinguished. Changes in the winter minerogenic influx into the lake are reflected by Ti intensities, with the Si/Ti ratio as an indicator of diatom concentration, which can be used as a proxy for water circulation during the early spring. Long-term variability of geochemical composition shows a reduction of detrital input (Ti) at 5,000 varve yrs BP and a visible sensitivity to water mixing (Si/Ti) during the Late Holocene. Variations of Ti intensities during the early and mid-Holocene do not show a clear relationship with climate. In contrast, higher values of the Si/Ti ratio together with thicker varves have been interpreted as wind-stress phases, which coincide with centennial-scale European cold/wet episodes during the Late Holocene.
Our findings show that long-term change in the lake and/or variability of the climate system can influence the proxy sensitivity of a lacustrine record.
Abstract:
A recently developed technique for determining past sea surface temperatures (SST), based on analysis of the unsaturation ratio of long-chain C37 methyl alkenones produced by Prymnesiophyceae phytoplankton (U37K'), has been applied to an upper Quaternary sediment core from the equatorial Atlantic. U37K' temperature estimates were compared to those obtained from δ18O of the planktonic foraminifer Globigerinoides sacculifer and from planktonic foraminiferal assemblages for the last glacial cycle. The alkenone method showed 1.8°C cooling at the last glacial maximum, about one-third to one-half of the decrease shown by the isotopic method (6.3°C) and by foraminiferal modern analogue technique estimates for the warm season (3.8°C). Warm-season foraminiferal assemblage estimates based on transfer functions are out of phase with the other estimates, showing a 1.4°C drop at the last glacial maximum with an additional 0.9°C drop in the deglaciation. Increased alkenone abundances, total organic carbon percentage, and foraminiferal accumulation rates in the last glaciation indicate an increase in productivity of as much as four times over the present day. These changes are thought to be due to increased upwelling caused by enhanced winds during the glaciation. If the U37K' estimates are correct, as much as 50-70% (up to 4.5°C) of the estimated δ18O and modern analogue temperature changes in the last glaciation may have been due to changes in thermocline depth, whereas transfer functions seem more strongly influenced by seasonality changes. This indicates that these estimates may be influenced as strongly by other factors as by SST, which in the equatorial Atlantic was only slightly reduced in the last glaciation.
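The alkenone index is conventionally converted to temperature with a linear calibration. A minimal sketch, assuming the widely used Prahl et al. (1988) culture calibration (U37K' = 0.034·T + 0.039); the study above may use different coefficients:

```python
def uk37_to_sst(uk37):
    """Convert the alkenone unsaturation index U37K' to SST in °C.

    Assumes the Prahl et al. (1988) linear calibration
        U37K' = 0.034 * T + 0.039
    which is illustrative here, not necessarily the calibration
    used in the study summarized above."""
    return (uk37 - 0.039) / 0.034
```

With these coefficients, a measured U37K' of 0.889 corresponds to an SST of 25°C, and the 1.8°C glacial cooling quoted above would correspond to a U37K' decrease of about 0.06.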
Abstract:
Planktonic foraminiferal census counts are used to construct high-resolution sea surface temperature (SST) and subsurface (thermocline) temperature records at a core site in the Tobago Basin, Lesser Antilles. The record is used to document climatic variability at this tropical site in comparison to middle- and high-latitude sites and to test current concepts of cross-equatorial heat transport as a major player in interhemispheric climate variability. Temperatures are estimated using transfer function and modern analog techniques. Glacial-maximum cooling of 2.5°-3°C is indicated; maximum cooling of 4°C is inferred for isotope stage 3. The SST record displays millennial-scale variability with temperature jumps of up to 3°C and closely tracks the structure of ice-core Dansgaard/Oeschger cycles. SST variations in part of the record run opposite to the SST evolution at high northern latitude sites, pointing to thermohaline circulation and marine heat transport as an important factor driving SST in the tropical and high-latitude Atlantic, on both orbital and suborbital timescales.