988 results for legacy system
Abstract:
The Cold War in the late 1940s blunted attempts by the Truman administration to extend the scope of government in areas such as health care and civil rights. In California, the combined weakness of the Democratic Party in electoral politics and the importance of fellow travelers and communists in state liberal politics made the problem of how to advance the left at a time of heightened Cold War tensions particularly acute. Yet by the early 1960s a new generation of liberal politicians had gained political power in the Golden State and was constructing a greatly expanded welfare system as a way of cementing their hold on power. In this article I argue that the New Politics of the 1970s, shaped nationally by Vietnam and by the social upheavals of the 1960s over questions of race, gender, sexuality, and economic rights, possessed particular power in California because many activists drew on the longer-term experiences of a liberal politics receptive to earlier anti-Cold War struggles. A desire to use political involvement as a form of social networking had given California a strong Popular Front, and in some respects the power of new liberalism was an offspring of those earlier battles.
Abstract:
Air accidents represent a small proportion of the flights registered worldwide, and midair collisions are rarer still. In September 2006, a Boeing 737-800 collided in midair with a Legacy jet. It was the worst accident in the history of Brazilian aviation up to that time. The present study explores aspects of press coverage of the accident. Data and information reported in the media about the accident from September 2006 to August 2007 were collected and discussed. Media coverage called attention to two unusual aspects: the politicisation of the discussion, culminating in the opening of congressional inquiries, and the concomitant police investigations, which interfered with the work of the agencies responsible for the official accident investigation. An emphasis on assigning guilt and establishing penalties may close the window of opportunity that an accident opens for discussion of improvements to air safety. In Brazil, political imperatives and organizational pressures have interfered, and the possibilities of organizational learning from the accident have been drastically curtailed.
Abstract:
In soil surveys, several sampling systems can be used to define the most representative sites for sample collection and description of soil profiles. In recent years, the conditioned Latin hypercube sampling system has gained prominence for soil surveys. In Brazil, most soil maps are at small scales and in paper format, which hinders their refinement. The objectives of this work were: (i) to compare two sampling systems based on the conditioned Latin hypercube for mapping soil classes and soil properties; (ii) to retrieve information from a detailed-scale soil map of a pilot watershed for its refinement, comparing two data mining tools, and to validate the new soil map; and (iii) to create and validate a soil map of a much larger, similar area by extrapolating the information extracted from the existing soil map. Two sampling schemes were created, one by the conditioned Latin hypercube and one by the cost-constrained conditioned Latin hypercube. At each sampling site, soil classification and measurement of the A horizon thickness were performed. Maps were generated and validated for each sampling system, comparing the efficiency of the methods. The conditioned Latin hypercube captured greater variability of soils and properties than the cost-constrained variant, although it made field work more difficult. Thus, the conditioned Latin hypercube can capture greater soil variability, while the cost-constrained conditioned Latin hypercube presents great potential for use in soil surveys, especially in areas of difficult access. From an existing detailed-scale soil map of a pilot watershed, topographical information for each soil class was extracted from a Digital Elevation Model and its derivatives using two data mining tools. Maps were generated using each tool. The more accurate of the tools was used to extrapolate soil information to a much larger, similar area, and the generated map was validated.
It was possible to retrieve the existing soil map information and apply it to a larger area with similar soil forming factors at much lower financial cost. The KnowledgeMiner data mining tool and ArcSIE, used to create the soil map, presented the better results and enabled the use of the existing soil map to extract soil information and apply it to similar, larger areas at reduced cost, which is especially important in developing countries with limited financial resources for such activities, such as Brazil.
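As a rough illustration of the conditioned Latin hypercube idea used above (choosing sample sites so that their covariate values fill equal-probability strata), the following is a minimal sketch. The function name `clhs`, the greedy swap search, and the objective are assumptions for illustration; published cLHS implementations use simulated annealing and richer objective terms, and the cost-constrained variant adds a cost layer to the objective:

```python
import numpy as np

def clhs(covariates, n, iters=2000, seed=0):
    """Minimal conditioned Latin hypercube sampling sketch.

    covariates: (N, k) array of environmental covariates at candidate sites.
    Returns indices of n sites whose covariate distributions approximate
    a Latin hypercube (ideally one sample per quantile stratum per covariate).
    """
    rng = np.random.default_rng(seed)
    N, k = covariates.shape
    # Quantile stratum edges: n equal-probability bins per covariate.
    edges = np.quantile(covariates, np.linspace(0, 1, n + 1), axis=0)

    def objective(idx):
        # Sum over covariates of the deviation from 1 sample per stratum.
        dev = 0.0
        for j in range(k):
            counts, _ = np.histogram(covariates[idx, j], bins=edges[:, j])
            dev += np.abs(counts - 1).sum()
        return dev

    idx = rng.choice(N, n, replace=False)
    best = objective(idx)
    for _ in range(iters):
        trial = idx.copy()
        out = rng.integers(n)                        # drop one selected site
        candidates = np.setdiff1d(np.arange(N), trial)
        trial[out] = rng.choice(candidates)          # swap in an unused site
        val = objective(trial)
        if val <= best:                              # greedy accept
            idx, best = trial, val
    return idx, best
```

A cost-constrained variant could be sketched by adding a penalty such as `cost[idx].sum()` to the objective, discouraging hard-to-reach sites at the price of a slightly worse covariate spread.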
Abstract:
The time course of lake recovery after a reduction in external nutrient loading is often controlled by conditions in the sediment. Remediation of eutrophication is hindered by the presence of legacy organic carbon deposits that exert a demand on the terminal electron acceptors of the lake and contribute to problems such as internal nutrient recycling, absence of sediment macrofauna, and flux of toxic metal species into the water column. Quantifying the timing of a lake’s response requires determining the magnitude and lability (i.e., the susceptibility to biodegradation) of the organic carbon within the legacy deposit. This characterization is problematic for organic carbon in sediments because of the presence of different carbon fractions, which range from highly labile to refractory. The lability of carbon under varied conditions was tested with a bioassay approach. The majority of the organic material found in the sediments proved to be conditionally labile, with mineralization potential dependent on prevailing conditions. High labilities were noted under oxygenated conditions and a favorable temperature of 30 °C. Lability decreased when oxygen was removed, and was further reduced when the temperature was dropped to the hypolimnetic average of 8 °C. These results indicate that reversible preservation mechanisms exist in the sediment and are able to protect otherwise labile material from being mineralized under in situ conditions. The concept of an active sediment layer, a region of the sediments in which diagenetic reactions occur (with none occurring below it), was examined through three lines of evidence. First, porewater profiles of oxygen, nitrate, sulfate/total sulfide, ETSA (Electron Transport System Activity: the activity of oxygen, nitrate, iron/manganese, and sulfate), and methane were considered. Examination of the porewater profiles showed that the edge of diagenesis occurred at around 15-20 cm.
Second, historical and contemporary TOC profiles were compared to find the point at which the profiles coincide, indicating the depth at which no change has occurred over the (13 year) interval between core collections. This analysis suggested that no diagenesis has occurred in Onondaga Lake sediment below a depth of 15 cm. Finally, the time to 99% mineralization, t99, was estimated using a literature value of the kinetic rate constant for diagenesis. A t99 of 34 years, corresponding to approximately 30 cm of sediment depth, was obtained for the slowly decaying carbon fraction. Based on these three lines of evidence, an active sediment layer of 15-20 cm is proposed for Onondaga Lake, corresponding to a time since deposition of 15-20 years. While a large legacy deposit of conditionally labile organic material remains in the sediments of Onondaga Lake, preservation (mechanisms that act to shield labile organic carbon from degradation) protects this material from being mineralized and exerting a demand on the terminal electron acceptors of the lake. This has major implications for management of the lake, as it defines the time course of lake recovery following a reduction in nutrient loading.
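The t99 figure above follows directly from first-order decay kinetics. A minimal sketch (the implied rate constant is back-calculated here from the reported 34-year t99 for illustration, and is not necessarily the literature value the study used):

```python
import math

# First-order decay of the labile carbon fraction: C(t) = C0 * exp(-k * t).
# "Time to 99% mineralization" means exp(-k * t99) = 0.01, hence
#   t99 = ln(100) / k  ≈  4.605 / k.
def t99(k):
    """Years to 99% mineralization for a first-order rate constant k (1/yr)."""
    return math.log(100.0) / k

# Back-calculating from the reported t99 of ~34 yr gives k ≈ 0.135 per year
# (an illustrative inversion, not a value taken from the study).
k_implied = math.log(100.0) / 34.0
```

With a roughly constant sediment accumulation rate, 34 years of burial maps onto the ~30 cm depth quoted in the abstract, which is how a kinetic time converts into a sediment depth.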
Abstract:
Rather than discarding Clausewitz’s theory of war in response to the revolutionary changes in modern warfare, this article articulates a broader theory of war based on his concept of the “wondrous trinity,” identifying it as his true legacy. The author shows that the concept of trinitarian war attributed to Clausewitz by his critics, which seems to be applicable only to wars between states, is a caricature of Clausewitz’s theory. He goes on to develop Clausewitz’s theory that war is composed of the three tendencies of violence/force, fighting, and the affiliation of the combatants to a warring community. Each war can be analyzed as being composed of these three tendencies and their opposites.
Abstract:
by Sir George Adam Smith ... ; planned by ... Israel Abrahams and ed. by Edwyn R. Bevan ... With an introduction by the master of Balliol
Abstract:
NA61/SHINE (SPS Heavy Ion and Neutrino Experiment) is a multi-purpose experimental facility to study hadron production in hadron-proton, hadron-nucleus and nucleus-nucleus collisions at the CERN Super Proton Synchrotron. It recorded the first physics data with hadron beams in 2009 and with ion beams (secondary 7Be beams) in 2011. NA61/SHINE has greatly profited from the long development of the CERN proton and ion sources and the accelerator chain as well as the H2 beamline of the CERN North Area. The latter has recently been modified to also serve as a fragment separator as needed to produce the Be beams for NA61/SHINE. Numerous components of the NA61/SHINE set-up were inherited from its predecessors, in particular, the last one, the NA49 experiment. Important new detectors and upgrades of the legacy equipment were introduced by the NA61/SHINE Collaboration. This paper describes the state of the NA61/SHINE facility — the beams and the detector system — before the CERN Long Shutdown I, which started in March 2013.
Abstract:
There is general agreement that banking supervision and resolution have to be organised at the same level. It is often argued, however, that there is no need to tackle deposit insurance because it is too politically sensitive. This note proposes to apply the principles of subsidiarity and reinsurance to deposit insurance: existing national deposit guarantee schemes (DGSs) would continue to operate much as before (with only minimal standards set by an EU directive), but they would be required to take out reinsurance against risks that would be too large for them to cover. A European Reinsurance Fund (EReIF) would provide this reinsurance, financed by premia paid by the national DGSs, just as any reinsurance company does in the private sector. The European fund would pay out only in the case of large losses. This ‘deductible’ would give the national authorities the proper incentives, while the reinsurance cover would stabilize depositor confidence even in the case of large shocks. Ideally, the national DGSs would also be responsible for resolution; experience has shown that banking systems are more stable when deposit insurers are also responsible for resolution. The approach proposed here could thus also be used to design the ‘Single Resolution Mechanism’ (SRM) being discussed as a complement to the ‘Single Supervisory Mechanism’ (SSM). It will of course take time to build up the funding for such a reinsurance fund, so this approach is not meant to deal with legacy problems from the current crisis.
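The deductible mechanism described above can be illustrated with a toy payout split between a national DGS and the reinsurance fund. The function name and the figures are purely hypothetical and not taken from the note:

```python
def drawdown(loss, deductible):
    """Split a deposit-insurance payout: the national DGS bears losses up to
    the deductible; the European reinsurance fund covers only the excess."""
    national = min(loss, deductible)
    reinsured = max(0.0, loss - deductible)
    return national, reinsured
```

For example, with a deductible of 100, a loss of 30 is borne entirely by the national scheme, while a loss of 150 draws 50 from the European fund; small shocks never touch the common pool, which is what preserves national incentives.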
Abstract:
This paper concentrates on the Nixon-Kissinger view of European political integration. In contrast with the mainstream position of the American Administrations during the 1950s and 1960s, Kissinger was convinced that by encouraging European unity, the United States was in fact creating its own rival. The start of a new system of European foreign policy cooperation in 1970 was seen by Kissinger as a particularly important example of Europe’s attempt to challenge American hegemony. Kissinger emphasized the need to keep Western Europe in a subordinate role. Three main lines of action were pursued to keep the development of the European Community under control: maintaining bilateral contacts with key European allies, requesting a seat at the Community's decision-making table, and linking "obedient" European behavior to American military presence in Europe. The legacy of this policy still seems to influence current American policy on the European Union. The Nixon-Kissinger term was, however, detrimental rather than conducive to harmonious transatlantic relations. Tendencies to emulate it should therefore be discouraged.
Abstract:
The high hopes for rapid convergence of the Eastern and Southern EU member states are increasingly being disappointed. With the onset of the Eurocrisis, convergence has given way to divergence in the southern members, and many Eastern members have made little headway in closing the development gap. The EU's performance compares unfavourably with East Asian success cases as well as with Western Europe's own rapid catch-up to the USA after 1945. Historical experience indicates that successful catch-up requires that less-developed economies be allowed, to some extent, to free-ride on an open international economic order. However, the EU's model is based on the principle of a level playing field, which militates against such a form of economic integration. The EU's developmental model thus contrasts with the various strategies that have enabled the successful catch-up of industrial latecomers. Instead, the EU's current approach is more and more reminiscent of the relations between the pre-1945 European empires and their dependent territories. One reason for this unfortunate historical continuity is that the EU appears to have become entangled in its own myths. In the EU's own interpretation, European integration is a peace project designed to overcome the almost continuous warfare that characterised the Westphalian system. As the sovereign state is identified as the root cause of all evil, any project to curtail its room for manoeuvre must ultimately benefit the common good. Yet the existence of a Westphalian system of nation states is a myth. Empires, not states, were the dominant actors in the international system for at least the last three centuries. If anything, the dawn of the age of the sovereign state in Western Europe occurred after 1945 with the disintegration of the colonial empires, and thus historically coincided with the birth of European integration.
Abstract:
The problem of asset price bubbles, and more generally of instability in the financial system, has been a matter of concern since the 1980s but has only recently moved to the center of the macroeconomic policy debate. The main concern with bubbles arises when they burst, imposing losses on investors holding the bubble assets and potentially on the financial institutions that have extended credit to them. Asset price volatility is an inevitable consequence of financial market liberalization and, in extreme cases, generates asset price bubbles, the bursting of which can impose substantial economic and social costs. Policy responses within the existing liberalized financial system face daunting levels of uncertainty and risk. Given the pattern of increasing asset market volatility over recent decades and the policy issues highlighted in this paper, the future looks uncertain. Another significant cycle of asset price movements, especially in one of the major economies, could see a fundamental revision of thinking about the costs and benefits of liberalized financial systems.
Abstract:
In this paper, we present experimental results for monitoring long-distance WDM communication links using a line monitoring system suitable for legacy optically amplified long-haul undersea systems. The monitoring system is based on setting up a simple, passive, low-cost, high-loss optical loopback circuit at each repeater that provides a connection between the existing anti-directional undersea fibres and can be used to locate faults. Fault location is achieved by transmitting a short-pulse supervisory signal along with the WDM data signals; a portion of the overall signal is attenuated and returned to the transmit terminal by the loopback circuit. A special receiver at the terminal extracts the weakly returned supervisory signal, with each supervisory signal received at a different time corresponding to a different optical repeater. Degradation in any repeater therefore appears on its corresponding supervisory signal level. We use a recirculating loop to simulate a 4600 km fibre link, on which a high-loss loopback supervisory system is implemented. Successful monitoring is accomplished through the production of an appropriate supervisory signal at the terminal that is detected and identified in a satisfactory time period after passing through up to 45 dB of attenuation in the loopback circuit. © 2012 Elsevier B.V. All rights reserved.
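The fault-location principle above (each repeater's loopback returns an attenuated copy of the supervisory pulse, whose round-trip delay is fixed by that repeater's distance from the terminal) can be sketched numerically. The group index, repeater spacing, and function names below are assumptions for illustration, not values from the paper:

```python
# The looped-back supervisory pulse travels to repeater i and back at the
# fibre group velocity c / n_g, so its arrival time is t_i = 2 * d_i * n_g / c.
C = 299_792_458.0      # speed of light in vacuum, m/s
N_GROUP = 1.468        # typical group index of silica fibre (assumed)

def roundtrip_delay(distance_km):
    """Round-trip delay (s) to a loopback located `distance_km` from the terminal."""
    return 2 * distance_km * 1e3 * N_GROUP / C

def nearest_repeater(arrival_s, spacing_km, n_repeaters):
    """Map a supervisory-pulse arrival time to the nearest repeater index (1-based),
    assuming uniformly spaced repeaters."""
    per_span = roundtrip_delay(spacing_km)
    i = round(arrival_s / per_span)
    return min(max(i, 1), n_repeaters)
```

With an assumed 100 km spacing on a 4600 km link (46 repeaters), successive loopback returns arrive roughly 1 ms apart, which is what lets the terminal receiver attribute a drop in supervisory level to a specific repeater.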
Abstract:
Half a decade has passed since the objectives and benefits of autonomic computing were stated, yet even the latest system designs and deployments exhibit only limited and isolated elements of autonomic functionality. From an autonomic computing standpoint, all computing systems – old, new or under development – are legacy systems, and will continue to be so for some time to come. In this paper, we propose a generic architecture for developing fully-fledged autonomic systems out of legacy, non-autonomic components, and we investigate how existing technologies can be used to implement this architecture.
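One way to picture the proposed approach (wrapping an unchanged legacy component in an autonomic control loop) is a monitor-analyse-plan-execute cycle. All class names, metrics, and thresholds below are illustrative assumptions, not the paper's actual architecture:

```python
class LegacyService:
    """Stand-in for an existing component with no self-management of its own."""
    def __init__(self):
        self.workers = 2
    def queue_length(self):
        return 17   # pretend metric read through the legacy interface

class AutonomicWrapper:
    """External manager that adds self-optimisation without modifying the legacy code."""
    def __init__(self, legacy, high=10, low=2):
        self.legacy, self.high, self.low = legacy, high, low
    def monitor(self):
        return {"queue": self.legacy.queue_length()}
    def analyse(self, metrics):
        if metrics["queue"] > self.high:
            return "scale_up"
        if metrics["queue"] < self.low:
            return "scale_down"
        return None
    def plan(self, symptom):
        return {"scale_up": +1, "scale_down": -1}.get(symptom, 0)
    def execute(self, delta):
        self.legacy.workers = max(1, self.legacy.workers + delta)
    def loop_once(self):
        self.execute(self.plan(self.analyse(self.monitor())))
```

The design point is that the legacy component is only observed and actuated from outside; all adaptation logic lives in the wrapper, so the same loop can be bolted onto components that were never written with autonomic behaviour in mind.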
Abstract:
Despite high expectations, the industrial take-up of Semantic Web technologies in developing services and applications has been slower than expected. One of the main reasons is that many legacy systems have been developed without considering the potential of the Web for integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a tedious and expensive process, which carries a significant risk of failure. There is an urgent need to provide strategies allowing the migration of legacy systems to Semantic Web Services platforms, and tools to support such strategies. In this paper we propose a methodology and its tool support for transitioning these applications to Semantic Web Services, allowing users to migrate their applications to Semantic Web Services platforms automatically or semi-automatically. The transition of the GATE system is used as a case study. © 2009 - IOS Press and the authors. All rights reserved.