998 results for Legacy Systems


Relevance:

30.00%

Publisher:

Abstract:

Many-core systems offer great potential for application performance through their massively parallel structure. Such systems are currently being integrated into most parts of daily life, from high-end server farms to desktop systems, laptops and mobile devices. Yet these systems face increasing challenges: high temperatures causing physical damage, high electricity bills for both servers and individual users, unpleasant noise levels due to active cooling, and unrealistic battery drainage in mobile devices; factors caused directly by poor energy efficiency. Power management has traditionally been an area of research providing hardware solutions or runtime power management in the operating system in the form of frequency governors. Energy awareness in application software is currently non-existent. This means that applications are not involved in power management decisions, nor does any interface exist between the applications and the runtime system to provide such facilities. Power management in the operating system is therefore performed purely on the basis of indirect implications of software execution, usually referred to as the workload. This often results in over-allocation of resources and hence power waste. This thesis discusses power management strategies in many-core systems in the form of increased application software awareness of energy efficiency. The presented approach allows meta-data descriptions in the applications and is manifested in two design recommendations: 1) energy-aware mapping and 2) energy-aware execution, which allow the applications to directly influence power management decisions. The recommendations eliminate over-allocation of resources and increase the energy efficiency of the computing system. Both recommendations are fully supported by a provided interface in combination with a novel power management runtime system called Bricktop. The work presented in this thesis allows both new and legacy software to execute with the most energy-efficient mapping on a many-core CPU and at the most energy-efficient performance level. A set of case study examples demonstrates real-world energy savings in a wide range of applications without performance degradation.
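Bricktop's actual interface is not described in the abstract, so the following is only a minimal sketch of the idea of energy-aware mapping and execution: an application exposes meta-data about a computation phase, and the runtime derives a core allocation and performance level from that meta-data rather than from the observed workload. All names here (EnergyHint, choose_mapping, the fields) are hypothetical stand-ins.

```python
# Hypothetical illustration of energy-aware mapping/execution hints.
# The names are invented for this sketch; the abstract does not
# specify Bricktop's API.
from dataclasses import dataclass

@dataclass
class EnergyHint:
    parallelism: int        # how many cores the phase can actually use
    memory_bound: bool      # memory-bound phases gain little from high clocks
    deadline_ms: float      # soft deadline; slack allows lower frequencies

def choose_mapping(hint: EnergyHint, available_cores: int) -> dict:
    """Pick a core count and performance level from application meta-data
    instead of inferring them from the observed workload."""
    cores = min(hint.parallelism, available_cores)   # no over-allocation
    # Memory-bound or slack-rich phases run at a lower DVFS level.
    level = "low" if hint.memory_bound or hint.deadline_ms > 100 else "high"
    return {"cores": cores, "perf_level": level}

print(choose_mapping(EnergyHint(parallelism=4, memory_bound=True,
                                deadline_ms=250), available_cores=16))
```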

Relevance:

30.00%

Publisher:

Abstract:

Sharing information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that it is imperative to have well-organized schemes for retrieval and also for discovery. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal, so that a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) System. ECRS is an essentially distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them prone to collapse upon the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which employs five different technologies at the five tiers of the architecture. The results of the experiments conducted and their analysis show that such an architecture helps to keep the different components of the software insulated from internal or external faults. The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. The other empirical study sought to find out which of the two prominent markup languages, HTML and XML, is better suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken; the result was in favor of XML. The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and distils the information required to satisfy the need of the information discoverer from the documents available at its disposal (the information space). The various components of the system and their interaction follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interact with the multiple infotron dictionaries maintained in the system. In order to demonstrate the working of the IDS and to discover information without modification of a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, leading to an information discovery service. IDLIS demonstrates IDS in action and proves that any legacy system can be effectively augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
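The thesis does not specify the internal format of infotrons or of the infotron dictionary, so the following is a minimal sketch under assumptions: an infotron is treated as a keyed unit of information, and the dictionary maps a clue term to related terms. IDS-style discovery then expands the supplied clues through the dictionary and matches the expansion against the information space.

```python
# Minimal sketch of infotron-dictionary-guided discovery. The structures
# are hypothetical; the thesis does not specify the infotron format.
from collections import deque

# An "infotron dictionary" mapping a clue term to related infotron terms.
infotron_dictionary = {
    "election": ["constituency", "counting", "report"],
    "constituency": ["district", "votes"],
}

def discover(clues, documents):
    """Expand clue infotrons via the dictionary, then return documents
    (the information space) that mention any expanded term."""
    seen, queue = set(), deque(clues)
    while queue:
        term = queue.popleft()
        if term in seen:
            continue
        seen.add(term)
        queue.extend(infotron_dictionary.get(term, []))
    return [doc for doc in documents if any(t in doc.lower() for t in seen)]

docs = ["District counting report for ward 7", "Library circulation stats"]
print(discover(["election"], docs))   # -> the counting report
```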

Relevance:

30.00%

Publisher:

Abstract:

Human activities have been interfering with the natural biogeochemical cycles of trace elements since the time of the ancient civilizations. Although inaccessible and remote, high mountain lake catchments are demonstrably contaminated with trace elements from anthropogenic emissions, which can travel by long-range atmospheric transport before being deposited, as several natural archives have revealed. High mountain lake catchments are thus excellent sentinels of long-range contamination. Continuous accumulation can lead to a build-up of potentially toxic trace elements in these remote, or relatively remote, ecosystems. The thesis focuses on the biogeochemistry of a suite of trace elements of environmental concern (Ni, Cu, Zn, As, Se, Cd and Pb) in Pyrenean lake catchments, with special emphasis on discerning the "natural" components from the "anthropogenic" contributions. Five other metallic elements (Al, Fe, Ti, Mn and Zr) were also studied to trace natural fluxes and biogeochemical processes within the lake catchment systems.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a case study of analyzing a legacy PL/1 ecosystem that has grown for 40 years to support the business needs of a large banking company. To support the stakeholders in analyzing it, we developed St1-PL/1, a tool that parses the code for association data and computes structural metrics, which it then visualizes using top-down interactive exploration. Before building the tool, and again after demonstrating it to stakeholders, we conducted several interviews to learn about requirements for legacy ecosystem analysis. We briefly introduce the tool and then present the results of analyzing the case study. We show that although the vision for the future is an ecosystem architecture in which systems are as decoupled as possible, the current state of the ecosystem is still far removed from this. We also present some of the lessons learned during our discussions with stakeholders, including their interest in automatically assessing the quality of the legacy code.
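The paper does not detail which structural metrics St1-PL/1 computes, so this is only an illustrative sketch of one metric such a tool might derive from parsed association data: fan-in/fan-out coupling between programs. The module names and associations are invented.

```python
# Hypothetical sketch: coupling metrics from call/include associations
# extracted by a parser from a PL/1 codebase.
from collections import defaultdict

# (caller, callee) associations extracted by a parser.
associations = [("PAYROLL", "TAXCALC"), ("PAYROLL", "DBIO"),
                ("LEDGER", "DBIO"), ("LEDGER", "TAXCALC"),
                ("REPORTGEN", "DBIO")]

fan_in, fan_out = defaultdict(int), defaultdict(int)
for caller, callee in associations:
    fan_out[caller] += 1
    fan_in[callee] += 1

# High fan-in modules are coupling hot spots: many systems depend on
# them, so they resist the decoupling the stakeholders envision.
for module in sorted(set(fan_in) | set(fan_out)):
    print(f"{module:10s} fan-in={fan_in[module]} fan-out={fan_out[module]}")
```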

Relevance:

30.00%

Publisher:

Abstract:

This paper evaluates the performance of the most popular power saving mechanisms defined in the IEEE 802.11 standard, namely the Power Save Mode (Legacy-PSM) and the Unscheduled Automatic Power Save Delivery (U-APSD). The assessment comprises a detailed study of energy efficiency and of the capability to guarantee the Quality of Service (QoS) required by a given application. The results, obtained in the OMNeT++ simulator, show that U-APSD is more energy efficient than Legacy-PSM without compromising the end-to-end delay. Both U-APSD and Legacy-PSM proved capable of guaranteeing the application QoS requirements in all the studied scenarios. However, unlike with U-APSD, when Legacy-PSM is used in the presence of QoS-demanding applications, all the stations connected to the network through the same access point consume noticeably more energy.
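As a rough illustration of why awake time dominates this comparison, here is a back-of-the-envelope energy model. The power values and awake fractions are assumptions for illustration only, not figures from the paper or from the 802.11 standard.

```python
# Illustrative energy model for power-save comparisons of this kind.
P_AWAKE, P_SLEEP = 1.2, 0.05   # watts; assumed radio power states

def station_energy(total_s, awake_fraction):
    """Energy (J) for a station awake a given fraction of the time."""
    awake = total_s * awake_fraction
    return awake * P_AWAKE + (total_s - awake) * P_SLEEP

# U-APSD wakes mainly for its own triggered frames; with Legacy-PSM,
# stations sharing the AP also stay awake for beacon/TIM traffic caused
# by QoS-demanding neighbours, raising their awake fraction.
print("U-APSD    :", station_energy(60, 0.10), "J")
print("Legacy-PSM:", station_energy(60, 0.35), "J")
```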

Relevance:

30.00%

Publisher:

Abstract:

Despite more than two decades of transition from a centrally planned to a market-oriented economy, Myanmar's transition is still only partly complete. The government's initial strategy for dealing with the swelling deficits of the state economic enterprises (SEEs) was to put them under direct control in order to scrutinize their expenditures. This policy change postponed restructuring and exacerbated the SEEs' soft budget constraint problem. While the installation of a new government in March 2011 has improved prospects for economic development, sustainable growth still requires full-scale structural reform of the SEEs and the building of institutional infrastructure. Myanmar can learn from the gradual approaches to economic transition in China and Vietnam, where partial reforms weakened further impetus for reform.

Relevance:

30.00%

Publisher:

Abstract:

High-quality software, delivered on time and on budget, constitutes a critical part of most products and services in modern society. Our government has invested billions of dollars to develop software assets, often redeveloping the same capability many times. Recognizing the waste involved in redeveloping these assets, in 1992 the Department of Defense issued the Software Reuse Initiative, whose vision was "to drive the DoD software community from its current 're-invent the software' cycle to a process-driven, domain-specific, architecture-centric, library-based way of constructing software." Twenty years after this initiative was issued, there is evidence that the vision is beginning to be realized in nonembedded systems. However, virtually every large embedded system undertaken has incurred large cost and schedule overruns, and investigations into the root cause of these overruns implicate reuse. Why are we seeing improved outcomes in large-scale nonembedded systems and worse outcomes in embedded systems? This question is the foundation of this research. The experiences of the aerospace industry have led to a number of questions about reuse and how the industry employs it in embedded systems. For example, does reuse in embedded systems yield the same outcomes as in nonembedded systems? Are the outcomes positive? If the outcomes differ, it may indicate that embedded systems should not use data from nonembedded systems for estimation. Do embedded systems use the same development approaches as nonembedded systems? Does the development approach make a difference? If embedded systems develop software differently from nonembedded systems, the same processes may not apply to both types of systems. What about the reuse of different artifacts? Perhaps certain artifacts, when reused, contribute more, or are more difficult to use, in embedded systems. Finally, what are the success factors and obstacles to reuse, and are they the same in embedded systems as in nonembedded systems? The research in this dissertation comprises a series of empirical studies with professionals in the aerospace and defense industry as its subjects. The main focus has been to investigate the reuse practices of embedded systems professionals and nonembedded systems professionals and to compare the methods and artifacts used against the outcomes. The research followed a combined qualitative and quantitative design. The qualitative data were collected by surveying software and systems engineers, interviewing senior developers, and reading numerous documents and other studies. The quantitative data were derived by converting survey and interview respondents' answers into codes that could be counted and measured. From the search of the existing empirical literature, we learned that reuse in embedded systems is in fact significantly different from that in nonembedded systems, particularly in effort for model-based development approaches and in quality where the development approach was not specified. The questionnaire showed differences between the development approaches used in embedded and nonembedded projects; in particular, embedded systems were significantly more likely to use a heritage/legacy development approach. There was also a difference in the artifacts used, with embedded systems more likely to reuse hardware, test products, and test clusters. Nearly all the projects reported reusing code, but the questionnaire showed that the reuse of code brought mixed results. One of the differences expressed by the questionnaire respondents was the difficulty of reusing code in embedded systems when the platform changed. Semi-structured interviews were then performed to explain why the phenomena observed in the literature review and the questionnaire occur. We asked respected industry professionals, such as senior fellows, fellows and distinguished members of technical staff, about their experiences with reuse. We learned that many embedded systems use heritage/legacy development approaches because their systems have been around for many years, since before models and modeling tools became available. We learned that reuse of code is beneficial primarily when the code does not require modification; especially in embedded systems, once it has to be changed, reuse of code yields few benefits. Finally, while platform independence is a goal for many in nonembedded systems, it is certainly not a goal for embedded systems professionals, and in many cases it is a detriment. However, both embedded and nonembedded systems professionals endorsed the idea of platform standardization. We conclude that while reuse in embedded and nonembedded systems differs today, the two are converging: as heritage embedded systems are phased out, models become more robust and platforms are standardized, reuse in embedded systems will become more like that in nonembedded systems.
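As an illustration of the quantitative side of this design (coded survey answers compared across groups), here is a minimal sketch using a chi-square test of association. The counts and the use of scipy are assumptions for illustration, not data or methods taken from the dissertation.

```python
# Sketch: survey answers coded into counts, then tested for association
# between system type and development approach. Counts are invented.
from scipy.stats import chi2_contingency

#                 heritage/legacy   model-based
table = [[34, 11],    # embedded projects
         [12, 29]]    # nonembedded projects

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")   # small p -> approaches differ
```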

Relevance:

30.00%

Publisher:

Abstract:

The problem of asset price bubbles, and more generally of instability in the financial system, has been a matter of concern since the 1980s but has only recently moved to the center of the macroeconomic policy debate. The main concern with bubbles arises when they burst, imposing losses on investors holding the bubble assets and potentially on the financial institutions that have extended credit to them. Asset price volatility is an inevitable consequence of financial market liberalization and, in extreme cases, generates asset price bubbles, the bursting of which can impose substantial economic and social costs. Policy responses within the existing liberalized financial system face daunting levels of uncertainty and risk. Given the pattern of increasing asset market volatility over recent decades and the policy issues highlighted in this paper, the future looks uncertain. Another significant cycle of asset price movements, especially in one of the major economies, could see a fundamental revision of thinking about the costs and benefits of liberalized financial systems.

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The international nuclear community continues to face the challenge of managing both legacy waste and the new wastes that emerge from ongoing energy production. The UK is in the early stages of proposing a new convention for its nuclear industry: waste minimisation through close management of the radioactive source that creates the waste. This paper proposes a new technique, the waste and source material operability study (WASOP), to qualitatively analyse a complex, waste-producing system in order to minimise avoidable waste and thus increase protection of the public and the environment. Design/methodology/approach – WASOP critically considers the systemic impact of upstream and downstream facilities on the minimisation of nuclear waste in a facility. Based on the principles of HAZOP, the technique structures managers' thinking about the impact of mal-operations in interlinked facilities in order to identify preventative actions that reduce the effect of those mal-operations on waste production. Findings – WASOP was tested with a small group of experienced nuclear regulators and was found to support their qualitative examination of waste minimisation and to help them work towards developing a plan of action. Originality/value – Given the newness of this convention, the wider methodology in which WASOP sits is still in development. However, this paper communicates the latest thinking from nuclear regulators on decision-making methodology for supporting waste minimisation, and it is hoped that it will form part of future regulatory guidance. WASOP is believed to have widespread potential application to the minimisation of many other forms of waste, including waste from other energy sectors and household/general waste.
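WASOP itself is a qualitative technique, but its HAZOP lineage suggests a structured guideword sweep over interlinked facilities. The sketch below only illustrates that structure; the facilities and guidewords are invented, and this is not the regulators' actual method.

```python
# Illustrative HAZOP-style sweep: guidewords applied to interlinked
# facilities to prompt discussion of mal-operations and waste impact.
GUIDEWORDS = ["no flow", "more", "less", "as well as", "other than"]
FACILITIES = ["fuel pond", "effluent treatment", "waste store"]

def wasop_prompts(facilities, guidewords):
    """Yield structured prompts pairing each facility with each guideword,
    asking how the mal-operation would propagate to waste production."""
    for upstream in facilities:
        for word in guidewords:
            yield (f"If '{upstream}' experiences '{word}', what waste "
                   f"impact propagates downstream, and what preventative "
                   f"action applies?")

for prompt in list(wasop_prompts(FACILITIES, GUIDEWORDS))[:3]:
    print(prompt)
```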

Relevance:

30.00%

Publisher:

Abstract:

This poster presentation from the May 2015 Florida Library Association Conference, along with the Everglades Explorer discovery portal at http://ee.fiu.edu, demonstrates how traditional bibliographic and curatorial principles can be applied to: 1) selection, cross-walking and aggregation of metadata linking end-users to widespread digital resources held in multiple silos; 2) harvesting of selected PDFs, HTML and media for web archiving and access; and 3) selection of CMS domains, sub-domains and folders for targeted searching using an API. Choosing content for this discovery portal is comparable to the past scholarly practice of creating and publishing subject bibliographies, except that the metadata and data are housed in relational databases. This new and yet traditional capacity coincides with the growth of bibliographic utilities (MarcEdit), the evolution of open-source discovery systems (eXtensible Catalog), the development of target-capable web crawling and archiving systems (Archive-It), and specialized search APIs (Google). At the same time, historical and technical changes, specifically the increasing fluidity and re-purposing of syndicated metadata, make this possible. It equally stems from the expansion of freely accessible digitized legacy and born-digital resources. Innovation principles helped frame the process by which the thematic Everglades discovery portal was created at Florida International University. The path to more effective searching and co-location of digital scientific, educational and historical material related to the Everglades is contextualized through five concepts found in Dyer and Christensen's "The Innovator's DNA: Mastering the Five Skills of Disruptive Innovators" (2011). The project also aligns with Ranganathan's Laws of Library Science, especially the 4th Law: to "save the time of the user."
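A minimal sketch of the cross-walking step described in 1): records harvested from different silos are mapped onto one target schema before aggregation and indexing. The schemas, field names and record below are invented for illustration.

```python
# Sketch of metadata cross-walking: map source-schema fields onto a
# unified portal schema. Field mappings are simplified and invented.
CROSSWALKS = {
    "marc":        {"245a": "title", "100a": "creator", "856u": "url"},
    "dublin_core": {"dc:title": "title", "dc:creator": "creator",
                    "dc:identifier": "url"},
}

def crosswalk(record, source_schema):
    """Map a source record's fields onto the portal's unified schema."""
    mapping = CROSSWALKS[source_schema]
    return {target: record[field]
            for field, target in mapping.items() if field in record}

marc_rec = {"245a": "Everglades hydrology report", "856u": "http://ee.fiu.edu"}
print(crosswalk(marc_rec, "marc"))
```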

Relevance:

20.00%

Publisher:

Abstract:

The development and maintenance of the seal of the root canal system is key to the success of root canal treatment. Resin-based adhesive materials have the potential to reduce microleakage of the root canal because of their adhesive properties and penetration into the dentinal walls. Moreover, the irrigation protocol may influence the adhesion of resin-based sealers to root dentin. The objective of the present study was to evaluate the effect of different irrigation protocols on coronal bacterial microleakage of gutta-percha/AH Plus and Resilon/Real Seal Self-Etch systems. One hundred and ninety pre-molars were used. The teeth were divided into 18 experimental groups according to the irrigation protocols and filling materials used. The protocols were: distilled water; sodium hypochlorite (NaOCl)+EDTA; NaOCl+H3PO4; NaOCl+EDTA+chlorhexidine (CHX); NaOCl+H3PO4+CHX; CHX+EDTA; CHX+H3PO4; CHX+EDTA+CHX; and CHX+H3PO4+CHX. Gutta-percha/AH Plus or Resilon/Real Seal SE were used as root-filling materials. Coronal microleakage was evaluated for 90 days against Enterococcus faecalis. Data were statistically analyzed using the Kaplan-Meier survival test and the Kruskal-Wallis and Mann-Whitney tests. No significant difference was verified between the groups using chlorhexidine or sodium hypochlorite during chemo-mechanical preparation followed by EDTA or phosphoric acid for smear layer removal. The same was found for the filling materials. However, the statistical analyses revealed that a final flush with 2% chlorhexidine significantly reduced coronal microleakage. A final flush with 2% chlorhexidine after smear layer removal reduces coronal microleakage of teeth filled with gutta-percha/AH Plus or Resilon/Real Seal SE.
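For readers unfamiliar with applying Kaplan-Meier analysis to microleakage data of this kind, a minimal sketch follows, assuming the Python lifelines package and invented observations: the event is the first day leakage is detected, and teeth still sealed at 90 days are censored.

```python
# Sketch of Kaplan-Meier survival analysis for microleakage data.
# "Death" = first day E. faecalis leakage is detected; teeth that remain
# leak-free at day 90 are censored. Observations below are invented.
from lifelines import KaplanMeierFitter

days_to_leak = [30, 45, 90, 90, 60, 90, 75, 90]   # per tooth
leaked       = [1,  1,  0,  0,  1,  0,  1,  0]    # 0 = censored at 90 d

kmf = KaplanMeierFitter()
kmf.fit(days_to_leak, event_observed=leaked, label="2% CHX final flush")
print(kmf.survival_function_)   # proportion still sealed over time
```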

Relevance:

20.00%

Publisher:

Abstract:

To evaluate the effectiveness of Reciproc for the removal of cultivable bacteria and endotoxins from root canals in comparison with multifile rotary systems. The root canals of forty human single-rooted mandibular pre-molars were contaminated with an Escherichia coli suspension for 21 days and randomly assigned to four groups according to the instrumentation system (n = 10 per group): GI - Reciproc (VDW); GII - Mtwo (VDW); GIII - ProTaper Universal (Dentsply Maillefer); and GIV - FKG Race™ (FKG Dentaire). Bacterial and endotoxin samples were taken with a sterile/apyrogenic paper point before (s1) and after instrumentation (s2). Culture techniques determined the colony-forming units (CFU), and the Limulus Amebocyte Lysate assay was used for endotoxin quantification. Results were submitted to the paired t-test and ANOVA. At s1, bacteria and endotoxins were recovered from 100% of the root canals investigated (40/40). After instrumentation, all systems were associated with a highly significant reduction of the bacterial load and endotoxin levels, respectively: GI - Reciproc (99.34% and 91.69%); GII - Mtwo (99.86% and 83.11%); GIII - ProTaper (99.93% and 78.56%); and GIV - FKG Race™ (99.99% and 82.52%) (P < 0.001). No statistical differences were found amongst the instrumentation systems regarding bacteria and endotoxin removal (P > 0.01). The reciprocating single file, Reciproc, was as effective as the multifile rotary systems for the removal of bacteria and endotoxins from root canals.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to evaluate, by photoelastic analysis, the stress distribution on short and long implants of two dental implant systems supporting 2-unit implant-supported fixed partial prostheses with crown heights of 8 mm and 13 mm. Sixteen photoelastic models were divided into 4 groups: I - long implant (5 × 11 mm, Neodent); II - long implant (5 × 11 mm, Bicon); III - short implant (5 × 6 mm, Neodent); and IV - short implant (5 × 6 mm, Bicon). The models were positioned in a circular polariscope associated with a load cell, and static axial (0.5 kgf) and non-axial (15°, 0.5 kgf) loads were applied to each group for both prosthetic crown heights. Three-way ANOVA was used to compare the factors implant length, crown height, and implant system (α = 0.05). The results showed that implant length was a statistically significant factor for both axial and non-axial loading. The 13 mm prosthetic crown did not result in statistically significant differences in stress distribution between the implant systems and implant lengths studied, regardless of load type (P > 0.05). It can be concluded that short implants showed higher stress levels than long implants, and that implant system and length were not relevant factors when prosthetic crown height was increased.

Relevance:

20.00%

Publisher:

Abstract:

After a long incubation period, the Intergovernmental Platform on Biodiversity and Ecosystem Services (IPBES) is now underway. Underpinning all its activities is the IPBES Conceptual Framework (CF), a simplified model of the interactions between nature and people. Drawing on the legacy of previous large-scale environmental assessments, the CF goes further in explicitly embracing different disciplines and knowledge systems (including indigenous and local knowledge) in the co-construction of assessments of the state of the world's biodiversity and the benefits it provides to humans. The CF can be thought of as a kind of Rosetta Stone that highlights commonalities between diverse value sets and seeks to facilitate cross-disciplinary and cross-cultural understanding. We argue that the CF will contribute to the increasing trend towards interdisciplinarity in understanding and managing the environment. Rather than displacing disciplinary science, however, we believe that the CF will provide new contexts of discovery and policy application for it.

Relevance:

20.00%

Publisher:

Abstract:

This study evaluated the dentine bond strength (BS) and the antibacterial activity (AA) of six adhesives against strict anaerobic and facultative bacteria. Three adhesives containing antibacterial components (Gluma 2Bond (glutaraldehyde)/G2B, Clearfil SE Protect (MDPB)/CSP and Peak Universal Bond (chlorhexidine)/PUB) and the versions of the same adhesives without antibacterial agents (Gluma Comfort Bond/GCB, Clearfil SE Bond/CSB and Peak LC Bond/PLB) were tested. The AA of the adhesives and control groups was evaluated by the direct contact method against four strict anaerobic and four facultative bacteria. After incubation for the periods appropriate to each microorganism, the time required to kill the microorganisms was measured. For BS, the adhesives were applied according to the manufacturers' recommendations and the teeth were restored with composite. Teeth (n=10) were sectioned to obtain bonded beam specimens, which were tested after artificial saliva storage for one week and for one year. BS data were analyzed using two-way ANOVA and the Tukey test. Saliva storage for one year reduced the BS only for GCB. In general, G2B and GCB required at least 24 h to kill the microorganisms. PUB and PLB killed only strict anaerobic microorganisms after 24 h. For CSP, the average time to eliminate Streptococcus mutans and strict anaerobic oral pathogens was 30 min. CSB showed no AA against facultative bacteria but had AA against some strict anaerobic microorganisms. Storage time had no effect on the BS for most of the adhesives. The time required to kill bacteria depended on the type of adhesive and was never less than 10 min. Most of the adhesives showed stable bond strength after one year, and Clearfil SE Protect may be a good alternative for restorative procedures performed on dentine, considering its adequate bond strength and better antibacterial activity.