969 results for "help system"
Abstract:
In his discussion - Near Term Computer Management Strategy For Hospitality Managers and Computer System Vendors - William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, initially states: “The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools.” At the time of this writing, O'Brien would have you know that, contrary to what some people might think, the computer revolution is not over; it is still in its embryonic stage. Computer technology will only continue to develop and expand, says O'Brien with citation. “A complacent few of us who feel 'we have survived the computer revolution' will miss opportunities as a new wave of technology moves through the hospitality industry,” says O'Brien. “Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation,” is his informed opinion. Property managers who embrace rather than eschew innovation - in this case computer technology - will benefit greatly from it, O'Brien says. “The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology,” he advises. On the vendor side of the equation, O'Brien observes, “Computer-wise hospitality managers want systems which are easier and more profitable to operate. Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…” He warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turn-key vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has in effect morphed into a software revolution, O'Brien notes, “…recognizing that a computer is nothing but a box in which programs run.” Some of the empirical data in this article is dated by now, but its core message - that advancing technology rewards properties that continually tap current knowledge - remains sound.
Abstract:
In their discussion - Database System for Alumni Tracking - Steven Moll, Associate Professor, and William O'Brien, Assistant Professor, School of Hospitality Management at Florida International University, initially state: “The authors describe a unique database program which was created to solve problems associated with tracking hospitality majors subsequent to graduation.” “…and please, whatever you do, keep in touch with your school; join an alum’ organization. It is a great way to engage the resources of your school to help further your career,” said Professor Claudia Castillo while addressing a group of students attending her Life after College seminar on 9/18/2009. The point is well taken and obviously germane to the article at hand. “One of the greatest strengths of a hospitality management school, a strength that grows with each passing year, is its body of alumni,” say the authors. “Whether in recruiting new students or placing graduates, whether in fund raising or finding scholarship recipients, whatever the task, the network of loyal alumni stands ready to help.” The caveat, say Moll and O'Brien, is that these resources are only available if students and school, faculty and alumni can keep track of each other. The authors want you to know that this practice is now considered essential to success, especially in the hospitality industry, where the fluid nature of the business makes networking de rigueur. “When the world was a smaller, slower place, it was fairly easy for graduates to keep track of each other; there weren't that many graduates and they didn't move that often,” say the authors. “Now the hospitality graduate enters an international job market and may move five times in the first four years of employment,” they add. In the contemporary environment, linking human resources from institution to marketplace is comparatively easy to do. “How can an association keep track of its graduates? There are many techniques, but all of them depend upon adequate recordkeeping,” Moll and O'Brien answer their own query. “A few years ago that would have meant a group of secretaries; today it means a database system,” they say. Moll and O'Brien discuss the essentials of compiling and programming such a comprehensive database: the body of information to include, guidelines on the problems encountered, and how to avoid the pitfalls. They use the Florida International University hospitality database as the template for their example.
Abstract:
Since the establishment of the evaluation system in 1975, the junior colleges in the Republic of China (Taiwan) have gone through six formal evaluations. Evaluation in schooling, like quality control in business, should be a systematic, formal, and continual process, and it can serve as a strategy for refining the quality of education. The purpose of this research is to explore the current practice of junior college evaluation in Taiwan, providing insight into the development and quality of the current evaluation system. The study also identifies the sources of problems with the current evaluation system and offers suggestions for improvement. To attain these purposes, the research was undertaken in both theoretical and practical ways. Theoretically, a literature review established the theories of educational evaluation and, following the course and principles of their development, provided a view of current practice in Taiwan. Practically, questionnaires were used to obtain and analyze the views of evaluation committee members, junior college presidents, and administrators on evaluation models, methods, contents, organization, functions, criteria, grade reports, and other matters, together with their suggestions for improvement. The findings conclude that most evaluators and evaluatees think the purpose of evaluation is to help the colleges explore their difficulties and problems. In addition, significant differences were found between the two groups regarding evaluation methods, contents, organization, functions, criteria, grade reports, and other matters; analysis of these data forms the basis for an improved method of evaluation for junior colleges in Taiwan.
Abstract:
Modern IT infrastructures are constructed from large-scale computing systems and administered by IT service providers. Manually maintaining such large computing systems is costly and inefficient. Service providers therefore seek automatic or semi-automatic methodologies for detecting and resolving system issues in order to improve their service quality and efficiency. This dissertation investigates several data-driven approaches for assisting service providers in achieving this goal. The problems studied by these approaches can be categorized into three aspects of the service workflow: 1) preprocessing raw textual system logs into structured events; 2) refining monitoring configurations to eliminate false positives and false negatives; 3) improving the efficiency of system diagnosis on detected alerts. Solving these problems usually requires a large amount of domain knowledge about the particular computing systems. The approaches investigated in this dissertation are built on event mining algorithms, which can automatically derive part of that knowledge from historical system logs, events and tickets. In particular, two textual clustering algorithms are developed for converting raw textual logs into system events. For refining the monitoring configuration, a rule-based alert prediction algorithm is proposed for eliminating false alerts (false positives) without losing any real alert, and a textual classification method is applied to identify missing alerts (false negatives) from manual incident tickets. For system diagnosis, this dissertation presents an efficient algorithm for discovering temporal dependencies between system events with corresponding time lags, which can help administrators determine the redundancies of deployed monitoring situations and the dependencies of system components. To improve the efficiency of incident ticket resolution, several KNN-based algorithms that recommend relevant historical tickets with resolutions for incoming tickets are investigated. Finally, this dissertation offers a novel algorithm for searching for similar textual event segments over large system logs, which assists administrators in locating similar system behaviors in the logs. Extensive empirical evaluation on system logs, events and tickets from real IT infrastructures demonstrates the effectiveness and efficiency of the proposed approaches.
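To make the KNN-based recommendation idea concrete, here is a minimal sketch (an illustration only, not the dissertation's algorithms) that indexes historical ticket summaries with TF-IDF and retrieves the nearest resolved tickets for a new incident; the example tickets and the use of scikit-learn are assumptions.

```python
# Toy sketch: recommend resolutions for a new ticket by nearest-neighbor
# search over historical ticket summaries (TF-IDF + cosine distance).
# The sample tickets and the choice of scikit-learn are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestNeighbors

historical = [
    ("disk /var 95% full on host-a", "rotated logs and extended the volume"),
    ("nightly backup job timed out", "increased job timeout and rescheduled"),
    ("high CPU usage by java process", "restarted service and raised heap limit"),
    ("disk full on database server", "archived old partitions to free space"),
]

texts = [summary for summary, _ in historical]
vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(texts)

knn = NearestNeighbors(n_neighbors=2, metric="cosine").fit(matrix)

new_ticket = "filesystem /var is running out of space"
distances, indices = knn.kneighbors(vectorizer.transform([new_ticket]))

for dist, idx in zip(distances[0], indices[0]):
    summary, resolution = historical[idx]
    print(f"similar ticket: {summary!r} -> suggested resolution: {resolution!r} (distance {dist:.2f})")
```

Cosine distance over TF-IDF vectors is a common default for short ticket texts; the dissertation investigates more refined KNN-based variants of this basic scheme.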
Abstract:
This thesis research describes the design and implementation of a Semantic Geographic Information System (GIS) and the creation of its spatial database. The database schema is designed and created, and all textual and spatial data are loaded into the database with the help of the Semantic DBMS's Binary Database Interface, currently being developed at FIU's High Performance Database Research Center (HPDRC). A friendly graphical user interface is created together with the system's other main areas: the display process, data animation, and data retrieval. All these components are tightly integrated to form a novel and practical semantic GIS that has facilitated the interpretation, manipulation, analysis, and display of spatial data such as Ocean Temperature, Ozone (TOMS), and simulated SeaWiFS data. At the same time, this system has played a major role in testing the HPDRC's high-performance, efficient parallel Semantic DBMS.
Abstract:
This paper deals with a very important issue in any knowledge engineering discipline: the accurate representation and modelling of real life data and its processing by human experts. The work is applied to the GRiST Mental Health Risk Screening Tool for assessing risks associated with mental-health problems. The complexity of risk data and the wide variations in clinicians' expert opinions make it difficult to elicit representations of uncertainty that are an accurate and meaningful consensus. It requires integrating each expert's estimation of a continuous distribution of uncertainty across a range of values. This paper describes an algorithm that generates a consensual distribution at the same time as measuring the consistency of inputs. Hence it provides a measure of the confidence in the particular data item's risk contribution at the input stage and can help give an indication of the quality of subsequent risk predictions. © 2010 IEEE.
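The abstract does not reproduce the algorithm itself; the sketch below is only a toy of the general idea, pooling discrete uncertainty distributions supplied by several experts into a consensus and scoring their consistency. The value grid, the expert inputs and the total-variation scoring rule are assumptions for illustration, not the GRiST method.

```python
# Toy illustration only: pool expert-supplied discrete distributions into a
# consensus and score how consistent the experts are with one another.
# The value grid, expert inputs and the scoring rule are assumptions.
import numpy as np

values = np.linspace(0.0, 1.0, 11)          # risk-contribution grid (assumed)
experts = np.array([
    [0, 0, 1, 3, 5, 3, 1, 0, 0, 0, 0],      # expert A's weighting over the grid
    [0, 0, 0, 2, 5, 4, 2, 1, 0, 0, 0],      # expert B
    [0, 1, 2, 4, 4, 2, 1, 0, 0, 0, 0],      # expert C
], dtype=float)

# Normalise each expert's input to a probability distribution.
experts /= experts.sum(axis=1, keepdims=True)

# Consensus: the (unweighted) mean distribution across experts.
consensus = experts.mean(axis=0)

# Consistency: 1 minus the mean total-variation distance to the consensus,
# so 1.0 means perfect agreement and values near 0 mean strong disagreement.
tv = 0.5 * np.abs(experts - consensus).sum(axis=1)
consistency = 1.0 - tv.mean()

print("consensus:", np.round(consensus, 3))
print("consistency score:", round(float(consistency), 3))
```

A score like this could flag data items where expert opinion diverges, which is the role the paper assigns to its consistency measure at the input stage.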
Abstract:
The main goal of this work is to determine the true cost incurred by the Republic of Ireland and Northern Ireland in order to meet their EU renewable electricity targets. The primary all-island of Ireland policy goal is that 40% of electricity will come from renewable sources in 2020. From this it is expected that wind generation on the Irish electricity system will be in the region of 32-37% of total generation. This raises issues resulting from wind energy being a non-synchronous, unpredictable and variable source of energy used on a scale never seen before on a single synchronous system. If changes are not made to traditional operational practices, the efficient running of the electricity system will be directly affected by these issues in the coming years. Using models of the electricity system for the all-island grid of Ireland, the effects of the high wind energy penetration expected in 2020 are examined. These models were developed using a unit commitment, economic dispatch tool called PLEXOS, which allows a detailed representation of the electricity system to be achieved down to individual generator level. The models replicate the true running of the electricity system through day-ahead scheduling and a semi-relaxed use of these schedules that reflects the Transmission System Operator's real-time decision making on dispatch. In addition, they carefully consider other non-wind priority dispatch generation technologies that affect the overall system. In the models developed, three main issues associated with wind energy integration were selected for detailed examination in order to determine the sensitivity of assumptions presented in other studies: wind energy's non-synchronous nature, its variability and spatial correlation, and its unpredictability. This leads to an examination of the effects in three areas: the system operation constraints required for system security; different onshore-to-offshore ratios of installed wind energy; and the degree of accuracy in wind energy forecasting. Each of these areas directly impacts the way in which the electricity system is run, as each addresses one of the three issues associated with wind energy stated above. It is shown that assumptions in these three areas have a large effect on the results in terms of total generation costs, wind curtailment and the dispatch of different generator technology types. In particular, accounting for these issues results in wind curtailment being predicted in much larger quantities than had previously been reported. This would have a large effect on wind energy companies, because theirs is already a very low profit-margin industry. Results from this work show that the relaxation of system operation constraints is crucial to the economic running of the electricity system, with large improvements shown in the reduction of wind curtailment and system generation costs. There are clear benefits in having a proportion of the wind installed offshore in Ireland, which would help to reduce the variability of wind energy generation on the system and therefore reduce wind curtailment. With envisaged future improvements in day-ahead wind forecasting from 8% to 4% mean absolute error, there are potential reductions in wind curtailment, system costs and open-cycle gas turbine usage.
This work illustrates the consequences of assumptions in the areas of system operation constraints, onshore/offshore installed wind capacities and accuracy in wind forecasting, in order to better inform the true costs associated with running Ireland's changing electricity system as it continues to decarbonise into the near future. Through the use of Ireland as a case study, it also illustrates effects that will become ever more prevalent in other synchronous systems as they pursue a path of increasing renewable energy generation.
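Purely as a caricature of why curtailment arises in such models (not a reproduction of the PLEXOS unit-commitment runs), the sketch below dispatches one hour of a toy system: wind can only serve demand net of an assumed must-run requirement, so surplus wind is spilled and any remaining demand is met from a merit-order stack. All figures are invented for the example.

```python
# Caricature of one hour of merit-order dispatch with wind curtailment.
# All capacities, costs and the must-run constraint are invented numbers,
# not values from the all-island PLEXOS models described in the abstract.
demand_mw = 3400.0
wind_available_mw = 2300.0
must_run_mw = 1400.0            # e.g. units kept online for system security (assumed)

# Remaining thermal plant ordered by short-run marginal cost (EUR/MWh).
merit_order = [("ccgt_1", 900.0, 55.0), ("ccgt_2", 900.0, 60.0),
               ("coal_1", 800.0, 70.0), ("ocgt_1", 400.0, 110.0)]

# Wind can only serve demand left over after must-run generation, so with these
# numbers 300 MW of available wind has to be curtailed.
wind_used = min(wind_available_mw, demand_mw - must_run_mw)
curtailed = wind_available_mw - wind_used

residual = demand_mw - must_run_mw - wind_used
dispatch, cost = [], 0.0
for name, capacity, price in merit_order:
    output = min(capacity, max(residual, 0.0))
    residual -= output
    if output > 0:
        dispatch.append((name, output))
        cost += output * price

print(f"must-run {must_run_mw:.0f} MW, wind used {wind_used:.0f} MW, curtailed {curtailed:.0f} MW")
print("additional thermal dispatch:", dispatch, f"thermal cost EUR {cost:,.0f}")
```

Relaxing the assumed must-run requirement in this toy immediately reduces curtailment, which mirrors the abstract's finding that relaxing system operation constraints lowers both curtailment and generation costs.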
Abstract:
The focus on how one is behaving, feeling, and thinking provides a powerful source of self-knowledge. How is this self-knowledge utilized in the dynamic reconstruction of autobiographical memories? How, in turn, might autobiographical memories support identity and the self-system? I address these questions through a critical review of the literature on autobiographical memory and the self-system, with a special focus on the self-concept, self-knowledge, and identity. I then outline the methods and results of a prospective longitudinal study examining the effects of an identity change on memory for events related to that identity. Participant-rated memory characteristics, computer-generated ratings of narrative content and structure, and neutral-observer ratings of coherence were examined for changes over time related to an identity change, as well as for their ability to predict an identity change. The conclusions from this study are threefold: (1) when the rated centrality of an event decreases, the reported instances of retrieval, the phenomenology associated with retrieval, and the number of words used to describe the memory also decrease; (2) memory accuracy (here, estimating past behaviors) was not influenced by an identity change; and (3) remembering is not unidirectional: characteristics of identity-relevant memories and the life story predict, and may help support, persistence with an identity (here, an academic trajectory).
Abstract:
This paper investigates the static and dynamic characteristics of a semi-elliptical rocking disk on which a pendulum is pinned. The response of this coupled system is analyzed analytically and numerically when a vertical harmonic excitation is applied to the bottom of the rocking disk. Lagrange's equations are used to derive the equations of motion of the disk-pendulum coupled system. The second derivative test for the system's potential energy shows how the location of the pendulum's pivot point affects the number and stability of equilibria, and how changing that location produces different bifurcation diagrams for different geometries of the rocking disk. For both the vertically excited and the unforced cases, the coupled system easily exhibits chaos, but properly chosen parameters can still help the system reach and maintain a steady state. For the steady state of the vertically excited rocking disk without a pendulum, variation of the excitation's amplitude and frequency results in hysteresis in the amplitude of the response. When a pendulum is pinned on the rocking disk, three major categories of steady states are presented numerically.
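For reference, the generic form of Lagrange's equations behind the derivation mentioned above is sketched below; treating the disk's rocking angle θ and the pendulum angle φ as the generalised coordinates is an assumption made for illustration, not necessarily the paper's exact formulation.

```latex
% Generic Euler-Lagrange form for two generalised coordinates
% (rocking angle theta, pendulum angle phi) -- illustrative only.
\frac{d}{dt}\!\left(\frac{\partial \mathcal{L}}{\partial \dot{\theta}}\right)
  - \frac{\partial \mathcal{L}}{\partial \theta} = Q_{\theta}, \qquad
\frac{d}{dt}\!\left(\frac{\partial \mathcal{L}}{\partial \dot{\varphi}}\right)
  - \frac{\partial \mathcal{L}}{\partial \varphi} = Q_{\varphi},
\qquad \mathcal{L} = T - V .
```

Here T and V are the kinetic and potential energies of the disk-pendulum system, and Q_theta, Q_phi collect the generalised forces, including those arising from the vertical harmonic base excitation.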
Abstract:
Executive summary
Digital systems have transformed, and will continue to transform, our world. Supportive government policy, a strong research base and a history of industrial success make the UK particularly well-placed to realise the benefits of the emerging digital society. These benefits have already been substantial, but they remain at risk. Protecting the benefits and minimising the risks requires reliable and robust cybersecurity, underpinned by a strong research and translation system.
Trust is essential for growing and maintaining participation in the digital society. Organisations earn trust by acting in a trustworthy manner: building systems that are reliable and secure, treating people, their privacy and their data with respect, and providing credible and comprehensible information to help people understand how secure they are.
Resilience, the ability to function, adapt, grow, learn and transform under stress or in the face of shocks, will help organisations deliver systems that are reliable and secure. Resilient organisations can better protect their customers, provide more useful products and services, and earn people’s trust.
Research and innovation in industry and academia will continue to make important contributions to creating this resilient and trusted digital environment. Research can illuminate how best to build, assess and improve digital systems, integrating insights from different disciplines, sectors and regions around the globe. It can also generate advances to help cybersecurity keep up with the continued evolution of cyber risks.
Translation of innovative ideas and approaches from research will create a strong supply of reliable, proven solutions to difficult-to-predict cybersecurity risks. This is best achieved by maximising the diversity and number of innovations that see the light of day as products.
Policy, practice and research will all need to adapt. The recommendations made in this report seek to set up a trustworthy, self-improving and resilient digital environment that can thrive in the face of unanticipated threats, and earn the trust people place in it.
Innovation and research will be particularly important to the UK’s economy as it establishes a new relationship with the EU. Cybersecurity delivers important economic benefits, both by underpinning the digital foundations of UK business and trade and also through innovation that feeds directly into growth. The findings of this report will be relevant regardless of how the UK’s relationship to the EU changes.
Headline recommendations
● Trust: Governments must commit to preserving the robustness of encryption, including end-to-end encryption, and promoting its widespread use. Encryption is a foundational security technology that is needed to build user trust, improve security standards and fully realise the benefits of digital systems.
● Resilience: Government should commission an independent review of the UK’s future cybersecurity needs, focused on the institutional structures needed to support resilient and trustworthy digital systems in the medium and longer term. A self-improving, resilient digital environment will need to be guided and governed by institutions that are transparent, expert and have a clear and widely-understood remit.
● Research: A step change in cybersecurity research and practice should be pursued; it will require a new approach to research, focused on identifying ambitious high-level goals and enabling excellent researchers to pursue those ambitions. This would build on the UK's existing strengths in many aspects of cybersecurity research and ultimately help build a resilient and trusted digital sector based on excellent research and world-class expertise.
● Translation: The UK should promote a free and unencumbered flow of cybersecurity ideas from research to practical use and support approaches that have public benefits beyond their short term financial return. The unanticipated nature of future cyber threats means that a diverse set of cybersecurity ideas and approaches will be needed to build resilience and adaptivity. Many of the most valuable ideas will have broad security benefits for the public, beyond any direct financial returns.
Abstract:
The advancements in medical science and technology have proved to be a boon to mankind. At the same time, they have raised numerous challenges before the legal systems of the world. One such advancement is that of assisted human reproductive technologies, and particularly surrogacy, which have given a new meaning to the concept of procreation. These technologies have made it possible for individuals to beget a genetically related child with the help of a third party and without sexual intercourse. Among all the assisted human reproductive technologies, the practice of surrogacy, in which a woman agrees to have her body used to undergo a pregnancy and give birth to a baby for another, has raised various legal and human rights controversies and prompted diverse legal responses all over the world. India has become a particularly popular destination for individuals who wish to beget a child through surrogacy, and hence it is imperative for the Indian government to address the challenges posed by surrogacy. This study is an attempt to examine the need for and importance of surrogacy practices and the conflicting legal and human rights issues raised by surrogacy in contemporary times. It also examines the adequacy of the existing legal framework in India and attempts to provide pragmatic solutions for regulating surrogacy and protecting the interests of the various stakeholders involved.
Abstract:
I explore transformative social innovation in agriculture through a particular case of agroecological innovation, the System of Rice Intensification (SRI) in India. Insights from social innovation theory that emphasize the roles of social movements and the reengagement of vulnerable populations in societal transformation can help reinstate the missing “social” dimension in current discourses on innovation in India. India has a rich and vibrant tradition of social innovation wherein vulnerable communities have engaged in collective experimentation. This is often missed in official or formal accounts. Social innovations such as SRI can help recreate these possibilities for change from outside the mainstream due to newer opportunities that networks present in the twenty-first century. I show how local and international networks led by Civil Society Organizations have reinterpreted and reconstructed game-changing macrotrends in agriculture. This has enabled the articulation and translation of an alternative paradigm for sustainable transitions within agriculture from outside formal research channels. These social innovations, however, encounter stiff opposition from established actors in agricultural research systems. Newer heterogeneous networks, as witnessed in SRI, provide opportunities for researchers within hierarchical research systems to explore, experiment, and create newer norms of engagement with Civil Society Organizations and farmers. I emphasize valuing and embedding diversity of practices and institutions at an early stage to enable systems to be more resilient and adaptable in sustainable transitions.
Abstract:
At present, in large precast concrete enterprises, the management of precast concrete components has been chaotic. Most enterprises rely on labor-intensive manual input, which is time-consuming, laborious, and error-prone. Somewhat better-equipped enterprises manage components through barcodes or manually printed serial numbers; however, this is also labor-intensive and is limited by the external environment, which can blur or even erase the serial numbers, creating serious problems for production traceability and quality accountability. Therefore, to achieve rapid development and meet the needs of the times, automating production management has become a pressing problem for a modern enterprise. In order to address this problem of inefficient production and poor product traceability, this thesis introduces RFID technology into the production of PHC tubular piles. By designing a production management system for precast concrete components, the enterprise gains control of the entire production process and realizes the informatization of its production management. RFID technology is already widely used in many fields such as access control, charge management, and logistics. The system adopts passive RFID tags, which are waterproof, shockproof, and interference-resistant, making them suitable for the actual working environment. Each tag is bound to the precast component's steel cage (the structure of the PHC tubular pile before concrete placement), so each PHC tubular pile has a unique ID number. Then, following the production procedure, the precast component goes through a series of steps: placing the steel cage into the mold, mold clamping, pouring concrete (feeding), stretching, centrifugalizing, maintenance, mold removal, and welding of splices. At every step of the procedure, the information of the precast component can be read through an RFID reader. Using a portable smart device connected to the database, the user can conveniently check, query, and manage the production information. The system can also trace the production parameters and the person in charge, realizing full traceability of the information. This system can overcome the disadvantages common among precast component manufacturers, such as inefficiency, error-proneness, time consumption, labor intensity, and low information relevance. It can help to improve production management efficiency and can produce good economic and social benefits, so it has real practical value.
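As a minimal sketch of the record-keeping idea behind such a system (an illustrative assumption, not the thesis' implementation), the snippet below keys each pile to the ID read from its RFID tag and appends a timestamped event for every production step, which is what later makes each pile traceable; the table layout, step names and use of SQLite are invented for the example.

```python
# Minimal illustration of RFID-keyed production tracking for PHC piles.
# Table layout, step names and the use of SQLite are assumptions for the example.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE production_event (
    tag_id    TEXT NOT NULL,      -- unique ID read from the passive RFID tag
    step      TEXT NOT NULL,      -- e.g. mold clamping, pouring, centrifugalizing
    operator  TEXT NOT NULL,      -- person in charge, for accountability
    recorded  TEXT NOT NULL)""")

def record_step(tag_id, step, operator):
    """Store one reading taken when the tag passes an RFID reader."""
    conn.execute("INSERT INTO production_event VALUES (?, ?, ?, ?)",
                 (tag_id, step, operator, datetime.now(timezone.utc).isoformat()))

record_step("E200-3412-0001", "steel cage into mold", "operator 7")
record_step("E200-3412-0001", "concrete pouring", "operator 7")
record_step("E200-3412-0001", "centrifugalizing", "operator 3")

# Traceability query: full production history for one pile.
for row in conn.execute("SELECT step, operator, recorded FROM production_event "
                        "WHERE tag_id = ? ORDER BY recorded", ("E200-3412-0001",)):
    print(row)
```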
Abstract:
Although tyrosine kinase inhibitors (TKIs) such as imatinib have transformed chronic myelogenous leukemia (CML) into a chronic condition, these therapies are not curative in the majority of cases. Most patients must continue TKI therapy indefinitely, a requirement that is both expensive and detrimental to a patient's quality of life. While TKIs are known to reduce leukemic cells' proliferative capacity and to induce apoptosis, their effects on leukemic stem cells, the immune system, and the microenvironment are not fully understood. A more complete understanding of their global therapeutic effects would help us to identify any limitations of TKI monotherapy and to address these issues through novel combination therapies. Mathematical models are a complementary tool to experimental and clinical data that can provide valuable insights into the underlying mechanisms of TKI therapy. Previous modeling efforts have focused on CML patients who show biphasic and triphasic exponential declines in BCR-ABL ratio during therapy. However, our patient data indicate that many patients treated with TKIs show fluctuations in BCR-ABL ratio yet are able to achieve durable remissions. To investigate these fluctuations, we construct a mathematical model that integrates CML with a patient's autologous immune response to the disease. In our model, we define an immune window, which is an intermediate range of leukemic concentrations that lead to an effective immune response against CML. While small leukemic concentrations provide insufficient stimulus, large leukemic concentrations actively suppress a patient's immune system, thus limiting its ability to respond. Our patient data and modeling results suggest that at diagnosis, a patient's high leukemic concentration is able to suppress their immune system. TKI therapy drives the leukemic population into the immune window, allowing the patient's immune cells to expand and eventually mount an efficient response against the residual CML. This response drives the leukemic population below the immune window, causing the immune population to contract and allowing the leukemia to partially recover. The leukemia eventually reenters the immune window, thus stimulating a sequence of weaker immune responses as the two populations approach equilibrium. We hypothesize that a patient's autologous immune response to CML may explain the fluctuations in BCR-ABL ratio that are regularly seen during TKI therapy. These fluctuations may serve as a signature of a patient's individual immune response to CML. By applying our modeling framework to patient data, we are able to construct an immune profile that can then be used to propose patient-specific combination therapies aimed at further reducing a patient's leukemic burden. Our characterization of a patient's anti-leukemia immune response may be especially valuable in the study of drug resistance, treatment cessation, and combination therapy.
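The abstract does not state the model's equations; the following is only a generic caricature of a leukemia-immune interaction with an "immune window", written to illustrate the mechanism described, not the authors' model. All symbols and the functional forms are assumptions.

```latex
% Illustrative caricature of a leukemia-immune interaction with an
% "immune window"; not the model constructed in the dissertation.
\frac{dL}{dt} = r L\left(1-\frac{L}{K}\right) - \mu_{\mathrm{TKI}}\, L - k\, L\, I, \qquad
\frac{dI}{dt} = s - d\, I + \frac{p\, L}{\,1 + (L/L_{\mathrm{supp}})^{2}}\, I .
```

Here L is the leukemic load and I an immune effector population: the stimulation term grows with L at low concentrations but is damped once L exceeds the suppression scale L_supp, so the immune response is strongest over an intermediate "window" of leukemic concentrations, qualitatively matching the dynamics the abstract describes under TKI therapy.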
Abstract:
Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In those earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of that research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that know only the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an "aspect" in the aspect-oriented sense. Objects are not required to extend any superclass, implement an interface, or carry a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment which supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version for it at the database metadata layer.
Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism is extended, thereby keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at both the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out in order to validate the robustness of the prototype and the meta-model. For these tests, we used a small-size OO7 database due to its data model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. In order to test our approach in a real-world scenario, we developed a proof-of-concept application. This application was originally developed without any persistence mechanisms. Using our framework and minor changes to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience using our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and a concurrency model limited to a single Java Virtual Machine are the major limitations found in the framework.
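As a language-neutral toy of the class-versioning and instance-adaptation idea (not the thesis' aspect-oriented Java framework), the sketch below registers converters between two schema versions so that records written under the old class structure remain readable by newer applications and vice versa; the field names, version tags and converters are invented for illustration.

```python
# Toy illustration of class versioning with bidirectional instance adaptation.
# Version tags, fields and converters are invented for the example; the thesis'
# framework does this transparently via aspects in an orthogonal persistent system.
CURRENT_VERSION = 2

def upgrade_v1_to_v2(record):
    """v1 stored a single 'name'; v2 splits it into first/last name."""
    first, _, last = record["name"].partition(" ")
    return {"version": 2, "first_name": first, "last_name": last}

def downgrade_v2_to_v1(record):
    """Older applications still expect the v1 shape."""
    return {"version": 1, "name": f"{record['first_name']} {record['last_name']}".strip()}

def adapt(record, target_version):
    """Lazily adapt an instance to the class version the application knows."""
    if record["version"] == 1 and target_version == 2:
        return upgrade_v1_to_v2(record)
    if record["version"] == 2 and target_version == 1:
        return downgrade_v2_to_v1(record)
    return record

stored_v1 = {"version": 1, "name": "Ada Lovelace"}   # written by an old application
print(adapt(stored_v1, CURRENT_VERSION))              # a new application reads it
print(adapt(adapt(stored_v1, 2), 1))                  # and the old shape is recoverable
```

The point of the toy is the bidirectional compatibility described in the abstract: either application version can keep working while the stored structure evolves, with adaptation performed lazily at access time.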