880 results for "Result oriented management"


Relevance:

30.00%

Publisher:

Abstract:

This is a case of a 43-year-old primigravida, primiparous woman who presented to our Department at 36 weeks of gestational age and underwent caesarean section due to preeclampsia. From her history, it was known that her pregnancy was the result of in vitro fertilization (IVF). She also received low molecular weight heparin because of thrombophilia (protein S deficiency). We present this case of postpartum thrombocytosis and discuss the differential diagnosis of this condition through the presentation of its management.

Relevance:

30.00%

Publisher:

Abstract:

In today's big data world, data is being produced in massive volumes, at great velocity, and from a variety of sources such as mobile devices, sensors, a plethora of small devices hooked to the internet (the Internet of Things), social networks, communication networks and many others. Interactive querying and large-scale analytics are increasingly used to derive value from this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments.

In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully. I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime.

In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, which provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics.

Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, and so on. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them into distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
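The neighborhood-centric programming model described above can be illustrated with a small, self-contained sketch. The following Python code (using the networkx library) is only an illustrative analogue of the idea, not NSCALE's actual distributed API: it selects query vertices by a predicate, extracts their k-hop neighborhood subgraphs, and runs a user program (here, a simple local clustering computation) on each subgraph rather than on individual vertices. All function and parameter names are hypothetical.

```python
# Illustrative sketch of neighborhood-centric analysis (not NSCALE's API).
# Requires: networkx (pip install networkx)
import networkx as nx

def neighborhood_centric_run(graph, is_query_vertex, radius, user_program):
    """Extract the k-hop neighborhood of every query vertex and apply a
    user program to each extracted subgraph."""
    results = {}
    for v in graph.nodes:
        if not is_query_vertex(v, graph):
            continue
        subgraph = nx.ego_graph(graph, v, radius=radius)  # k-hop neighborhood
        results[v] = user_program(v, subgraph)
    return results

def local_clustering(center, subgraph):
    """Example user program: clustering coefficient of the query vertex,
    computed with full access to the extracted neighborhood."""
    return nx.clustering(subgraph, center)

if __name__ == "__main__":
    g = nx.karate_club_graph()  # small built-in example graph
    # Hypothetical predicate: analyse only high-degree vertices.
    high_degree = lambda v, graph: graph.degree[v] >= 10
    print(neighborhood_centric_run(g, high_degree, radius=2,
                                   user_program=local_clustering))
```

The point of the sketch is the program structure: the user code receives a whole subgraph per query vertex, instead of being restricted to the state of one vertex per superstep as in vertex-centric frameworks.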

Relevance:

30.00%

Publisher:

Abstract:

Micropapillary serous borderline tumor of the ovary is characterized by a more frequent association with extraovarian, and especially invasive, implants. The aim of this study was to report the clinicopathological findings of a rare case of micropapillary serous borderline tumor of the ovary, since there are fewer than 100 similar cases in the published literature. Additionally, the successful management of the evisceration that complicated the patient's postoperative stay is analyzed. The incidence of this severe complication is estimated at between 0.29% and 2.3%. There are four main causes: suture tearing through the fascia, knot failure, suture failure, and extrusion of abdominal contents between sutures placed too far apart. At least 50% of cases are due to technical error, with a potentially lethal result.

Relevance:

30.00%

Publisher:

Abstract:

The academic activities carried out at the School of Chemistry make it indispensable to develop actions oriented toward the consolidation of a reagent and residue management system, especially in the teaching laboratories. The project "Management of reagents and residues in the teaching laboratories of the School of Chemistry" works under the values of Green Chemistry, which designs products and chemical processes that reduce or eliminate the use and production of dangerous substances, to the benefit of the environment. With a preventive vision, the laboratory practices are reviewed in order to select those with the least environmental impact. Additionally, residues are quantified and management protocols are developed for each practice. The project has several stages: diagnosis; implementation of actions; training of students, teachers and administrative personnel; and evaluation both during and at the end of the process. The article describes methodological aspects of the project's operation, with emphasis on reagent and residue quantification through flow diagrams.
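As a purely illustrative aid (not part of the project described), residue quantification from a practice's flow diagram can be thought of as a mass balance over each step: residue generated equals reagent inputs minus recovered product. The short Python sketch below encodes this bookkeeping; all step names and masses are invented for illustration.

```python
# Hypothetical mass-balance sketch for quantifying residues per laboratory
# practice; all step names and masses (in grams) are illustrative only.
practice_steps = [
    {"step": "dissolution",     "reagents_in_g": 12.0, "product_out_g": 0.0},
    {"step": "precipitation",   "reagents_in_g": 5.0,  "product_out_g": 8.5},
    {"step": "filtration/wash", "reagents_in_g": 30.0, "product_out_g": 0.0},
]

def residue_per_step(steps):
    """Residue for each step = inputs not recovered as product."""
    return {s["step"]: s["reagents_in_g"] - s["product_out_g"] for s in steps}

def total_residue(steps):
    return sum(residue_per_step(steps).values())

print(residue_per_step(practice_steps))
print(f"Total residue: {total_residue(practice_steps):.1f} g")
```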

Relevance:

30.00%

Publisher:

Abstract:

Lake Albert is one of the largest lakes in Uganda that still supports a multi-species fishery which, as a result of the variable adult sizes of the species, poses management challenges, especially in relation to gear mesh-size enforcement. Prior to the 1980s, the commercial species were 17 large-sized fishes, especially Citharinus citharinus, Distichodus niloticus and Lates spp., that were confined to inshore habitats of the lake and were thus rapidly overfished. Frame and catch assessment surveys conducted in this study revealed a >80% dominance of small-sized fish species (Neobola bredoi and Brycinus nurse) and a 40-60% decrease in the contribution of the large commercial species. The sustainability of the small-sized fish species is uncertain due to seasonal fluctuations and low beach value. With about 150,000 tons of fish recorded from Lake Albert and the Albert Nile, the beach value was estimated at USD 55.3 million. Despite the noted decline in catches of the large-sized fishes, their contribution was more than 50% of the total beach value. Therefore, management measures should couple value addition for the small-sized species with continued effort regulation targeting recovery of the large, previously important commercial species.

Relevance:

30.00%

Publisher:

Abstract:

Physical control of water hyacinth consists of removing the plants from the water by hand or by machines. It is considered highly effective because it involves removing the whole plants from the water. The first attempt at physical control was in 1992, when weed infestation was causing serious problems for the fishing communities on Lake Kyoga. The fishermen had problems accessing the lake as huge masses of mobile weed blocked landing sites. Furthermore, the fishers lost their nets, which were swept away by mobile water hyacinth. As a result, an integrated control strategy involving physical control (manual and mechanical removal) was put in place. Through this method, the fishers were able to open up access routes to fishing grounds, even though weed mats often re-blocked the access routes. In the infested lakes, manual removal offered remedial relief at fish landings and other access sites. Sites of strategic importance, such as the hydro-electric power generation dam, water intake points and docking points, which had large masses of water hyacinth, required heavy machinery, and mechanical harvesters were used at these sites.

Relevance:

30.00%

Publisher:

Abstract:

Well-designed marine protected area (MPA) networks can deliver a range of ecological, economic and social benefits, and so a great deal of research has focused on developing spatial conservation prioritization tools to help identify important areas. However, whilst these software tools are designed to identify MPA networks that both represent biodiversity and minimize impacts on stakeholders, they do not consider complex ecological processes. Thus, it is difficult to determine the impacts that proposed MPAs could have on marine ecosystem health, fisheries and fisheries sustainability. Using the eastern English Channel as a case study, this paper explores an approach to address these issues by identifying a series of MPA networks using the Marxan and Marxan with Zones conservation planning software and linking them with a spatially explicit ecosystem model developed in Ecopath with Ecosim. We then use these to investigate potential trade-offs associated with adopting different MPA management strategies. Limited-take MPAs, which restrict the use of some fishing gears, could have positive benefits for conservation and fisheries in the eastern English Channel, even though they generally receive far less attention in research on MPA network design. Our findings, however, also clearly indicate that no-take MPAs should form an integral component of proposed MPA networks in the eastern English Channel, as they not only result in substantial increases in ecosystem biomass, fisheries catches and the biomass of commercially valuable target species, but are fundamental to maintaining the sustainability of the fisheries. Synthesis and applications. Using the existing software tools Marxan with Zones and Ecopath with Ecosim in combination provides a powerful policy-screening approach. This could help inform marine spatial planning by identifying potential conflicts and by designing new regulations that better balance conservation objectives and stakeholder interests. In addition, it highlights that appropriate combinations of no-take and limited-take marine protected areas might be the most effective when making trade-offs between long-term ecological benefits and short-term political acceptability.
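The policy-screening idea, coupling a spatial prioritization output with an ecosystem-model projection, can be sketched generically. The Python code below does not call Marxan or Ecopath with Ecosim; it simply assumes that each candidate MPA network scenario has already been summarized by a few indicator values (ecosystem biomass, catch, target-species biomass, and area closed to fishing) and ranks the scenarios by a weighted trade-off score. All scenario names, numbers and weights are hypothetical.

```python
# Hypothetical trade-off screening across candidate MPA network scenarios.
# Indicator values would, in practice, come from conservation-planning and
# ecosystem-model runs; the numbers below are invented for illustration.
scenarios = {
    "no_MPAs":      {"biomass": 1.00, "catch": 1.00, "target_biomass": 1.00, "area_closed": 0.00},
    "limited_take": {"biomass": 1.08, "catch": 1.03, "target_biomass": 1.10, "area_closed": 0.25},
    "no_take":      {"biomass": 1.20, "catch": 0.95, "target_biomass": 1.30, "area_closed": 0.25},
    "mixed_zoning": {"biomass": 1.15, "catch": 1.05, "target_biomass": 1.22, "area_closed": 0.25},
}

# Weights encode how much each objective matters to the decision makers;
# closed area is penalized as a proxy for short-term political cost.
weights = {"biomass": 0.3, "catch": 0.3, "target_biomass": 0.3, "area_closed": -0.1}

def score(indicators):
    """Weighted trade-off score; higher is better."""
    return sum(weights[k] * v for k, v in indicators.items())

for name, ind in sorted(scenarios.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name:13s} score = {score(ind):.3f}")
```

Changing the weights is the screening step: it shows how the ranking of zoning options shifts as more or less emphasis is placed on long-term ecological benefit versus short-term acceptability.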

Relevance:

30.00%

Publisher:

Abstract:

Stakeholder engagement is important for the successful management of natural resources, both to make effective decisions and to obtain support. However, in the context of coastal management, questions remain unanswered on how to effectively link decisions made at the catchment level with objectives for marine biodiversity and fisheries productivity. Moreover, there is much uncertainty on how best to elicit community input in a rigorous manner that supports management decisions. A decision support process is described that uses the adaptive management loop as its basis to elicit management objectives, priorities and management options, using two case studies in the Great Barrier Reef, Australia. The approach described is then generalised for international interest. A hierarchical engagement model of local stakeholders, regional and senior managers is used. The result is a semi-quantitative generic elicitation framework that ultimately provides a prioritised list of management options in the context of clearly articulated management objectives, and that has widespread application for coastal communities worldwide. The case studies show that demand for local input and regional management is high, but local influences affect the relative success of both engagement processes and uptake by managers. Differences between case study outcomes highlight the importance of discussing objectives prior to suggesting management actions, and of avoiding or minimising conflicts at the early stages of the process. Strong contributors to success are (a) the provision of local information to the community group, and (b) the early inclusion of senior managers and influencers in the group to ensure the intellectual and time investment is not compromised at the final stages of the process. The project has uncovered a conundrum in the significant gap between the way managers perceive their management actions and outcomes, and the community's perception of the effectiveness (and wisdom) of these same management actions.

Relevance:

30.00%

Publisher:

Abstract:

Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and to the database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation.

Modern object-oriented databases provide features that help programmers deal with object persistence, as well as with related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible.

Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that only know the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work.

To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects do not need to extend any superclass, implement an interface, or carry a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment that supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version of it at the database metadata layer.
Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism is extended, hence keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the transparency of these mechanisms has positive repercussions on programmer productivity, simplifying the entire evolution process at both the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out in order to validate the robustness of the prototype and the meta-model. To perform these tests, we used a small-sized OO7 database, chosen for its data model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available for future performance comparisons with equivalent systems. In order to test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms. Using our framework, and with minor changes applied to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience using our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
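To make the class-versioning and instance-adaptation ideas concrete, here is a deliberately simplified, language-agnostic sketch written in Python rather than in the thesis's Java/AspectJ setting. It keeps several versions of a class structure registered in a small metadata layer and adapts the stored state of an older version to the newest one on load; all names are illustrative and nothing here reflects the prototype's actual API.

```python
# Simplified illustration of class versioning and instance adaptation;
# not the thesis prototype (which is Java/AspectJ based).
from dataclasses import dataclass, asdict

class VersionRegistry:
    """Tiny metadata layer: keeps every version of a class structure and
    the adapters that upgrade stored state between consecutive versions."""
    def __init__(self):
        self.versions = []   # list of (class, upgrade_from_previous or None)

    def register(self, cls, upgrade_from_previous=None):
        self.versions.append((cls, upgrade_from_previous))

    def adapt(self, state, stored_version):
        """Upgrade a stored instance's state dict to the newest version."""
        for _cls, upgrade in self.versions[stored_version + 1:]:
            state = upgrade(state)
        latest_cls = self.versions[-1][0]
        return latest_cls(**state)

registry = VersionRegistry()

@dataclass
class CustomerV0:            # version 0 of the class structure
    name: str

@dataclass
class CustomerV1:            # version 1 adds a field
    name: str
    email: str

registry.register(CustomerV0)
registry.register(CustomerV1,
                  upgrade_from_previous=lambda s: {**s, "email": "unknown"})

# An "old" object persisted under version 0 is adapted on load:
stored = asdict(CustomerV0(name="Ada"))
print(registry.adapt(stored, stored_version=0))
# -> CustomerV1(name='Ada', email='unknown')
```

The framework described in the abstract performs this kind of adaptation transparently, behind an aspect, so neither old nor new application code has to be aware that several class-structure versions coexist in the database.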

Relevance:

30.00%

Publisher:

Abstract:

Lake Albert/Mobutu lies along the Zaire-Uganda border, shared in a 43/57 per cent ratio, in a faulted depression trending south-west to north-east. It is bounded by latitudes 1°00′ N to 2°20′ N and longitudes 30°20′ E to 31°20′ E. It has a width varying from 35 to 45 km (22 to 28 miles), as measured between the scarps at lake level, covers an area of 5,600 km² and has a maximum depth of 48 m. The major inflows are the Semliki (the outflow of Lake Edward), the Muzizi and the Victoria Nile (draining lakes Victoria and Kyoga), while the Albert Nile is the outflow. The physical, chemical and biological productivity parameters are summarized in Table 1. The scarp is steep but not sheer, and there are at least four tracks leading down it to villages on the shore. The scarp is a young one, formed as a result of earth movements of Pleistocene times, and the numerous streams come headlong down its thousand-foot drop, more often than not in falls (Baker, 1954). In some places there appears to be a clean fault; at others there is the appearance of step faulting, although this may be of only a superficial nature. The escarpment is composed of rocks belonging to the pre-Cambrian Basement Complex of the continent, but the floor of the depression is covered with young sedimentary rocks known as the Kaiso beds. In their upper part these beds contain many pebbles, whilst lower down the occurrence of fossiliferous beds is a sufficiently rare phenomenon in the interior plateau of Africa. The Kaiso beds, dated as possibly middle Pleistocene in age, are exposed in various flats on the shore, and they presumably extend under the relatively shallow waters of the lake. A feature of the shore is the development of sand spits and the enclosure of lagoons; these can be observed in various stages of development at Kaiso, Tonya, Kibiro, Buhuka and, above all, at Butiaba. On an inland lake over 1,100 km (700 miles) from the shores of the Indian Ocean one can thus study some of the shoreline phenomena usually associated with the sea coast (Worthington, 1929). In the north, from Butiaba onwards, the flats become wider and form a continuous lowland as the lake shore curves away from the straight edge of the escarpment. At a height of just 610 m (2,000 feet) above sea level, the rift valley floor at Butiaba has a mean annual temperature of 25.6°C (78°F), with virtually no seasonal variation, and a mean daily range of only 6.5°C (13°F) (E. Afr. Met. Dept., 1953). With a mean annual rainfall of not much more than 762 mm (30 inches) and only 92 rain days in a year, again judging from Butiaba, conditions in the rift valley are semi-arid, and the vegetation cover consists of grasses and scattered drought-resistant trees and bushes. Only near the stream courses does the vegetation thicken.

Relevance:

30.00%

Publisher:

Abstract:

Weed management has become increasingly challenging for cotton growers in Australia in the last decade. Glyphosate, the cornerstone of weed management in the industry, is waning in effectiveness as a result of the evolution of resistance in several species. One of these, awnless barnyard grass, is very common in Australian cotton fields and is a prime example of the new difficulties facing growers in choosing effective and affordable management strategies. RIM (Ryegrass Integrated Management) is a computer-based decision support tool developed for the south-western Australian grains industry. It is commonly used there as a tool for grower engagement in weed management thinking and strategy development. We used RIM as the basis for a new tool that can fulfil the same types of functions for subtropical Australian cotton-grains farming systems. The new tool, BYGUM, provides growers with a robust means to evaluate five-year rotations, including testing the economic value of fallows and fallow weed management, winter and summer cropping, cover crops, tillage, different herbicide options, herbicide resistance management, and more. The new model includes several northern-region-specific enhancements: winter and summer fallows, subtropical crop choices, barnyard grass seed bank, competition and ecology parameters, and more freedom in weed control applications. We anticipate that BYGUM will become a key tool for teaching and driving the changes that will be needed to maintain sound weed management in cotton in the near future.
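The kind of seed-bank bookkeeping that underlies such rotation planning tools can be sketched very simply. The Python code below is not BYGUM or RIM; it iterates an annual barnyard-grass seed-bank balance (germination, control, seed set by survivors, natural decay) over a hypothetical five-year rotation, with every parameter value invented for illustration.

```python
# Toy annual weed seed-bank model over a five-year rotation.
# All parameter values are illustrative, not BYGUM/RIM parameters.
GERMINATION = 0.4        # fraction of the seed bank that emerges each year
DECAY = 0.3              # fraction of ungerminated seed lost each year
SEEDS_PER_SURVIVOR = 500

# Hypothetical control efficacy (fraction of emerged plants killed) per year:
rotation = {
    "cotton (glyphosate only)":   0.70,
    "winter fallow (tillage)":    0.95,
    "sorghum (residual herb.)":   0.90,
    "cotton (layered tactics)":   0.98,
    "summer fallow (manual)":     0.99,
}

def run_rotation(seed_bank, plan):
    """Track the seed bank (seeds per square metre) year by year."""
    for year, (phase, efficacy) in enumerate(plan.items(), start=1):
        emerged = GERMINATION * seed_bank
        survivors = emerged * (1.0 - efficacy)
        carryover = (seed_bank - emerged) * (1.0 - DECAY)
        seed_bank = carryover + survivors * SEEDS_PER_SURVIVOR
        print(f"Year {year} ({phase}): seed bank = {seed_bank:,.0f} seeds/m2")
    return seed_bank

run_rotation(seed_bank=1000.0, plan=rotation)
```

Even this toy version shows the behaviour such tools are built to expose: a single year of weak control lets survivors replenish the seed bank and can undo several years of otherwise effective management.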

Relevance:

30.00%

Publisher:

Abstract:

Knowledge is one of the most important assets for surviving in the modern business environment. The effective management of that asset mandates continuous adaptation by organizations, and requires employees to strive to improve the company's work processes. Organizations attempt to coordinate their unique knowledge with traditional means as well as in new and distinct ways, and to transform it into innovative resources better than those of their competitors. As a result, how to manage the knowledge asset has become a critical issue for modern organizations, and knowledge management is considered the most feasible solution. Knowledge management is a multidimensional process that identifies, acquires, develops, distributes, utilizes, and stores knowledge. However, many related studies focus only on fragmented or limited knowledge-management perspectives. In order to make knowledge management more effective, it is important to identify the qualitative and quantitative issues that are the foundation of the challenge of effective knowledge management in organizations.

The main purpose of this study was to integrate the fragmented knowledge management perspectives into a holistic framework, which includes knowledge infrastructure capability (technology, structure, and culture) and knowledge process capability (acquisition, conversion, application, and protection), based on Gold's (2001) study. Additionally, because the effect of incentives, which are widely acknowledged as a prime motivator in facilitating the knowledge management process, was missing from the original framework, this study included incentives in the knowledge management framework. This study also examined the relationship with organizational performance from the standpoint of the Balanced Scorecard, which includes the customer-related, internal business process, learning & growth, and perceptual financial aspects of organizational performance, in the Korean business context. Moreover, this study examined the relationship with objective financial performance by calculating Tobin's q ratio. Lastly, this study compared group differences between larger and smaller organizations, and between manufacturing and non-manufacturing firms, in the study of knowledge management.

Since this study was conducted in Korea, the original instrument was translated into Korean through the back-translation technique. A confirmatory factor analysis (CFA) was used to examine the validity and reliability of the instrument. To identify the relationship between knowledge management capabilities and organizational performance, structural equation modeling (SEM) and multiple regression analysis were conducted. A Student's t test was conducted to examine the mean differences. The results of this study indicated that there is a positive relationship between effective knowledge management and organizational performance. However, no empirical evidence was found to suggest that knowledge management capabilities are linked to objective financial performance, which remains a topic for future review. Additionally, the findings showed that knowledge management is affected by an organization's size, but not by the type of organization. The results of this study are valuable in establishing a valid and reliable survey instrument, in providing strong evidence that knowledge management capabilities are essential to improving organizational performance, and in making important recommendations for future research.
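For reference, Tobin's q is commonly approximated from accounting data. A widely used simplification (not necessarily the exact formula adopted in this study) is:

\[
q \;\approx\; \frac{\text{market value of equity} \;+\; \text{book value of total liabilities}}{\text{book value of total assets}}
\]

A ratio above 1 suggests that the market values the firm's assets, including intangible assets such as knowledge, above their book value, which is why q is often used as an objective, market-based performance measure in studies of this kind.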

Relevance:

30.00%

Publisher:

Abstract:

A series of related research studies over 15 years assessed the effects of prawn trawling on sessile megabenthos in the Great Barrier Reef, to support management for sustainable use in the World Heritage Area. These large-scale studies estimated impacts on benthos (particularly removal rates per trawl pass), monitored subsequent recovery rates, measured the natural dynamics of tagged megabenthos, mapped the regional distribution of seabed habitats and benthic species, and integrated these results in a dynamic modelling framework together with spatio-temporal fishery effort data and simulated management. Typical impact rates were between 5 and 25% per trawl, recovery times ranged from several years to several decades, and most sessile megabenthos were naturally distributed in areas where little or no trawling occurred and so had low exposure to trawling. The model simulated trawl impact and recovery on the mapped species distributions, and estimated the regional-scale cumulative changes due to trawling as a time series of status for megabenthos species. The regional status of these taxa at the time of greatest depletion ranged from ∼77% relative to pre-trawl abundance for the worst-case species, having slow recovery with moderate exposure to trawling, to ∼97% for the least affected taxon. The model also evaluated the expected outcomes for sessile megabenthos in response to major management interventions implemented between 1999 and 2006, including closures, effort reductions, and protected areas. As a result of these interventions, all taxa were predicted to recover (by 2-14% by 2025), with the most affected species having relatively greater recovery. Effort reductions made the biggest positive contributions to benthos status for all taxa, with closures making smaller contributions for some taxa. The results demonstrated that management actions have arrested and reversed previous unsustainable trends for all taxa assessed, and have led to a prawn trawl fishery with improved environmental sustainability.
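A minimal sketch of the depletion-and-recovery bookkeeping behind such a model is given below. This Python code is not the study's model; it simply applies a per-pass removal rate to the benthos in one grid cell and then lets the remaining biomass recover towards its pre-trawl level at a fixed annual rate (a logistic-style recovery). The impact and recovery parameters are placeholders chosen within the ranges quoted in the abstract.

```python
# Toy per-cell trawl depletion / recovery model; parameters are placeholders
# within the ranges mentioned in the abstract, not the study's values.
def simulate_cell(years, trawl_passes_per_year, impact_per_pass=0.15,
                  annual_recovery_rate=0.10, b0=1.0):
    """Relative benthos abundance in one grid cell (1.0 = pre-trawl level)."""
    b = b0
    trajectory = []
    for year in range(years):
        # Depletion: each pass removes a fixed fraction of what is present.
        b *= (1.0 - impact_per_pass) ** trawl_passes_per_year[year]
        # Recovery: logistic-style growth back towards the pre-trawl level.
        b += annual_recovery_rate * b * (1.0 - b / b0)
        trajectory.append(b)
    return trajectory

# Example: five years of trawling followed by five years of closure.
effort = [3, 3, 3, 3, 3, 0, 0, 0, 0, 0]
for year, status in enumerate(simulate_cell(len(effort), effort), start=1):
    print(f"Year {year:2d}: relative abundance = {status:.2f}")
```

Running such a cell-level rule over mapped species distributions and observed effort, and summing across cells, yields the kind of regional status time series the abstract describes.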

Relevance:

30.00%

Publisher:

Abstract:

When multiple third parties (states, coalitions, and international organizations) intervene in the same conflict, do their efforts inform one another? Anecdotal evidence suggests such a possibility, but research to date has not attempted to model this interdependence directly. The current project breaks with that tradition. In particular, it proposes three competing explanations of how previous intervention efforts affect current intervention decisions: a cost model (and a variant on it, a limited commitments model), a learning model, and a random model. After using a series of Markov transition (regime-switching) models to evaluate conflict management behavior within militarized interstate disputes in the 1946-2001 period, this study concludes that third-party intervention efforts do inform one another. More specifically, third parties examine previous efforts and balance their desire to manage conflict with their need to minimize intervention costs (the cost and limited commitments models). As a result, third parties intervene regularly using verbal pleas and mediation, but rely significantly less frequently on legal, administrative, or peace operations strategies. This empirical threshold to the intervention costs that third parties are willing to bear has strong theoretical foundations and holds across different time periods and third-party actors. Furthermore, the analysis indicates that the first third party to intervene in a conflict is most likely to use a strategy designed to help the disputants work toward a resolution of their dispute. After this initial intervention, the level of third-party involvement declines and often devolves into a series of verbal pleas for peace. Such findings cumulatively suggest that the disputants hold the key to effective conflict management. If the disputants adopt and maintain an extreme bargaining position or fail to encourage third parties to accept greater intervention costs, their dispute will receive little more than verbal pleas for negotiations and peace.
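The core of a Markov transition (regime-switching) analysis of this kind is a matrix of probabilities for moving between intervention states. As a hedged illustration only (the states and sequences below are invented, and the study's actual models condition on covariates rather than using raw transition counts), the following Python sketch estimates a transition matrix by maximum likelihood from observed within-dispute sequences of third-party strategies.

```python
# Illustrative maximum-likelihood estimation of a Markov transition matrix
# from sequences of third-party intervention strategies. The states and
# example sequences are hypothetical, not the study's data.
from collections import defaultdict

STATES = ["none", "verbal_plea", "mediation", "legal_admin", "peace_ops"]

def estimate_transition_matrix(sequences):
    """Count observed transitions and normalize each row to probabilities."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    matrix = {}
    for s in STATES:
        total = sum(counts[s].values())
        matrix[s] = {t: (counts[s][t] / total if total else 0.0) for t in STATES}
    return matrix

example_disputes = [
    ["none", "verbal_plea", "verbal_plea", "mediation", "verbal_plea"],
    ["none", "mediation", "verbal_plea", "none"],
    ["verbal_plea", "verbal_plea", "mediation", "peace_ops"],
]

for state, row in estimate_transition_matrix(example_disputes).items():
    print(state, {t: round(p, 2) for t, p in row.items()})
```

In the study's framing, the interesting question is which rows of such a matrix change after a prior intervention, and whether the pattern of change matches the cost, learning, or random model.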

Relevance:

30.00%

Publisher:

Abstract:

Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts working with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes down to the actual application of these guidelines considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented. Often they are not very useful for giving answers to speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
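A minimal illustration of the kind of "what-if" experiment described above follows. This Python sketch is not the authors' department-store model; it simulates a single day in which staff agents either take a fixed, centrally imposed break or choose their own break slot around a demand peak, and reports how many customers were served under each policy. All demand figures and behavioural rules are invented for illustration.

```python
# Toy agent-based "what-if" experiment: do self-chosen staff breaks improve
# service? All demand patterns and rules are invented for illustration.
HOURS = range(9, 18)                                          # opening hours
DEMAND = {h: 15 if h in (12, 13) else 5 for h in HOURS}       # lunch peak

class StaffAgent:
    def __init__(self, choose_own_break):
        if choose_own_break:
            # Agent avoids the busiest hours when picking its break.
            self.break_hour = min(HOURS, key=lambda h: DEMAND[h])
        else:
            self.break_hour = 12          # fixed, centrally imposed break

    def capacity(self, hour):
        """Customers this agent can serve in the given hour."""
        return 0 if hour == self.break_hour else 4

def run_day(choose_own_break, n_staff=5):
    staff = [StaffAgent(choose_own_break) for _ in range(n_staff)]
    served = 0
    for hour in HOURS:
        capacity = sum(s.capacity(hour) for s in staff)
        served += min(DEMAND[hour], capacity)   # unmet demand is lost
    return served

print("Fixed breaks:      ", run_day(choose_own_break=False), "customers served")
print("Self-chosen breaks:", run_day(choose_own_break=True), "customers served")
```

Even this toy setup shows the structure of the approach: the macro-level outcome (customers served) emerges from micro-level agent decisions, so changing a management practice is as simple as changing the agents' decision rule and re-running the simulation.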