970 results for Practical problems


Relevance:

30.00%

Publisher:

Abstract:

[EN] The primary objective of this research was to model different types of problems using linear programming and to apply different methods to find adequate solutions to them. To achieve this objective, a linear programming problem and its dual were studied and compared. To that end, linear programming techniques were presented and an introduction to duality theory was given, analysing the dual problem and the duality theorems. A general economic interpretation was then given, and optimal dual variables such as shadow prices were studied through the following practical case: an aesthetic surgery hospital wanted to organise its monthly waiting list of four types of surgery so as to maximise its daily income. To solve this case, we modelled the linear programming problem using the relationships between the primal problem and its dual. We then solved the dual problem graphically and, applying the duality theorems, recovered the optimal solution of the original problem from its dual. We also studied how complementary slackness can help to solve linear programming problems. The Excel Solver add-in and the WinQSB program were used to facilitate the solution.
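
As a hedged illustration of the primal-dual machinery described above, the sketch below solves a small resource-allocation LP of the same flavour and reads the shadow prices off the dual multipliers. It assumes SciPy's HiGHS backend (SciPy 1.7+); the surgery incomes and resource limits are invented placeholders, not figures from the study.

```python
# Minimal numerical sketch of LP duality and shadow prices (invented data).
import numpy as np
from scipy.optimize import linprog

c = np.array([500.0, 1200.0, 800.0, 1500.0])  # income per surgery type (assumed)
A = np.array([[1.0, 3.0, 2.0, 4.0],           # theatre hours per surgery
              [2.0, 1.0, 1.0, 3.0]])          # recovery-bed days per surgery
b = np.array([40.0, 30.0])                    # daily availability of each resource

# linprog minimises, so negate c to maximise daily income.
res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0, None)] * 4, method="highs")

shadow_prices = -res.ineqlin.marginals        # dual variables y*; sign flipped
print("optimal plan x*:", res.x)              # because we minimised -c'x
print("max daily income:", -res.fun)
print("shadow prices y*:", shadow_prices)

# Complementary slackness: y_i * (b - A x*)_i = 0 for every resource i,
# so any resource with slack capacity has a zero shadow price.
slack = b - A @ res.x
assert np.allclose(shadow_prices * slack, 0.0, atol=1e-6)
```

In this toy instance both resources bind at the optimum, so both shadow prices are strictly positive, which is exactly what complementary slackness predicts.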

Relevance:

30.00%

Publisher:

Abstract:

Purpose: Advocates and critics of target-setting in the workplace seem unable to reach beyond their own well-entrenched battle lines. While the advocates of goal-directed behaviour point to what they see as demonstrable advantages, the critics of target-setting highlight equally demonstrable disadvantages. Indeed, the academic literature on this topic is currently mired in controversy, with neither side seemingly capable of envisaging a better way forward. This paper seeks to break the current deadlock and move thinking forward in this important aspect of performance measurement and management by outlining a new, more fruitful approach, based on both theory and practical experience. Design/methodology/approach: The topic was approached in three phases: assembling and reading key academic and other literature on the subject of target-setting and goal-directed behaviour, with a view to understanding, in depth, the arguments advanced by the advocates and critics of target-setting; comparing these published arguments with the authors' own experiential findings, in order to bring the essence of the disagreement into much sharper focus; and then bringing academic and practical experience to bear to identify the essential elements of a new, more fruitful approach offering all the benefits of goal-directed behaviour with none of the typical disadvantages of target-setting. Findings: The research led to three key findings: the advocates of goal-directed behaviour and the critics of target-setting each make valid points, as seen from their own current perspectives; the likelihood of these two communities, left to themselves, ever reaching a new synthesis seems vanishingly small (with leading thinkers in the goal-directed behaviour community already acknowledging this); and the three authors discovered that their unusual combination of academic study and practical experience enabled them to see things differently, hence their wish to share their new thinking more widely. Research limitations/implications: The authors fully accept that their paper is informed by extensive practical experience and that, as yet, there have been no opportunities to test their findings, conclusions and recommendations through rigorous academic research. However, they hope that the paper will move thinking forward in this arena, thereby informing future academic research. Practical implications: The authors hope that the practical implications of the paper will be significant, as it outlines a novel way for organisations to capture the benefits of goal-directed behaviour with none of the disadvantages typically associated with target-setting. Social implications: Given that increased efficiency and effectiveness in the management of organisations would be good for society, the authors think the paper has interesting social implications. Originality/value: Leading thinkers in the field of goal-directed behaviour, such as Locke and Latham, and leading critics of target-setting, such as Ordóñez et al., continue to argue with one another - much like, at the turn of the nineteenth century, proponents of the "wave theory of light" and proponents of the "particle theory of light" were similarly at loggerheads. Just as that furious scientific debate was ultimately resolved by Taylor's experiment, which showed that light could behave as both a particle and a wave at the same time, the authors believe that this paper demonstrates that goal-directed behaviour and target-setting can successfully co-exist.
© Emerald Group Publishing Limited.

Relevance:

30.00%

Publisher:

Abstract:

This paper discusses innovations in curriculum development in the Department of Engineering at the University of Cambridge as a participant in the Teaching for Learning Network (TFLN), a teaching and learning development initiative funded by the Cambridge-MIT Institute, a pedagogic collaboration and brokerage network. A year-long research and development project investigated the practical experiences through which students traditionally explore engineering disciplines, apply and extend the knowledge gained in lectures and other settings, and begin to develop their professional expertise. The project evaluated current practice in these sessions and developed an evidence base to identify requirements for new activities, student support and staff development. The evidence collected included a novel student 'practice-value' survey highlighting effective practice and areas of concern, classroom observation of practicals, semi-structured interviews with staff, a student focus group and informal discussions with staff. Analysis of the data identified three potentially 'high-leverage' strategies for improvement: development of a more integrated teaching framework within which practical work could be contextualised in relation to other learning; a more transparent and integrated conceptual framework in which theory and practice were more closely linked; and development of practical work more reflective of the complex problems facing professional engineers. This paper sets out key elements of the evidence collected and the changes informed by this evidence and analysis, which led to the creation of a suite of integrated practical sessions carefully linked to other course elements and reinforcing central concepts in engineering, accompanied by a training and support programme for teaching staff.

Relevance:

30.00%

Publisher:

Abstract:

Underground space is commonly exploited both to maximise the utility of costly land in urban development and to reduce the vertical load acting on the ground. Deep excavations are carried out to construct various types of underground infrastructure such as deep basements, subways and service tunnels. Although the soil response to excavation is known in principle, designers lack practical calculation methods for predicting both short- and long-term ground movements. As the understanding of how soil behaves around an excavation in both the short and long term is insufficient and usually empirical, the judgements used in design are also empirical and serious accidents are common. To gain a better understanding of the mechanisms involved in soil excavation, a new apparatus for the centrifuge model testing of deep excavations in soft clay has been developed. This apparatus simulates the field construction sequence of a multi-propped retaining wall during centrifuge flight. A comparison is given between the new technique and the previously used method of draining heavy fluid to simulate excavation in a centrifuge model. The new system has the benefit of giving the correct initial ground conditions before excavation and the proper earth pressure distribution on the retaining structures during excavation, whereas heavy fluid only gives an earth pressure coefficient of unity and is unable to capture any changes in the earth pressure coefficient of soil inside the zone of excavation, for example owing to wall movements. Settlements of the ground surface, changes in pore water pressure, variations in earth pressure, prop forces and bending moments in the retaining wall are all monitored during excavation. Furthermore, digital images taken of a cross-section during the test are analysed using particle image velocimetry to illustrate ground deformation and soil–structure interaction mechanisms. The significance of these observations is discussed.

Relevance:

30.00%

Publisher:

Abstract:

In the modern engineering design cycle, the use of computational tools has become a necessity. The complexity of the engineering systems under consideration increases dramatically as demand for advanced and innovative design concepts and engineering products expands. At the same time, advances in available technology, in terms of computational resources and power as well as the intelligence of design software, accommodate these demands and make computational design a viable approach to real-world engineering problems. This class of design optimisation problems is by nature multi-disciplinary. In the present work we establish enhanced optimisation capabilities within the Nimrod/O tool for massively distributed execution of computational tasks on cluster and computational grid resources, and develop the potential to combine and benefit from all available technological advancements, both software and hardware. We develop the interface between an in-house Free Form Deformation geometry-management code, the 2D airfoil aerodynamic efficiency evaluation tool XFoil, and the well-established multi-objective heuristic optimisation algorithm NSGA-II. A simple airfoil design problem has been defined both to demonstrate the functionality of the design system and to provide a framework for future development and testing with other state-of-the-art optimisation algorithms, such as the Multi-Objective Genetic Algorithm (MOGA) and Multi-Objective Tabu Search (MOTS) techniques. Ultimately, heavily computationally expensive industrial design cases that could not be investigated before can be realised within the presented framework. ©2012 AIAA.
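
The following minimal sketch shows the optimisation layer of such a loop using the pymoo library's NSGA-II implementation. This is an assumption for illustration only (the paper drives NSGA-II through Nimrod/O with XFoil evaluations); the analytic lift/drag proxies below are invented stand-ins for an XFoil call on an FFD-parametrised geometry.

```python
# Minimal NSGA-II sketch with pymoo; the objectives are toy airfoil proxies.
import numpy as np
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.core.problem import ElementwiseProblem
from pymoo.optimize import minimize

class AirfoilToy(ElementwiseProblem):
    def __init__(self):
        # three shape parameters in [0, 1], two objectives to minimise
        super().__init__(n_var=3, n_obj=2, xl=np.zeros(3), xu=np.ones(3))

    def _evaluate(self, x, out, *args, **kwargs):
        drag = 0.01 + x[0] ** 2 + 0.5 * x[2]          # drag-like proxy
        neg_lift = -(x[0] + 0.8 * x[1] - 0.3 * x[2])  # maximise lift-like proxy
        out["F"] = [drag, neg_lift]                   # an XFoil call would go here

res = minimize(AirfoilToy(), NSGA2(pop_size=40), ("n_gen", 60),
               seed=1, verbose=False)
print(f"{len(res.F)} non-dominated designs; sample objectives:\n", res.F[:3])
```

The trade-off is deliberate: the first parameter raises both the lift proxy and the drag proxy, so NSGA-II returns a Pareto front rather than a single design, mirroring the multi-objective setting of the paper.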

Relevance:

30.00%

Publisher:

Abstract:

In this thesis we study the general problem of reconstructing a function defined on a finite lattice from a set of incomplete, noisy and/or ambiguous observations. The goal of this work is to demonstrate the generality and practical value of a probabilistic (in particular, Bayesian) approach to this problem, particularly in the context of computer vision. In this approach, the prior knowledge about the solution is expressed in the form of a Gibbsian probability distribution on the space of all possible functions, so that the reconstruction task is formulated as an estimation problem. Our main contributions are the following: (1) We introduce the use of specific error criteria for the design of optimal Bayesian estimators for several classes of problems, and propose a general (Monte Carlo) procedure for approximating them. This new approach leads to a substantial improvement over existing schemes, both in the quality of the results (particularly at low signal-to-noise ratios) and in computational efficiency. (2) We apply the Bayesian approach to the solution of several problems, some of which are formulated and solved in these terms for the first time. Specifically, these applications are: the reconstruction of piecewise constant surfaces from sparse and noisy observations; the reconstruction of depth from stereoscopic pairs of images; and the formation of perceptual clusters. (3) For each of these applications, we develop fast, deterministic algorithms that approximate the optimal estimators, and illustrate their performance on both synthetic and real data. (4) We propose a new method, based on the analysis of the residual process, for estimating the parameters of the probabilistic models directly from the noisy observations. This scheme leads to an algorithm with no free parameters for the restoration of piecewise uniform images. (5) We analyse the implementation of the algorithms we develop on non-conventional hardware, such as massively parallel digital machines and analog and hybrid networks.
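
To make contribution (1) concrete, here is a minimal sketch, not the thesis's own algorithms, of a Monte Carlo approximation to a Bayesian estimator: Gibbs sampling under an Ising (Gibbsian) prior with Gaussian observations, whose sample average approximates the posterior-mean (squared-error-optimal) estimator. The lattice size, coupling and noise level are invented for illustration.

```python
# Gibbs-sampling Monte Carlo approximation of E[x | obs] on a binary lattice.
import numpy as np

rng = np.random.default_rng(0)
N, beta, sigma = 32, 0.8, 0.7                 # lattice, coupling, noise std (assumed)

truth = np.ones((N, N)); truth[:, N // 2:] = -1.0   # piecewise-constant scene
obs = truth + sigma * rng.normal(size=(N, N))       # noisy observations

x = np.sign(obs)                              # initial configuration
accum = np.zeros_like(x)
burn, keep = 100, 100
for sweep in range(burn + keep):
    for i in range(N):
        for j in range(N):
            nb = (x[(i - 1) % N, j] + x[(i + 1) % N, j] +
                  x[i, (j - 1) % N] + x[i, (j + 1) % N])
            # conditional log-odds of x_ij = +1 given neighbours and the data
            logit = 2.0 * beta * nb + 2.0 * obs[i, j] / sigma ** 2
            x[i, j] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-logit)) else -1.0
    if sweep >= burn:
        accum += x                            # accumulate post-burn-in samples

posterior_mean = accum / keep                 # Monte Carlo estimate of E[x | obs]
estimate = np.sign(posterior_mean)            # thresholded reconstruction
print("pixel error rate:", np.mean(estimate != truth))
```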

Relevance:

30.00%

Publisher:

Abstract:

Though the motivation for developing Ambient Assisted Living (AAL) systems is incontestable, significant challenges exist in realizing the ambience that is essential to the success of such systems. By definition, an AAL system must be omnipresent, tracking occupant activities in the home and identifying those situations where assistance is needed or would be welcomed. Embedded sensors offer an attractive mechanism for realizing ambience, as their form factor and harnessing of wireless technologies aid their seamless integration into pre-existing environments. However, the heterogeneity of the end-user population, their disparate needs and the differing environments they inhabit all pose particular problems for sensor integration and management.

Relevance:

30.00%

Publisher:

Abstract:

Social network analysts have tried to capture the idea of social role explicitly by proposing a framework that gives precise conditions under which actors play equivalent roles. They term these methods positional analysis techniques. The most general definition is regular equivalence, which captures the idea that equivalent actors are related in a similar way to equivalent alters. Regular equivalence gives rise to a whole class of partitions on a network. Given a network, we have two different computational problems. The first is how to find a particular regular equivalence: an algorithm exists to find the largest regular partition, but there are no efficient algorithms to test whether there is a regular k-partition, that is, a partition into k groups that is regular. In addition, when dealing with real data it is unlikely that any exact regular partition exists. To overcome this problem, relaxations of regular equivalence have been proposed, along with optimisation techniques to find nearly regular partitions. In this paper we review the algorithms that have been developed to find particular regular equivalences and look at some of the recent theoretical results which give insight into the complexity of finding regular partitions.
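
As a minimal sketch of the iterated-refinement idea behind finding the largest regular partition (in the spirit of CATREGE-style algorithms rather than any specific method reviewed here): classes are repeatedly split until every actor in a class sees the same set of classes among its out-neighbours and in-neighbours. The toy network is invented.

```python
# Coarsest regular partition of a directed graph by iterated refinement.
def largest_regular_partition(succ, pred):
    """succ/pred: dict node -> set of out-/in-neighbours (illustrative)."""
    label = {v: 0 for v in succ}                       # start with one class
    while True:
        # signature: own class plus the classes an actor sends to / receives from
        key = {v: (label[v],
                   frozenset(label[u] for u in succ[v]),
                   frozenset(label[u] for u in pred[v])) for v in succ}
        order = {k: i for i, k in enumerate(sorted(set(key.values()), key=repr))}
        new = {v: order[key[v]] for v in succ}
        if new == label:                               # no class split: regular
            return label
        label = new

# Toy "boss -> managers -> workers" hierarchy
succ = {0: {1, 2}, 1: {3, 4}, 2: {5}, 3: set(), 4: set(), 5: set()}
pred = {v: set() for v in succ}
for u, outs in succ.items():
    for v in outs:
        pred[v].add(u)

print(largest_regular_partition(succ, pred))
```

On this toy hierarchy the refinement stabilises at three classes, {0}, {1, 2} and {3, 4, 5}: the boss, the managers and the workers each relate to equivalent alters in the same way, which is exactly the regularity condition.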

Relevance:

30.00%

Publisher:

Abstract:

Second Language Processing examines the problems facing learners in the second language classroom from the theoretical perspectives of Processing Instruction (structured input) and Enhanced Input. These two theories are brought to bear on a variety of processing problems, such as the difficulty of connecting second-language grammatical forms (those encoding tense, mood and noun-adjective agreement) with their meanings. Empirical studies examine a range of languages, including Japanese, Italian and Spanish, through which the authors suggest practical solutions to these processing problems.

Relevance:

30.00%

Publisher:

Abstract:

Cloud computing technology has rapidly evolved over the last decade, offering an alternative way to store and work with large amounts of data. However, data security remains an important issue, particularly when using a public cloud service provider. The emerging field of homomorphic cryptography permits computation on encrypted data, which would let users preserve data privacy in the cloud and so increase the potential market for cloud computing. A significant amount of research on homomorphic cryptography has appeared in the literature over the last few years; yet the performance of existing implementations of encryption schemes remains unsuitable for real-time applications. One way this limitation is being addressed is through the use of graphics processing units (GPUs) and field-programmable gate arrays (FPGAs) to implement homomorphic encryption schemes. This review presents the current state of the art in this promising new area of research and highlights the interesting remaining open problems.
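
As a toy illustration of computing on encrypted data, the sketch below implements textbook Paillier encryption, which is additively homomorphic only and far simpler than the fully homomorphic schemes this review covers; the demonstration primes are absurdly small and completely insecure.

```python
# Toy Paillier: an untrusted host can add ciphertexts without decrypting them.
from math import gcd
import random

p, q = 1009, 1013                       # demo primes (assumed; insecure)
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p - 1, q - 1)
g = n + 1                               # standard simple generator choice

def L(u):                               # Paillier's L function
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)     # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(2, n)
    while gcd(r, n) != 1:               # r must be a unit mod n
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

a, b = 123, 456
c_sum = (encrypt(a) * encrypt(b)) % n2  # addition performed on ciphertexts
assert decrypt(c_sum) == a + b
print("Dec(Enc(123) * Enc(456)) =", decrypt(c_sum))
```

Multiplying ciphertexts adds the underlying plaintexts; fully homomorphic schemes extend this to multiplication as well, which is precisely where the performance problems, and the GPU/FPGA acceleration surveyed here, arise.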

Relevance:

30.00%

Publisher:

Abstract:

This paper responds to demands for greater academic investigation into environmental protection, specifically the practical and structural problems which underpin regulatory compliance in the planning system. It critiques traditional theories of regulation and answers calls for the development of a thematic lens to facilitate scrutiny not only of operational practice but also of the broader institutional regime. An empirical investigation builds upon the construct of really responsive regulation to study planning control, and it becomes apparent not only that significant procedural planning difficulties face regulatory compliance, but also that a much wider raft of issues must be considered if the complex equation is to be solved. The findings demonstrate how theory can be applied to enrich our rudimentary understanding of deep-seated problems and to foster insights into areas of structural importance which are relevant to both planning and the wider regulatory arena.

Relevance:

30.00%

Publisher:

Abstract:

Digital signatures are an important primitive for building secure systems and are used in most real-world security protocols. However, almost all popular signature schemes are either based on the factoring assumption (RSA) or the hardness of the discrete logarithm problem (DSA/ECDSA). In the case of classical cryptanalytic advances or progress on the development of quantum computers, the hardness of these closely related problems might be seriously weakened. A potential alternative approach is the construction of signature schemes based on the hardness of certain lattice problems that are assumed to be intractable by quantum computers. Due to significant research advancements in recent years, lattice-based schemes have now become practical and appear to be a very viable alternative to number-theoretic cryptography. In this article, we focus on recent developments and the current state of the art in lattice-based digital signatures and provide a comprehensive survey discussing signature schemes with respect to practicality. Additionally, we discuss future research areas that are essential for the continued development of lattice-based cryptography.
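
For a flavour of how such schemes work, here is a deliberately insecure toy sketch of the "Fiat-Shamir with aborts" pattern behind Lyubashevsky-style lattice signatures. Every parameter below is an invented illustration, and practical schemes operate over polynomial rings with carefully derived bounds rather than the plain integer matrices used here.

```python
# Toy "Fiat-Shamir with aborts" signature over integer lattices (insecure).
import hashlib
import numpy as np

rng = np.random.default_rng(1)
q, n, m = 8380417, 16, 24            # modulus and lattice dimensions (assumed)
ETA, GAMMA, CMAX = 2, 1 << 17, 30    # secret, mask and challenge bounds
BOUND = GAMMA - 2 * ETA * CMAX       # rejection threshold for z

A = rng.integers(0, q, size=(m, n))            # public random matrix
s = rng.integers(-ETA, ETA + 1, size=n)        # small secret key
t = A @ s % q                                  # public key

def challenge(w, msg):
    """Hash the commitment w and the message to a small scalar challenge."""
    h = hashlib.sha256(w.tobytes() + msg).digest()
    return int.from_bytes(h[:4], "big") % (2 * CMAX + 1) - CMAX

def sign(msg):
    while True:                                # the "aborts" loop
        y = rng.integers(-GAMMA, GAMMA + 1, size=n)    # masking vector
        c = challenge(A @ y % q, msg)
        z = y + c * s
        if np.abs(z).max() <= BOUND:           # rejection keeps z independent of s
            return z, c

def verify(msg, z, c):
    if np.abs(z).max() > BOUND:
        return False
    return c == challenge((A @ z - c * t) % q, msg)   # A z - c t = A y (mod q)

z, c = sign(b"practical problems")
print(verify(b"practical problems", z, c))     # True
```

The rejection step is the crux: a signature is released only when z = y + c·s lands in a range equally likely for every possible secret, so published signatures leak nothing about s.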

Relevance:

30.00%

Publisher:

Abstract:

Symposium Chair: Dr Jennifer McGaughey

Title: Early Warning Systems: problems, pragmatics and potential

Early Warning Systems (EWS) provide a mechanism for staff to recognise, refer and manage deteriorating patients on general hospital wards. Implementation of EWS in practice has required considerable change in the delivery of critical care across hospitals. Drawing on their experience of these changes, the authors will demonstrate the problems and potential of using EWS to improve patient outcomes.

The first paper (Dr Jennifer McGaughey: Early Warning Systems: what works?) reviews the research evidence regarding the factors that support or constrain the implementation of Early Warning Systems (EWS) in practice. These findings explain the processes which impact on the successful achievement of patient outcomes. In order to improve detection and standardise practice, National EWS (NEWS) have been implemented in the United Kingdom. The second paper (Catherine Plowright: The implementation of the National EWS in a District General Hospital) focuses on the process of implementing and auditing a National EWS. This process improvement is essential for future collaborative research and the collection of robust datasets to improve patient safety, as recommended by the Royal College of Physicians (RCP 2012). Successfully implementing NEWS in practice requires strategic planning and staff education. The practical issues of training staff are discussed in the third paper (Collette Laws-Chapman: Simulation as a modality to embed the use of Early Warning Systems), which focuses on using simulation and structured debriefing to enhance learning in the early recognition and management of deteriorating patients. This session emphasises the importance of cognitive and social skills developed alongside practical skills in the simulated setting.

Relevance:

30.00%

Publisher:

Abstract:

Perfect information is seldom available to humans or machines due to the uncertainties inherent in real-world problems. Uncertainties in geographic information systems (GIS) stem from either vague/ambiguous or imprecise/inaccurate/incomplete information, and it is necessary for GIS to provide tools and techniques to manage these uncertainties. There is widespread agreement in the GIS community that although GIS has the potential to support a wide range of spatial data analysis problems, this potential is often hindered by a lack of consistency and uniformity. Uncertainties come in many shapes and forms, and processing uncertain spatial data requires a practical taxonomy to aid decision makers in choosing the most suitable data modeling and analysis method. In this paper, we: (1) review important developments in handling uncertainties when working with spatial data and GIS applications; (2) propose a taxonomy of models for dealing with uncertainties in GIS; and (3) identify current challenges and future research directions in spatial data analysis and GIS for managing uncertainties.