985 results for "problem complexity"


Relevance: 100.00%

Abstract:

A case study of an aircraft engine manufacturer is used to analyze the effects of management levers on the lead time and design errors generated in an iteration-intensive concurrent engineering process. The levers considered are the amount of design-space exploration iteration, the degree of process concurrency, and the timing of design reviews. Simulation is used to show how the ideal combination of these levers can vary with changes in design problem complexity, which can increase, for instance, when novel technology is incorporated in a design. Results confirm that it is important to consider multiple iteration-influencing factors and their interdependencies to understand concurrent processes, because the factors can interact with confounding effects. The article also demonstrates a new approach to deriving a system dynamics model from a process task network, which could be applied to analyze other concurrent engineering scenarios.
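
To make the levers concrete, here is a toy Monte Carlo sketch of how overlap between two coupled tasks can trade lead time against rework and escaped errors. This is not the article's system dynamics model; every parameter, functional form, and name below is hypothetical.

```python
import random

def simulate_process(concurrency: float, n_iterations: int,
                     complexity: float, n_runs: int = 1000) -> tuple[float, float]:
    """Toy Monte Carlo of two coupled tasks with partial overlap.

    concurrency  - fraction of downstream work started before upstream ends (0..1)
    n_iterations - deliberate design-space exploration iterations
    complexity   - scales the chance that overlapped work must be reworked
    Returns (mean lead time, mean escaped design errors); units are arbitrary.
    """
    lead_times, errors = [], []
    for _ in range(n_runs):
        upstream = 10.0 * n_iterations             # exploration lengthens upstream work
        downstream = 10.0
        start_dn = upstream * (1.0 - concurrency)  # earlier start = more overlap
        rework, escaped = 0.0, 0
        # Downstream work done on stale upstream information may need rework.
        overlap = max(0.0, upstream - start_dn)
        if random.random() < min(1.0, complexity * overlap / upstream):
            rework = downstream * random.uniform(0.2, 0.8)
        # More exploration catches more errors before release.
        if random.random() < complexity / (1.0 + n_iterations):
            escaped = 1
        lead_times.append(max(upstream, start_dn + downstream + rework))
        errors.append(escaped)
    return sum(lead_times) / n_runs, sum(errors) / n_runs

for c in (0.0, 0.5, 1.0):
    lt, err = simulate_process(concurrency=c, n_iterations=2, complexity=0.6)
    print(f"concurrency={c:.1f}: lead time={lt:.1f}, escaped errors={err:.2f}")
```

Even this toy version reproduces the qualitative point: the best lever settings shift as the complexity parameter changes.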

Relevance: 100.00%

Abstract:

CONTEXT: The necessity of specific intervention components for the successful treatment of patients with posttraumatic stress disorder is the subject of controversy. OBJECTIVE: To investigate the complexity of clinical problems as a moderator of the relative effects of specific versus nonspecific psychological interventions. METHODS: We included 18 randomized controlled trials directly comparing specific and nonspecific psychological interventions, and conducted moderator analyses with the complexity of clinical problems as the predictor. RESULTS: Our results confirmed the moderate overall superiority of specific over nonspecific psychological interventions; however, the superiority was small in studies with complex clinical problems and large in studies with noncomplex clinical problems. CONCLUSIONS: For patients with complex clinical problems, our results suggest that certain nonspecific psychological interventions may be offered as an alternative to specific psychological interventions. In contrast, for patients with noncomplex clinical problems, specific psychological interventions are the best treatment option.
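
The moderator analysis described here is, in essence, a meta-regression of study effect sizes on a complexity indicator. A minimal fixed-effect sketch follows; all numbers are fabricated for illustration and do not reproduce the review's data or model.

```python
import numpy as np

# Hypothetical data: standardized effect sizes (specific minus nonspecific),
# their sampling variances, and a 0/1 moderator for complex clinical problems.
d = np.array([0.55, 0.48, 0.60, 0.12, 0.08, 0.15])
v = np.array([0.04, 0.05, 0.03, 0.04, 0.06, 0.05])
complex_problems = np.array([0, 0, 0, 1, 1, 1])

# Fixed-effect meta-regression via weighted least squares:
# d_i = b0 + b1 * moderator_i + e_i, with weights w_i = 1 / v_i.
W = np.diag(1.0 / v)
X = np.column_stack([np.ones_like(d), complex_problems])
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
se = np.sqrt(np.diag(np.linalg.inv(X.T @ W @ X)))

print(f"superiority, noncomplex problems: {beta[0]:.2f} (SE {se[0]:.2f})")
print(f"change for complex problems:      {beta[1]:.2f} (SE {se[1]:.2f})")
```

A negative moderator coefficient corresponds to the reported pattern: the advantage of specific interventions shrinks as clinical problems become more complex.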

Relevance: 70.00%

Abstract:

In post-emergency contexts such as that experienced by the western part of the Democratic Republic of the Congo (DRC), one of the crucial challenges facing rural hospitals is maintaining a stock of essential medicines in the pharmacy. Without these medicines to treat serious illnesses, the impact on the population's health is significant. Hospitals also incur financial losses due to expiry when too many medicines are ordered. Moreover, the costs of transporting both the medicines and the supervisor are very high for isolated hospitals; transport costs alone can exceed the cost of the medicines. Using the province of Bandundu, DRC, as a case study, our research seeks to determine the feasibility (in terms of both problem complexity and potential savings) of a synchronized routing problem for medicine delivery and supervision visits. We propose a formulation of the capacitated vehicle routing problem that handles several new requirements: synchronization of activities, precedence, and two activity frequencies. We implement a "cluster first, route second" heuristic with a geospatial database to solve the problem. We also present a web-based tool for visualizing the solutions on maps. Preliminary results of our study suggest that a synchronized solution could enable rural hospitals to increase the accessibility of medical services to rural populations with only a modest increase over the current transport cost.
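
As an illustration of the general "cluster first, route second" idea only: the sketch below omits the paper's synchronization, precedence, and frequency requirements, and all coordinates are hypothetical.

```python
import math
import random

def cluster_first_route_second(depot, hospitals, n_vehicles, seed=0):
    """Toy heuristic: k-means-style clustering of stops,
    then a nearest-neighbour route within each cluster."""
    random.seed(seed)
    centers = random.sample(hospitals, n_vehicles)
    for _ in range(20):  # Lloyd iterations
        clusters = [[] for _ in centers]
        for h in hospitals:
            i = min(range(n_vehicles), key=lambda i: math.dist(h, centers[i]))
            clusters[i].append(h)
        centers = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    routes = []
    for cluster in clusters:
        route, pos, left = [depot], depot, cluster[:]
        while left:  # nearest-neighbour routing within the cluster
            nxt = min(left, key=lambda h: math.dist(pos, h))
            left.remove(nxt)
            route.append(nxt)
            pos = nxt
        routes.append(route + [depot])
    return routes

hospitals = [(random.random() * 100, random.random() * 100) for _ in range(12)]
for r in cluster_first_route_second((50, 50), hospitals, n_vehicles=3):
    print(r)
```

The appeal of this decomposition in a low-infrastructure setting is that the clustering step can run directly on a geospatial database, leaving only small routing subproblems per vehicle.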

Relevance: 60.00%

Abstract:

This thesis aims at developing a better understanding of unstructured strategic decision-making processes and the conditions for achieving successful decision outcomes. Specifically, it focuses on the processes used to make CRE (Corporate Real Estate) decisions. The starting point for this thesis is that our knowledge of such processes is incomplete. A comprehensive study of the most recent CRE literature, together with Behavioural Organization Theory, provided a research framework for exploring recommended CRE 'best practice' and how organizational variables impact on and shape these practices. To reveal the fundamental differences between CRE decision-making in practice and the prescriptive 'best practice' advocated in the CRE literature, a study of seven Italian management consulting firms was undertaken, addressing the content and process of decisions. This thesis makes its primary contribution by identifying the importance and difficulty of finding the right balance between problem complexity, process richness and cohesion, to ensure a decision-making process that is sufficiently rich and yet quick enough to deliver a prompt outcome. In doing so, this research also provides further empirical evidence for some of the most established theories of decision-making, while reinterpreting their mono-dimensional arguments within a multi-dimensional model of successful decision-making.

Relevance: 60.00%

Abstract:

The author aims at developing a better understanding of unstructured strategic decision-making processes and the conditions for achieving successful decision outcomes. Specifically, he investigates the processes used to make CRE (Corporate Real Estate) decisions. To reveal the fundamental differences between CRE decision-making in practice and the prescriptive 'best practice' advocated in the CRE literature, a study of seven leading Italian management consulting firms is undertaken, addressing the content and process of decisions. This research makes its primary contribution by identifying the importance and difficulty of finding the right balance between problem complexity, process richness and cohesion, to ensure a decision-making process that is sufficiently rich and yet quick enough to deliver a prompt outcome. In doing so, the study also provides further empirical evidence for some of the most established theories of decision-making, while reinterpreting their mono-dimensional arguments within a multi-dimensional model of successful decision-making.

Relevance: 60.00%

Abstract:

Cryptosystems based on the hardness of lattice problems have recently acquired much importance due to their average-case to worst-case equivalence, their conjectured resistance to quantum cryptanalysis, their ease of implementation and increasing practicality, and, lately, their promising potential as a platform for constructing advanced functionalities. In this work, we construct "Fuzzy" Identity Based Encryption from the hardness of the Learning With Errors (LWE) problem. We note that for our parameters, the underlying lattice problems (such as gapSVP or SIVP) are assumed to be hard to approximate within subexponential factors for adversaries running in subexponential time. We give CPA- and CCA-secure variants of our construction, for small and large universes of attributes. All our constructions are secure against selective-identity attacks in the standard model. Our construction is made possible by observing certain special properties that secret sharing schemes need to satisfy in order to be useful for Fuzzy IBE. We also discuss some obstacles towards realizing lattice-based attribute-based encryption (ABE).
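
A key ingredient mentioned above is secret sharing. For illustration only, here is a minimal Shamir k-of-n sharing sketch over a toy prime field; this is the standard textbook construction, not the paper's lattice-based machinery or parameters.

```python
import random

P = 2**127 - 1  # a Mersenne prime; toy field modulus, not the paper's parameters

def share(secret: int, k: int, n: int):
    """Shamir k-of-n sharing: random degree-(k-1) polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0; any k valid shares recover the secret."""
    secret = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return secret

shares = share(secret=42, k=3, n=5)                    # any 3 of 5 shares suffice
print(reconstruct(shares[:3]))                         # 42
print(reconstruct([shares[0], shares[2], shares[4]]))  # 42
```

In Fuzzy IBE the threshold plays the role of attribute overlap: decryption succeeds when an identity matches at least k of the n attributes, which is why the sharing scheme's structure matters for the construction.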

Relevance: 60.00%

Abstract:

We introduce Kamouflage: a new architecture for building theft-resistant password managers. An attacker who steals a laptop or cell phone with a Kamouflage-based password manager is forced to carry out a considerable amount of online work before obtaining any user credentials. We implemented our proposal as a replacement for the built-in Firefox password manager, and provide performance measurements and the results from experiments with large real-world password sets to evaluate the feasibility and effectiveness of our approach. Kamouflage is well suited to become a standard architecture for password managers on mobile devices.
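
A minimal sketch of the core idea: hide the real credential set among decoys so that a thief must test candidates online. The decoy generator below is a naive stand-in of my own; Kamouflage itself derives human-plausible decoys from a model of how users compose passwords.

```python
import random
import secrets

def make_decoys(real_passwords: list[str], n_sets: int) -> tuple[list[list[str]], int]:
    """Hide the real password set among n_sets - 1 decoy sets.
    Decoys here are naive random perturbations, purely for illustration."""
    def perturb(pw: str) -> str:
        # Replace the last two characters with random ones (toy transformation).
        return pw[:-2] + "".join(
            random.choice("abcdefghijklmnopqrstuvwxyz0123456789") for _ in range(2)
        )
    sets = [[perturb(pw) for pw in real_passwords] for _ in range(n_sets - 1)]
    index = secrets.randbelow(n_sets)  # secret position of the real set
    sets.insert(index, list(real_passwords))
    return sets, index

sets, real_index = make_decoys(["hunter42", "s3cretPW"], n_sets=10)
# An attacker who steals the store sees 10 similar-looking candidate sets and
# must test them against the live services, one login attempt at a time.
print(len(sets), "candidate sets; real one is hidden at index", real_index)
```

The security argument rests on the decoys being indistinguishable from the real set offline, which is exactly what pushes the attacker's work online where rate limiting applies.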

Relevance: 60.00%

Abstract:

We introduce the notion of distributed password-based public-key cryptography, where a virtual high-entropy private key is implicitly defined as a concatenation of low-entropy passwords held in separate locations. The users can jointly perform private-key operations by exchanging messages over an arbitrary channel, based on their respective passwords, without ever sharing their passwords or reconstituting the key. Focusing on the case of ElGamal encryption as an example, we start by formally defining ideal functionalities for distributed public-key generation and virtual private-key computation in the UC model. We then construct efficient protocols that securely realize them in either the RO model (for efficiency) or the CRS model (for elegance). We conclude by showing that our distributed protocols generalize to a broad class of “discrete-log”-based public-key cryptosystems, which notably includes identity-based encryption. This opens the door to a powerful extension of IBE with a virtual PKG made of a group of people, each one memorizing a small portion of the master key.
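
The key-splitting idea can be illustrated with plain ElGamal and an additively shared exponent. The sketch below uses insecure toy parameters and a toy password-to-share map, and it deliberately omits the UC-secure protocols that make the real scheme safe; it only shows why partial exponentiations combine into a full decryption without ever reconstituting the key.

```python
import secrets

# Insecure toy parameters: safe prime p = 2q + 1, g generates the order-q subgroup.
p, q, g = 23, 11, 4

def share_from_password(pw: str) -> int:
    """Toy stand-in for a proper password-to-share derivation."""
    return sum(pw.encode()) % q

x1 = share_from_password("correct horse")
x2 = share_from_password("battery staple")
x = (x1 + x2) % q        # virtual private key: never materialized in the protocol
y = pow(g, x, p)         # joint public key, computable as g^x1 * g^x2 mod p

def encrypt(m: int) -> tuple[int, int]:
    """Standard ElGamal encryption of a subgroup element m under y."""
    k = secrets.randbelow(q - 1) + 1
    return pow(g, k, p), (m * pow(y, k, p)) % p

c1, c2 = encrypt(m=pow(g, 7, p))

# Distributed decryption: each party applies only its own share to c1 ...
d1 = pow(c1, x1, p)
d2 = pow(c1, x2, p)
# ... and the partial results combine to y^k, so x itself is never reconstructed.
shared = (d1 * d2) % p
m = (c2 * pow(shared, p - 2, p)) % p  # Fermat inverse of y^k mod p
print(m == pow(g, 7, p))              # True
```

Since c1^x1 * c1^x2 = c1^(x1+x2) = y^k, the parties jointly strip the mask exactly as a single keyholder would, which is the algebraic fact the paper's protocols build on.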

Relevance: 60.00%

Abstract:

We analyse the security of the cryptographic hash function LAKE-256, proposed at FSE 2008 by Aumasson, Meier and Phan. By exploiting the non-injectivity of some of the building primitives of LAKE, we show three different collision and near-collision attacks on the compression function. The first attack uses differences in the chaining values and the block counter and finds collisions with complexity 2^33. The second attack utilizes differences in the chaining values and salt and yields collisions with complexity 2^42. The final attack uses differences only in the chaining values to yield near-collisions with complexity 2^99. All our attacks are independent of the number of rounds in the compression function. We illustrate the first two attacks by showing examples of collisions and near-collisions.

Relevance: 60.00%

Abstract:

Design Science is the process of solving 'wicked problems' through designing, developing, instantiating, and evaluating novel solutions (Hevner, March, Park and Ram, 2004). Wicked problems are described as agent finitude in combination with problem complexity and normative constraint (Farrell and Hooker, 2013). In Information Systems Design Science, establishing that problems are 'wicked' differentiates Design Science research from Solutions Engineering (Winter, 2008) and is a necessary part of demonstrating relevance to Information Systems Design Science research (Hevner, 2007; Iivari, 2007). Problem complexity is characterised by many problem components with nested, dependent and co-dependent relationships interacting through multiple feedback and feed-forward loops. Farrell and Hooker (2013) specifically state that for wicked problems "it will often be impossible to disentangle the consequences of specific actions from those of other co-occurring interactions". This paper discusses the application of an Enterprise Information Architecture modelling technique to disentangle wicked problem complexity for one case. It proposes that such a modelling technique can be applied to other wicked problems, laying the foundations for demonstrating relevance to Design Science Research (DSR), providing solution pathways for artefact development, and helping to substantiate the elements required to produce Design Theory.

Relevance: 60.00%

Abstract:

Preface The 9th Australasian Conference on Information Security and Privacy (ACISP 2004) was held in Sydney, 13–15 July, 2004. The conference was sponsored by the Centre for Advanced Computing – Algorithms and Cryptography (ACAC), Information and Networked Security Systems Research (INSS), Macquarie University and the Australian Computer Society. The aims of the conference are to bring together researchers and practitioners working in areas of information security and privacy from universities, industry and government sectors. The conference program covered a range of aspects including cryptography, cryptanalysis, systems and network security. The program committee accepted 41 papers from 195 submissions. The reviewing process took six weeks and each paper was carefully evaluated by at least three members of the program committee. We appreciate the hard work of the members of the program committee and external referees who gave many hours of their valuable time. Of the accepted papers, there were nine from Korea, six from Australia, five each from Japan and the USA, three each from China and Singapore, two each from Canada and Switzerland, and one each from Belgium, France, Germany, Taiwan, The Netherlands and the UK. All the authors, whether or not their papers were accepted, made valued contributions to the conference. In addition to the contributed papers, Dr Arjen Lenstra gave an invited talk, entitled Likely and Unlikely Progress in Factoring. This year the program committee introduced the Best Student Paper Award. The winner of the prize for the Best Student Paper was Yan-Cheng Chang from Harvard University for his paper Single Database Private Information Retrieval with Logarithmic Communication. We would like to thank all the people involved in organizing this conference. In particular we would like to thank members of the organizing committee for their time and efforts, Andrina Brennan, Vijayakrishnan Pasupathinathan, Hartono Kurnio, Cecily Lenton, and members from ACAC and INSS.

Relevance: 60.00%

Abstract:

The Hamilton-Jacobi-Bellman (HJB) equation is central to stochastic optimal control (SOC) theory, yielding the optimal solution to general problems defined by known dynamics and a specified cost functional. Given the assumption of quadratic cost on the control input, it is well known that the HJB reduces to a particular partial differential equation (PDE). While powerful, this reduction is not commonly used, as the PDE is of second order, is nonlinear, and examples exist where the problem may not have a solution in a classical sense. Furthermore, each state of the system appears as another dimension of the PDE, giving rise to the curse of dimensionality: since the number of degrees of freedom required to solve the optimal control problem grows exponentially with dimension, the problem becomes intractable for all but modest-dimensional systems.

In the last decade researchers have found that under certain, fairly non-restrictive structural assumptions, the HJB may be transformed into a linear PDE, with an interesting analogue in the discretized domain of Markov Decision Processes (MDP). The work presented in this thesis uses the linearity of this particular form of the HJB PDE to push the computational boundaries of stochastic optimal control.
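
The transformation referred to here is, in the standard linearly-solvable control literature (e.g., Kappen; Todorov), a log transform of the value function. The sketch below uses that literature's usual notation and structural assumption rather than quoting the thesis.

```latex
% Dynamics dx = f(x)\,dt + B(x)\,(u\,dt + d\omega), noise covariance \Sigma,
% cost rate q(x) + \tfrac12 u^\top R u.  The HJB and optimal control are
\[
0 \;=\; \min_u \Big[\, q + \tfrac12 u^\top R u
      + (f + Bu)^\top \nabla V
      + \tfrac12 \operatorname{tr}\!\big(B \Sigma B^\top \nabla^2 V\big) \Big],
\qquad u^* = -R^{-1} B^\top \nabla V .
\]
% Under the structural assumption \lambda\, B R^{-1} B^\top = B \Sigma B^\top,
% the substitution V = -\lambda \log \Psi cancels the quadratic term and
% leaves a PDE that is linear in the desirability \Psi:
\[
0 \;=\; -\tfrac{q}{\lambda}\,\Psi + f^\top \nabla \Psi
      + \tfrac12 \operatorname{tr}\!\big(B \Sigma B^\top \nabla^2 \Psi\big).
\]
```

Linearity in Ψ is what the MDP analogue and the computational techniques described below exploit.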

This is done by crafting together previously disjoint lines of research in computation. The first of these is the use of Sum of Squares (SOS) techniques for synthesis of control policies. A candidate polynomial with variable coefficients is proposed as the solution to the stochastic optimal control problem. An SOS relaxation is then taken to the partial differential constraints, leading to a hierarchy of semidefinite relaxations with improving sub-optimality gap. The resulting approximate solutions are shown to be guaranteed over- and under-approximations for the optimal value function. It is shown that these results extend to arbitrary parabolic and elliptic PDEs, yielding a novel method for Uncertainty Quantification (UQ) of systems governed by partial differential constraints. Domain decomposition techniques are also made available, allowing for such problems to be solved via parallelization and low-order polynomials.

The optimization-based SOS technique is then contrasted with the Separated Representation (SR) approach from the applied mathematics community. The SR technique allows systems of equations to be solved through a low-rank decomposition, yielding algorithms that scale linearly with dimensionality. Its application in stochastic optimal control allows previously uncomputable problems to be solved quickly, scaling to complex systems such as quadcopter and VTOL aircraft models. This technique may be combined with the SOS approach, yielding not only a numerical technique but also an analytical one that allows entirely new classes of systems to be studied and stability properties to be guaranteed.

The analysis of the linear HJB is completed by studying its implications in application. It is shown that the HJB and a popular technique in robotics, the use of navigation functions, sit at opposite ends of a spectrum of optimization problems along which tradeoffs in problem complexity may be made. Analytical solutions to the HJB in these settings are available in simplified domains, yielding guidance towards optimality for approximation schemes. Finally, the use of HJB equations in temporal multi-task planning problems is investigated. It is demonstrated that such problems are reducible to a sequence of SOC problems linked via boundary conditions. The linearity of the PDE allows control policy primitives to be pre-computed and then composed, at essentially zero cost, to satisfy a complex temporal logic specification.

Relevance: 60.00%

Abstract:

Long-term contractual decisions are the basis of efficient risk management. However, such decisions have to be supported by a robust price-forecast methodology. This paper reports a different approach to long-term price forecasting that tries to meet that need. Making use of regression models, the proposed methodology's main objective is to find the maximum and minimum Market Clearing Price (MCP) for a specific programming period, with a desired confidence level α. Due to the problem complexity, the meta-heuristic Particle Swarm Optimization (PSO) was used to find the best regression parameters, and the results were compared with those obtained using a Genetic Algorithm (GA). To validate these models, results from realistic data are presented and discussed in detail.
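
As an illustration of the fitting step only: a generic PSO minimizing the squared error of a linear regression on fabricated data. The paper's actual regression models, price data, and PSO settings are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: past market-clearing prices vs. one explanatory variable.
x = rng.uniform(0, 10, 50)
prices = 3.0 * x + 5.0 + rng.normal(0, 2.0, 50)

def sse(params: np.ndarray) -> float:
    """Sum of squared errors for a candidate (slope, intercept) pair."""
    a, b = params
    return float(np.sum((prices - (a * x + b)) ** 2))

# Minimal PSO: each particle is a (slope, intercept) candidate.
n, w, c1, c2 = 30, 0.7, 1.5, 1.5
pos = rng.uniform(-10, 10, (n, 2))
vel = np.zeros((n, 2))
pbest, pbest_val = pos.copy(), np.array([sse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(200):
    r1, r2 = rng.random((n, 2)), rng.random((n, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([sse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("fitted slope, intercept:", gbest)  # should approach (3.0, 5.0)
```

The same loop works unchanged for nonlinear regression forms: only `sse` needs to be swapped, which is why population-based heuristics are attractive when the regression structure makes least squares awkward.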

Relevance: 60.00%

Abstract:

Certificateless public key encryption can be classified into two types, namely CLE and CLE†, both of which were introduced by Al-Riyami and Paterson at Asiacrypt 2003. Most work on certificateless public key encryption belongs to CLE, where the partial secret key is uniquely determined by an entity's identity. In CLE†, an entity's partial secret key is determined not only by the identity information but also by his/her (partial) public key. Such techniques can enhance the resilience of certificateless public key encryption against a cheating KGC. In this paper, we first formalize the security definitions of CLE†. After that, we demonstrate the gap between the security models of CLE† and CLE by showing the insecurity of a CLE† scheme proposed by Lai and Kou at PKC 2007. We give an attack that successfully breaks the indistinguishability of their CLE† scheme, although the scheme can be proved secure in the security model of CLE. Therefore, it does not suffice to consider the security of CLE† in the security model of CLE. Finally, we show how to secure Lai-Kou's scheme by providing a new scheme with a security proof in the model of CLE†.