988 results for problem-finding
Abstract:
This research focuses on creativity and innovation management in organizations. We present an intervention model that aims at establishing a culture of organizational innovation through the internal development of individual and team creativity, focused on problem solving. The model relies on management's commitment and on the organization's talented people (creative leaders and employees), drawing on their ability to define a better organization. The design follows Min Basadur's problem-solving approach, consisting of problem finding, fact finding, problem definition, solution finding, and decision implementation. These steps are carried out using specific techniques and procedures that link creative people and management in order to initiate the process until problems are defined. For each defined problem, project teams will develop possible solutions and implement these decisions. Thus, a system for transforming individual and team creativity into organizational innovation can be established.
Abstract:
Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seem self-evident and inevitable after the event. For the artist/creator/inventor/designer stuck at the point just before the creative leap, however, the next step is anything but obvious. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity; it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved; indeed, the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question arises: can we find some way of searching the space ahead? Of course, there are serious problems in knowing what we are looking for and in the vastness of the search space. It may be better to discard the term "searching" altogether in the context of the design process. Conceptual analogies such as search, search spaces, and fitness landscapes aim to elucidate the design process, but the vastness of the multidimensional spaces involved makes these analogies misguided, and they actually further confound the issue. The term "search" becomes a misnomer, since it connotes that it is possible to find what you are looking for; in such vast spaces the term must be discarded. Thus, any attempt to search for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious.
Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless, we still have the tantalizing possibility that, if a creative idea seems inevitable after the event, the process might somehow be reversed. This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). Nevertheless, Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as evolutionary algorithms, which are usually thought of as search algorithms. It is necessary to abandon such connections with searching and to see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments.
Most importantly, nature has all the time in the world. As designers we can afford neither such profligate prototyping and ruthless experimentation, nor the time scale of the natural design process. Instead, we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
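The "profligate prototyping, ruthless elimination" loop described above can be sketched as a minimal evolutionary algorithm. The population size, bit-string encoding, and the one-line "viability" score below are illustrative assumptions, not part of the original argument; the point of the sketch is that the loop generates and culls alternatives rather than searching for a known target.

```python
import random

def evolve(pop_size=30, genome_len=16, generations=50, seed=1):
    """Toy evolutionary loop in the spirit described above: generate
    many alternative 'prototypes', eliminate the less successful ones,
    and keep experimenting, with no single optimal target in mind."""
    rng = random.Random(seed)
    # A toy 'viability' score (an assumption for illustration only):
    # the count of 1-bits, standing in for any local comparison criterion.
    def viability(g):
        return sum(g)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Profligate prototyping: every survivor spawns a mutated copy.
        offspring = []
        for g in pop:
            child = g[:]
            i = rng.randrange(genome_len)
            child[i] ^= 1  # flip one bit
            offspring.append(child)
        # Ruthless elimination: keep only the better half of the pool.
        pool = pop + offspring
        pool.sort(key=viability, reverse=True)
        pop = pool[:pop_size]
    return pop
```

Note that selection here only ever compares nearby alternatives in the pool, echoing the point above that no global optimum is sought.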
Abstract:
The Variational Asymptotic Method (VAM) is used for modeling a coupled non-linear electromechanical problem with applications in aircraft and Micro Aerial Vehicle (MAV) development. VAM coupled with geometrically exact kinematics forms a powerful tool for analyzing complex nonlinear phenomena, as shown previously by many in the literature [3-7] for challenging problems such as modeling initially twisted helicopter rotor blades, matrix crack propagation in composites, modeling of hyperelastic plates, and various multi-physics problems. The problem consists of the design and analysis of a piezocomposite laminate to which electrical voltage(s) are applied, inducing direct and planar distributed shear stresses and strains in the structure. The deformations are large, and conventional beam theories are inappropriate for the analysis. The behavior of an elastic body is completely characterized by its energy. This energy must be integrated over the cross-sectional area to obtain the 1-D behavior, as is typical in a beam analysis. VAM can be used efficiently to approximate the 3-D strain energy as closely as possible. To perform this simplification, VAM uses the thickness-to-width ratio, the width-to-length ratio, the width multiplied by the initial twist, and the strain as small parameters embedded in the problem definition, and provides a way to approach the exact solution asymptotically. In this work, the above-mentioned electromechanical problem is modeled using VAM, which breaks the 3-D elasticity problem down into two parts: a 2-D non-linear cross-sectional analysis and a 1-D non-linear analysis along the reference curve. The recovery relations obtained as a by-product of the cross-sectional analysis are used to obtain 3-D stresses, displacements, and velocity contours.
The piezocomposite laminate chosen for the initial phase of computational modeling is made up of commercially available Macro Fiber Composites (MFCs) stacked in an arbitrary lay-up and actuated by applied electrical voltages. The expressions for sectional forces and moments, obtained in closed form from the cross-sectional analysis, show the electromechanical coupling and the relative contribution of the electric field in individual layers of the piezocomposite laminate. The spatial and temporal constitutive laws obtained from the cross-sectional analysis are substituted into the 1-D fully intrinsic, geometrically exact equilibrium equations of motion and the 1-D intrinsic kinematical equations to solve for all 1-D generalized variables as functions of time and of the coordinate along the reference curve, x(1).
Abstract:
For most people, design is a mystery. The products of design are integrated into our daily lives to the point that design has become invisible to us. However, what is subsumed in design practice is a creative problem-solving process that is applicable as a teaching strategy as well as a method for teaching the subject of design. The purpose of this study was to inquire into the current classroom practice of Ontario Visual Arts and Technological Education teachers, understand the goals of Ontario government curriculum developers, and explore the position held by the professional design community on secondary school design education. Data for this study were collected from: (a) a textual analysis of 4 Ministry curriculum documents; (b) interviews with 10 stakeholders; (c) unobtrusive observations and informal conversations conducted at 7 secondary school open house events; and (d) observation of 2 sessions of an AQ course for Design and Technology. The research design modeled the design process and was divided into 2 parts: a discovery or problem-finding phase and a discussion or problem-solving phase. The results showed that design is misunderstood and misused; it has become lost between visual arts and technology, where neither program holds responsibility for its delivery; students mistake working on computers for design practice; and while there is a desire within the professional community to have a voice in secondary school design education, there is no forum for participation. The technology-driven paradigm shift taking place in society today calls for a new framework for teaching and practicing design. Further research is required; however, in the meantime, secondary school educators might benefit from professional development and classroom support from the professional design community.
Abstract:
The aim of this work is to study extensions of subcopulas. An important use of such extensions is the nonparametric estimation of a copula by smoothing a subcopula (the empirical copula). When the resulting estimator is a copula, this estimator is an extension of the subcopula. Chapter 2 of the thesis presents the construction and uniform convergence of a bona fide estimator of a copula or of a copula density. This estimator is an empirical-copula-type extension based on smoothing by tensor products of spline distribution functions. Chapter 3 characterizes the set of possible extensions of a subcopula. This topic has been treated in the past, but the proposed constructions do not apply to dependence in very general spaces. Chapter 4 tackles the following problem posed by [Carley, 2002]: finding the upper bound of the three-dimensional extensions of a subcopula with finite domain.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
The use of collaborative assignments for assessment is a risky undertaking for students and course designers. Yet the benefits, in terms of core learning outcomes, competencies, collaborative sense-making, and student involvement, suggest that the effort is worthwhile. Formal descriptions and rules do little to ameliorate students' perception of risk and increased anxiety (Ryan, 2007). BEB100 Introducing Professional Learning is a faculty-wide foundation unit with over 1300 students from 19 disciplines across the Faculty of the Built Environment and Engineering ("BEE") at the Queensland University of Technology ("QUT"), Brisbane, Australia. Finding order in chaos outlines the approach and justification, assessment criteria, learning resources, teamwork tools, tutorial management, communication strategies, 2007-09 Student Learning Experience Survey results, annual improvements, findings, and outcomes.
Abstract:
Tanner graph representation of linear block codes is widely used by iterative decoding algorithms for recovering data transmitted across a noisy communication channel from errors and erasures introduced by the channel. The stopping distance of a Tanner graph T for a binary linear block code C determines the number of erasures correctable using iterative decoding on T when data is transmitted across a binary erasure channel using the code C. We show that the problem of finding the stopping distance of a Tanner graph is hard to approximate within any positive constant approximation ratio in polynomial time unless P = NP. As a consequence, there can be no approximation algorithm for the problem achieving an approximation ratio of 2^((log n)^(1-ε)) for any ε > 0 unless NP ⊆ DTIME(n^(poly(log n))).
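To make the quantity concrete, a brute-force stopping-distance routine can be sketched as follows. The parity-check-matrix representation and exhaustive enumeration below are illustrative assumptions; the exponential running time is consistent with the inapproximability result stated in the abstract.

```python
from itertools import combinations

def stopping_distance(H):
    """Brute-force stopping distance of the Tanner graph given by a
    binary parity-check matrix H (a list of rows).  A stopping set is
    a nonempty set S of variable nodes such that every check node with
    a neighbour in S has at least two neighbours in S; the stopping
    distance is the size of the smallest stopping set."""
    n = len(H[0])
    for size in range(1, n + 1):
        for S in combinations(range(n), size):
            ok = True
            for row in H:
                # A check with exactly one neighbour in S rules S out.
                if sum(row[v] for v in S) == 1:
                    ok = False
                    break
            if ok:
                return size
    return None  # no nonempty stopping set (cannot happen for n >= 1 with all-ones S unless some check has degree 1 in S)
```

For example, for the small matrix `[[1, 1, 0], [0, 1, 1]]`, every set of one or two variable nodes leaves some check with a single neighbour, so the stopping distance is 3.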
Abstract:
Some dynamic properties of a light ray suffering specular reflections inside a periodically corrugated waveguide are studied. The dynamics of the model is described in terms of a two-dimensional nonlinear area-preserving map. We show that the phase space is mixed in the sense that there are KAM islands surrounded by a large chaotic sea that is confined by two invariant spanning curves. We have used a connection with the Standard Mapping near a transition from local to global chaos and found the position of these two invariant spanning curves limiting the size of the chaotic sea as a function of the control parameter.
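The Standard Mapping invoked above is the Chirikov standard map, whose transition from local to global chaos occurs near K ≈ 0.9716. The paper's specific corrugated-waveguide map is not reproduced in the abstract, so the sketch below shows only the standard map it is compared against:

```python
import math

def standard_map(theta, p, K, steps):
    """Iterate the Chirikov standard map, the reference system used to
    locate the transition from local to global chaos:
        p'     = p + K * sin(theta)   (mod 2*pi)
        theta' = theta + p'           (mod 2*pi)
    K is the control parameter; the map is area-preserving."""
    traj = []
    for _ in range(steps):
        p = (p + K * math.sin(theta)) % (2 * math.pi)
        theta = (theta + p) % (2 * math.pi)
        traj.append((theta, p))
    return traj
```

Plotting trajectories for several initial conditions at K below and above the critical value would show the invariant spanning curves breaking up and the chaotic sea becoming global.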
Abstract:
In this paper we deal with the problem of obtaining the set of k-additive measures dominating a fuzzy measure. This problem extends the problem of deriving the set of probabilities dominating a fuzzy measure, an important problem appearing in Decision Making and Game Theory. The solution proposed in the paper follows the line developed by Chateauneuf and Jaffray for dominating probabilities and continued by Miranda et al. for dominating k-additive belief functions. Here, we address the general case, transforming the problem into a similar one such that the involved set functions have non-negative Möbius transform; this simplifies the problem and allows a result similar to the one developed for belief functions. Although the set obtained is very large, we show that the conditions cannot be sharpened. On the other hand, we also show that it is possible to define a more restrictive subset, providing a more natural extension of the result for probabilities, from which any k-additive dominating measure can be derived.
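A small sketch of the Möbius transform referred to above may help: a fuzzy measure is k-additive precisely when its Möbius transform vanishes on all subsets with more than k elements. The dictionary-based representation below is an illustrative assumption:

```python
from itertools import chain, combinations

def subsets(A):
    """All subsets of the iterable A, as tuples."""
    A = list(A)
    return chain.from_iterable(combinations(A, r) for r in range(len(A) + 1))

def mobius(v, universe):
    """Möbius transform m of a set function v on a finite universe:
        m(A) = sum over B subset of A of (-1)^(|A| - |B|) * v(B).
    v is a dict keyed by frozensets; the result is keyed the same way."""
    m = {}
    for A in subsets(universe):
        A = frozenset(A)
        m[A] = sum((-1) ** (len(A) - len(B)) * v[frozenset(B)]
                   for B in subsets(A))
    return m
```

For an additive (i.e. 1-additive) measure such as v(∅)=0, v({1})=0.4, v({2})=0.6, v({1,2})=1.0, the transform vanishes on the two-element set, as expected.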
Abstract:
We consider the problem of how to efficiently and safely design dose finding studies. Both current and novel utility functions are explored using Bayesian adaptive design methodology for the estimation of a maximum tolerated dose (MTD). In particular, we explore widely adopted approaches such as the continual reassessment method and minimizing the variance of the estimate of an MTD. New utility functions are constructed in the Bayesian framework and are evaluated against current approaches. To reduce computing time, importance sampling is implemented to re-weight posterior samples, thus avoiding the need to draw samples using Markov chain Monte Carlo techniques. Further, as such studies are generally first-in-man, the safety of patients is paramount. We therefore explore methods for the incorporation of safety considerations into utility functions to ensure that only safe and well-predicted doses are administered. The amalgamation of Bayesian methodology, adaptive design and compound utility functions is termed adaptive Bayesian compound design (ABCD). The performance of this amalgamation of methodology is investigated via the simulation of dose finding studies. The paper concludes with a discussion of results and extensions that could be included into our approach.
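The importance-sampling step described above, re-weighting posterior draws instead of running MCMC, can be sketched roughly as follows. The one-parameter dose-toxicity model `toxicity_prob` is a hypothetical stand-in for illustration, not the paper's working model:

```python
import math

def toxicity_prob(dose, beta):
    # Illustrative one-parameter logistic dose-toxicity model
    # (an assumption; the paper's own working model is not given here).
    return 1.0 / (1.0 + math.exp(-(beta * dose - 3.0)))

def reweight_posterior(prior_samples, doses, outcomes):
    """Importance-sampling re-weighting: prior draws of beta are
    weighted by the likelihood of the observed (dose, toxicity)
    outcomes, yielding normalized posterior weights without MCMC."""
    weights = []
    for beta in prior_samples:
        log_lik = 0.0
        for d, y in zip(doses, outcomes):
            p = toxicity_prob(d, beta)
            log_lik += math.log(p if y == 1 else 1.0 - p)
        weights.append(math.exp(log_lik))
    total = sum(weights)
    return [w / total for w in weights]
```

Posterior summaries (e.g. the estimated MTD, or the variance-based utilities mentioned above) can then be computed as weighted averages over the re-weighted draws.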
Abstract:
Entity-oriented search has become an essential component of modern search engines. It focuses on retrieving a list of entities, or information about specific entities, instead of documents. In this paper, we study the problem of finding entity-related information, referred to as attribute-value pairs, which plays a significant role in searching for target entities. We propose a novel decomposition framework combining reduced relations and the discriminative model, Conditional Random Field (CRF), for automatically finding entity-related attribute-value pairs in free-text documents. This decomposition framework allows us to locate potential text fragments and identify the hidden semantics, in the form of attribute-value pairs, for user queries. Empirical analysis shows that the decomposition framework outperforms pattern-based approaches due to its capability of effectively integrating syntactic and semantic features.
Abstract:
In this paper we analyse properties of the message expansion algorithm of SHA-1 and describe a method for finding differential patterns that may be used to attack reduced versions of SHA-1. We show that the problem of finding optimal differential patterns for SHA-1 is equivalent to the problem of finding a minimum-weight codeword in a large linear code. Finally, we present a number of patterns of different lengths suitable for finding collisions and near-collisions, and discuss some bounds on their minimal weights.
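The message expansion in question is the standard SHA-1 recurrence W[t] = ROTL1(W[t-3] XOR W[t-8] XOR W[t-14] XOR W[t-16]). Since every step is linear over GF(2), XOR differences propagate linearly through it, and the set of expanded difference patterns forms a linear code, which is the equivalence exploited above. A minimal sketch:

```python
def rotl32(x, n):
    """Rotate a 32-bit word left by n bits."""
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def sha1_expand(words):
    """SHA-1 message expansion: the 16 input words W[0..15] are
    expanded to 80 words via
        W[t] = ROTL1(W[t-3] ^ W[t-8] ^ W[t-14] ^ W[t-16]).
    Every operation is linear over GF(2), so the expansion of the XOR
    of two messages equals the XOR of their expansions."""
    W = list(words)
    for t in range(16, 80):
        W.append(rotl32(W[t - 3] ^ W[t - 8] ^ W[t - 14] ^ W[t - 16], 1))
    return W
```

The linearity can be checked directly: for any two 16-word messages a and b, `sha1_expand([x ^ y for x, y in zip(a, b)])` equals the word-wise XOR of `sha1_expand(a)` and `sha1_expand(b)`.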
Abstract:
Information available on company websites can help people navigate to the offices of groups and individuals within the company. Automatically retrieving this within-organisation spatial information is a challenging AI problem. This paper introduces a novel unsupervised pattern-based method to extract within-organisation spatial information by taking advantage of HTML structure patterns, together with a novel Conditional Random Fields (CRF) based method to identify different categories of within-organisation spatial information. The results show that the proposed method can achieve a high performance in terms of F-Score, indicating that this purely syntactic method based on web search and an analysis of HTML structure is well-suited to retrieving within-organisation spatial information.