56 results for theory of the dependence of resource
Abstract:
So far, social psychology in sport has primarily focused on team cohesion, and many studies and meta-analyses have tried to demonstrate a relation between the cohesiveness of a team and its performance. How a team really co-operates and how individual actions are integrated into a team action is a question that has received relatively little attention in research. This may, at least in part, be due to the lack of a theoretical framework for collective actions, a dearth that has only recently begun to challenge sport psychologists. In this presentation a framework for a comprehensive theory of teams in sport is outlined, and its potential to integrate the following presentations is put up for discussion. Based on a model developed by von Cranach, Ochsenbein and Valach (1986), teams are information-processing organisms, and team actions need to be investigated on two levels: the individual team member and the group as an entity. Elements to be considered are the task, the social structure, the information-processing structure and the execution structure. Obviously, different tasks require different social structures, communication and co-ordination. From a cognitivist point of view, internal representations (or mental models) guide behaviour mainly in situations requiring quick reactions and adaptations, where deliberate or contingency planning is difficult. In sport teams, the collective representation contains the elements of the team situation, that is the team task and team members, and of the team processes, that is communication and co-operation. Different meta-perspectives may be distinguished and bear a potential to explain the actions of efficient teams. Cranach, M. von, Ochsenbein, G., & Valach, L. (1986). The group as a self-active system: Outline of a theory of group action. European Journal of Social Psychology, 16, 193-229.
Abstract:
The first section of this chapter starts with the Buffon problem, which is one of the oldest in stochastic geometry, and then continues with the definition of measures on the space of lines. The second section defines random closed sets and related measurability issues, explains how to characterize distributions of random closed sets by means of capacity functionals and introduces the concept of a selection. Based on this concept, the third section starts with the definition of the expectation and proves its convexifying effect that is related to the Lyapunov theorem for ranges of vector-valued measures. Finally, the strong law of large numbers for Minkowski sums of random sets is proved and the corresponding limit theorem is formulated. The chapter is concluded by a discussion of the union-scheme for random closed sets and a characterization of the corresponding stable laws.
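As a hedged sketch of two notions central to this chapter (the notation is mine, following standard random-set conventions, not necessarily the chapter's own), the capacity functional and the Minkowski-sum law of large numbers can be written as:

```latex
% Capacity functional of a random closed set X in R^d: it records the hitting
% probabilities of compact sets K and, by the Choquet theorem, characterizes
% the distribution of X.
T_X(K) = \mathbb{P}\{X \cap K \neq \emptyset\}, \qquad K \subset \mathbb{R}^d \ \text{compact}.

% Strong law of large numbers for Minkowski sums of i.i.d. integrably bounded
% random compact sets: convergence holds almost surely in the Hausdorff metric,
% and the limit is the (convex) selection expectation of X_1 -- this convexity
% is the convexifying effect related to the Lyapunov theorem.
\frac{1}{n}\,(X_1 \oplus X_2 \oplus \cdots \oplus X_n) \;\longrightarrow\; \mathbb{E}[X_1] \quad \text{a.s.}
```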
Abstract:
Researchers suggest that personalization on the Semantic Web eventually adds up to a Web 3.0. In this Web, personalized agents, rather than humans, process and thus generate the biggest share of information. In the sense of emergent semantics, which supplements the traditional formal semantics of the Semantic Web, this is well conceivable. An emergent Semantic Web underlying a fuzzy grassroots ontology can be accomplished by inducing knowledge from users' common parlance in mutual Web 2.0 interactions [1]. These ontologies can also be matched against existing Semantic Web ontologies to create comprehensive top-level ontologies. On the Web, if augmented with information in the form of restrictions and associated reliability (Z-numbers) [2], this collection of fuzzy ontologies constitutes an important basis for an implementation of Zadeh's restriction-centered theory of reasoning and computation (RRC) [3]. By considering the real world's fuzziness, RRC differs from traditional approaches because it can handle restrictions described in natural language. A restriction is an answer to a question about the value of a variable, such as the duration of an appointment. In addition to mathematically well-defined answers, RRC can likewise deal with unprecisiated answers such as "about one hour." Inspired by mental functions, it constitutes an important basis for leveraging present-day Web efforts into a natural Web 3.0. Based on natural-language information, RRC may be accomplished with Z-number calculation to achieve personalized Web reasoning and computation. Finally, through Web agents' understanding of natural language, they can react to humans more intuitively and thus generate and process information.
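As an illustrative sketch only (the function names and numeric choices are my own assumptions, not taken from [2] or [3]), a restriction such as "about one hour" can be modelled as a fuzzy set via a membership function, and a Z-number pairs it with a fuzzy reliability:

```python
def triangular(a, b, c):
    """Membership function of a triangular fuzzy set with support [a, c] and peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)  # rising edge
        return (c - x) / (c - b)      # falling edge
    return mu

# "about one hour": peak at 60 minutes, support 45-75 minutes (assumed values)
about_one_hour = triangular(45, 60, 75)

# A Z-number is a pair (A, B): a restriction on the variable's value and a
# fuzzy restriction on its reliability, e.g. ("about one hour", "quite sure").
quite_sure = triangular(0.7, 0.85, 1.0)
z_duration = (about_one_hour, quite_sure)
```

The point of the sketch is that "about one hour" is no longer a crisp interval: 60 minutes fits the restriction fully, 52 minutes only partially, and 40 minutes not at all.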
Abstract:
The paper deals with batch scheduling problems in process industries, where final products arise from several successive chemical or physical transformations of raw materials using multi-purpose equipment. In batch production mode, the total requirements of intermediate and final products are partitioned into batches. The production start of a batch at a given level requires the availability of all input products. We consider the problem of scheduling the production of given batches such that the makespan is minimized. Constraints like minimum and maximum time lags between successive production levels, sequence-dependent facility setup times, finite intermediate storages, production breaks, and time-varying manpower contribute to the complexity of this problem. We propose a new solution approach using models and methods of resource-constrained project scheduling, which (approximately) solves problems of industrial size within a reasonable amount of time.
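A minimal sketch of the earliest-start step at the heart of such project-scheduling approaches, with hypothetical batch data and ignoring the resource, setup, storage, and manpower constraints the paper actually handles:

```python
def serial_schedule(durations, predecessors):
    """Earliest-start schedule under finish-start precedence: each batch starts
    once all of its input batches have finished. Batches are assumed to be
    listed in an order where predecessors come first (a topological order)."""
    finish = {}
    for batch in durations:  # dicts preserve insertion order in Python 3.7+
        start = max((finish[p] for p in predecessors.get(batch, [])), default=0)
        finish[batch] = start + durations[batch]
    return finish, max(finish.values())

durations = {"A": 3, "B": 2, "C": 4}          # processing times of three batches
predecessors = {"B": ["A"], "C": ["A", "B"]}  # B consumes A; C consumes A and B
finish, makespan = serial_schedule(durations, predecessors)
# For this chain A -> B -> C the makespan is 3 + 2 + 4 = 9.
```

With resource and setup constraints added, the start time would additionally be pushed back until capacity is available, which is what makes the industrial problem hard.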
Abstract:
Introduction So far, social psychology in sport has primarily focused on team cohesion, and many studies and meta-analyses have tried to demonstrate a relation between the cohesiveness of a team and its performance. How a team really co-operates and how individual actions are integrated into a team action is a question that has received relatively little attention in research. This may, at least in part, be due to the lack of a theoretical framework for collective actions, a dearth that has only recently begun to challenge sport psychologists. Objectives In this presentation a framework for a comprehensive theory of teams in sport is outlined, and its potential to integrate research in the domain of team performance and, more specifically, the following presentations, is put up for discussion. Method Based on a model developed by von Cranach, Ochsenbein and Valach (1986), teams are considered to be information-processing organisms, and team actions need to be investigated on two levels: the individual team member and the group as an entity. Elements to be considered are the task, the social structure, the information-processing structure and the execution structure. Obviously, different tasks require different social structures, communication processes and co-ordination of individual movements. Especially in rapid interactive sports, planning and execution of movements based on feedback loops are not possible. Deliberate planning may be a solution mainly for offensive actions, whereas defensive actions have to adjust to the opponent team's actions. Consequently, mental representations must be developed to allow a feed-forward regulation of team members' actions. Results and Conclusions Some preliminary findings based on this conceptual framework, as well as further consequences for empirical investigations, will be presented. References Cranach, M. von, Ochsenbein, G., & Valach, L. (1986). The group as a self-active system: Outline of a theory of group action. European Journal of Social Psychology, 16, 193-229.
Abstract:
We define an applicative theory of truth TPT which proves totality exactly for the polynomial time computable functions. TPT has natural and simple axioms since nearly all its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate. The truth predicate can only reflect elementhood in the words for terms that have smaller length than a given word. This makes it possible to achieve the very low proof-theoretic strength. Truth induction can be allowed without any constraints. For these reasons the system TPT has the high expressive power one expects from truth theories. It allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy. It is not possible to apply a standard realisation approach. For this reason we develop a new realisation approach whose realisation functions work on directed acyclic graphs. In this way, we can express and manipulate realisation information more efficiently.
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic and strong forces. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, such as the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects, such as closed and open strings. In this short note, I will give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.