874 results for Theory of Complex Socialization
Abstract:
Introduction: So far, social psychology in sport has primarily focused on team cohesion, and many studies and meta-analyses have tried to demonstrate a relation between the cohesiveness of a team and its performance. How a team really co-operates, and how individual actions are integrated into a team action, is a question that has received relatively little attention in research. This may, at least in part, be due to the lack of a theoretical framework for collective actions, a dearth that has only recently begun to challenge sport psychologists. Objectives: In this presentation a framework for a comprehensive theory of teams in sport is outlined, and its potential to integrate research in the domain of team performance and, more specifically, the following presentations, is put up for discussion. Method: Based on a model developed by von Cranach, Ochsenbein and Valach (1986), teams are considered to be information-processing organisms, and team actions need to be investigated on two levels: the individual team member and the group as an entity. Elements to be considered are the task, the social structure, the information-processing structure and the execution structure. Obviously, different tasks require different social structures, communication processes and co-ordination of individual movements. Especially in rapid interactive sports, planning and execution of movements based on feedback loops are not possible. Deliberate planning may be a solution mainly for offensive actions, whereas defensive actions have to adjust to the opponent team's actions. Consequently, mental representations must be developed to allow a feed-forward regulation of team members' actions. Results and Conclusions: Some preliminary findings based on this conceptual framework, as well as further consequences for empirical investigations, will be presented. References: Cranach, M. v., Ochsenbein, G. & Valach, L. (1986). The group as a self-active system: Outline of a theory of group action. European Journal of Social Psychology, 16, 193-229.
Abstract:
Recent experiments revealed that the fruit fly Drosophila melanogaster has a dedicated mechanism for forgetting: blocking the G-protein Rac leads to slower forgetting, and activating Rac to faster forgetting. This active form of forgetting lacks a satisfactory functional explanation. We investigated optimal decision making for an agent adapting to a stochastic environment in which a stimulus may switch between being indicative of reward or punishment. Like Drosophila, an optimal agent shows forgetting, at a rate linked to the time scale of changes in the environment. Moreover, to reduce the odds of missing future reward, an optimal agent may trade the risk of immediate pain for information gain and thus forget faster after aversive conditioning. A simple neuronal network reproduces these features. Our theory shows that forgetting in Drosophila appears as an optimal adaptive behavior in a changing environment. This is in line with the view that forgetting is adaptive rather than a consequence of limitations of the memory system.
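The link between forgetting rate and environmental volatility can be sketched with a minimal Bayesian agent; the observation likelihoods (0.9/0.1) and the update rule below are illustrative assumptions, not the paper's model:

```python
def update_belief(belief, outcome, switch_prob):
    """One update of the belief that a stimulus predicts reward, in an
    environment whose contingency may flip each step with probability
    switch_prob. The 0.9/0.1 likelihoods are illustrative assumptions."""
    # Probability of the observed outcome under each hypothesis.
    p_if_rewarding = 0.9 if outcome == 1 else 0.1
    p_if_punishing = 0.1 if outcome == 1 else 0.9
    # Bayes' rule.
    num = p_if_rewarding * belief
    posterior = num / (num + p_if_punishing * (1.0 - belief))
    # A possible switch pulls the belief back toward ignorance (0.5);
    # this decay is the "forgetting", and its rate is set by the
    # environment's switch probability, as for the optimal agent.
    return (1 - switch_prob) * posterior + switch_prob * (1 - posterior)
```

A higher switch probability drags any learned belief back toward 0.5 faster, mirroring the finding that the optimal forgetting rate tracks the time scale of environmental change.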
Abstract:
(Full text is available at http://www.manu.edu.mk/prilozi). New-generation genomic platforms enable us to decipher the complex genetic basis of complex diseases and Balkan Endemic Nephropathy (BEN) on a high-throughput basis. They give valuable information about predisposing Single Nucleotide Polymorphisms (SNPs), Copy Number Variations (CNVs) or Loss of Heterozygosity (LOH) (using SNP-arrays) and about disease-causing mutations along the whole sequence of candidate genes (using Next Generation Sequencing). This information could be used for screening individuals in at-risk families and for shifting mainstream medicine toward prevention. It might also enable more effective treatment. Here we discuss these genomic platforms and report some applications of SNP-array technology in a case of familial nephrotic syndrome. Key words: complex diseases, genome-wide association studies, SNP, genomic arrays, next generation sequencing.
Abstract:
Modeling future water systems at the regional scale is a difficult task owing to the complexity of current structures (multiple competing water uses, multiple actors, formal and informal rules), both temporally and spatially. Representing this complexity in the modeling process is a challenge that can be addressed by an interdisciplinary and holistic approach. The assessment of the water system of the Crans-Montana-Sierre area (Switzerland) and its evolution until 2050 was tackled by combining glaciological, hydrogeological, and hydrological measurements and modeling with the evaluation of water use through documentary, statistical and interview-based analyses. Four visions of future regional development were co-produced with a group of stakeholders and then used as a basis for estimating future water demand. Comparing the available water resource with the water demand at a monthly time scale allowed us to conclude that, for all four scenarios, socioeconomic factors will affect future water systems more than climatic factors. An analysis of the sustainability of the current and future water systems based on the four visions of regional development allowed us to identify the scenarios that will be more sustainable and that should be adopted by decision-makers. The results were then presented to the stakeholders through five key messages. The challenges of communicating results with stakeholders in this way are discussed at the end of the article.
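The monthly resource-versus-demand comparison can be sketched as a simple balance check; the numbers below are invented for illustration and are not the study's data:

```python
def monthly_deficit_months(resource, demand):
    """Compare available water resource and demand month by month and
    return the indices of months (0 = January) in which demand exceeds
    the resource. Units are arbitrary but must match (e.g. million m3)."""
    if len(resource) != 12 or len(demand) != 12:
        raise ValueError("expected 12 monthly values")
    return [m for m in range(12) if demand[m] > resource[m]]

# Illustrative (invented) values: a snow- and glacier-fed resource peaking
# in early summer against an irrigation- and tourism-driven demand.
resource = [30, 28, 35, 50, 80, 95, 70, 55, 45, 40, 32, 30]
demand   = [25, 24, 30, 40, 60, 75, 90, 85, 50, 35, 28, 26]
```

Running the check on these values flags late summer, when demand outlasts the melt-driven supply peak, as the critical period.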
Abstract:
We develop further the effective fluid theory of stationary branes. This formalism applies to stationary blackfolds as well as to other equilibrium brane systems at finite temperature. The effective theory is described by a Lagrangian containing the information about the elastic dynamics of the brane embedding as well as the hydrodynamics of the effective fluid living on the brane. The Lagrangian is corrected order-by-order in a derivative expansion, where we take into account the dipole moment of the brane which encompasses finite-thickness corrections, including transverse spin. We describe how to extract the thermodynamics from the Lagrangian and we obtain constraints on the higher-derivative terms with one and two derivatives. These constraints follow by comparing the brane thermodynamics with the conserved currents associated with background Killing vector fields. In particular, we fix uniquely the one- and two-derivative terms describing the coupling of the transverse spin to the background space-time. Finally, we apply our formalism to two blackfold examples, the black tori and charged black rings and compare the latter to a numerically generated solution.
Abstract:
We define an applicative theory of truth TPT which proves totality exactly for the polynomial-time computable functions. TPT has natural and simple axioms, since nearly all its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate: the truth predicate can only reflect elementhood in the words for terms that have smaller length than a given word. This makes it possible to achieve the theory's very low proof-theoretic strength. Truth induction can be allowed without any constraints. For these reasons the system TPT has the high expressive power one expects from truth theories; it allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy: a standard realisation approach cannot be applied, so we develop a new realisation approach whose realisation functions work on directed acyclic graphs. In this way, we can express and manipulate realisation information more efficiently.
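The efficiency gain from working on directed acyclic graphs rather than on terms as trees can be illustrated with hash-consing, where structurally equal subterms are shared; this is a schematic analogy, not TPT's actual realisation construction:

```python
def make_node(op, children, pool):
    """Create a term node, reusing any structurally equal node already in
    the pool. Terms are thereby stored as DAGs: a term whose tree form has
    exponentially many nodes may need only linearly many shared nodes."""
    key = (op, children)
    return pool.setdefault(key, key)

pool = {}
t = make_node("a", (), pool)
for _ in range(30):
    # The tree size doubles at each step (2**30 leaves at the end) ...
    t = make_node("pair", (t, t), pool)
# ... but the shared DAG pool holds only 31 distinct nodes.
```

Manipulating the shared structure instead of the unfolded tree is what keeps the bookkeeping polynomially bounded, which is the kind of saving DAG-based realisation information exploits.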
Abstract:
Purpose In this study, we show the use of three-dimensional printing models for preoperative planning of surgery for patients with complex aortic arch anomalies. Description A 70-year-old man with an extensively arteriosclerotic aneurysm reaching from the ascending aorta to the descending aorta was referred to our center for complete aortic arch replacement. We visualized and reconstructed computed tomography data of the patient and fabricated a flexible three-dimensional model of the aortic arch including the aneurysm. Evaluation This model was very helpful for the preoperative decision making and planning of the frozen elephant trunk procedure owing to the exact and lifelike illustration of the native aortic arch. Conclusions Three-dimensional models are helpful in preoperative planning and postoperative evaluation of frozen elephant trunk procedures in patients with complex aortic anatomy.
Development of meta-representations: Procedural metacognition and the relationship to Theory of Mind
Abstract:
Several studies have shown that metacognitive ability is crucial for children's success in school. Much less is known about the emergence of this ability and its relationship to other meta-representations such as Theory of Mind competencies. In recent years, a growing literature has suggested that metacognition and Theory of Mind could theoretically be assumed to belong to the same developmental concept. So far, however, only a few studies have provided empirical evidence that metacognition and Theory of Mind are related, and these studies focused on declarative metacognitive knowledge rather than on procedural metacognitive monitoring, as in the present study: N = 159 children were first tested shortly before making the transition to school (aged between 5 1/2 and 7 1/2 years) and again one year later, at the end of their first grade. Analyses suggest that there is in fact a significant relation between early metacognitive monitoring skills (procedural metacognition) and later Theory of Mind competencies. Notably, language seems to play a crucial role in this relationship. Our results thus bring new insights into the development of meta-representation and support the view that metacognition and Theory of Mind are indeed interrelated, although the precise mechanisms remain unclear.
Abstract:
OBJECTIVE Obtaining new details of radial motion of left ventricular (LV) segments using velocity-encoding cardiac MRI. METHODS Cardiac MR examinations were performed on 14 healthy volunteers aged between 19 and 26 years. Cine images for navigator-gated phase contrast velocity mapping were acquired using a black blood segmented k-space spoiled gradient echo sequence with a temporal resolution of 13.8 ms. Peak systolic and diastolic radial velocities as well as radial velocity curves were obtained for 16 ventricular segments. RESULTS Significant differences among peak radial velocities of basal and mid-ventricular segments have been recorded. Particular patterns of segmental radial velocity curves were also noted. An additional wave of outward radial movement during the phase of rapid ventricular filling, corresponding to the expected timing of the third heart sound, appeared of particular interest. CONCLUSION The technique has allowed visualization of new details of LV radial wall motion. In particular, higher peak systolic radial velocities of anterior and inferior segments are suggestive of a relatively higher dynamics of anteroposterior vs lateral radial motion in systole. Specific patterns of radial motion of other LV segments may provide additional insights into LV mechanics. ADVANCES IN KNOWLEDGE The outward radial movement of LV segments impacted by the blood flow during rapid ventricular filling provides a potential substrate for the third heart sound. A biphasic radial expansion of the basal anteroseptal segment in early diastole is likely to be related to the simultaneous longitudinal LV displacement by the stretched great vessels following repolarization and their close apposition to this segment.
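Extracting peak velocities and their timing from a segmental radial velocity curve is straightforward given the 13.8 ms temporal resolution; the sign convention used below (positive = systolic direction of radial motion) is an assumption for illustration, and the sample curve is invented:

```python
def peak_radial_velocities(curve, dt_ms=13.8):
    """Return peak systolic and diastolic radial velocities and their
    timing from a velocity-time curve sampled every dt_ms milliseconds.
    Assumes positive values for the systolic and negative values for the
    diastolic direction of radial motion (illustrative convention)."""
    peak_sys = max(curve)
    peak_dia = min(curve)
    return {
        "peak_systolic": peak_sys,
        "t_systolic_ms": curve.index(peak_sys) * dt_ms,
        "peak_diastolic": peak_dia,
        "t_diastolic_ms": curve.index(peak_dia) * dt_ms,
    }

# Invented sample curve (cm/s) for one segment over part of a cycle.
result = peak_radial_velocities([0, 2, 4, 3, 1, -1, -3, -2])
```

Comparing such per-segment peaks across the 16 ventricular segments is what reveals the basal-versus-mid-ventricular differences reported above.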
Abstract:
Quarks were introduced 50 years ago, opening the road towards our understanding of the elementary constituents of matter and their fundamental interactions. Since then, spectacular progress has been made, with important discoveries that led to the establishment of the Standard Theory, which accurately describes the basic constituents of observable matter, namely quarks and leptons, interacting through the exchange of three fundamental forces: the weak, electromagnetic and strong forces. Particle physics is now entering a new era driven by the quest to understand the composition of our Universe, including the unobservable (dark) matter, the hierarchy of masses and forces, the unification of all fundamental interactions with gravity in a consistent quantum framework, and several other important questions. A candidate theory providing answers to many of these questions is string theory, which replaces the notion of point particles by extended objects, such as closed and open strings. In this short note, I give a brief overview of string unification, describe in particular how quarks and leptons can emerge, and discuss possible predictions for particle physics and cosmology that could test these ideas.
Abstract:
Spurred by the consumer market, companies increasingly deploy smartphones or tablet computers in their operations. Unlike private users, however, companies typically struggle to cover their needs with existing applications, and therefore extend mobile software platforms through customized applications from multiple software vendors. Companies thereby combine the concepts of multi-sourcing and software platform ecosystems in a novel platform-based multi-sourcing setting. This, however, implies the clash of two different approaches to coordinating the underlying one-to-many inter-organizational relationships. So far, little is known about the impacts of merging coordination approaches. Relying on convention theory, we address this gap by analyzing a platform-based multi-sourcing project between a client and six software vendors that develop twenty-three custom-made applications on a common platform (Android). In doing so, we aim to understand how unequal coordination approaches merge, and whether and why particular coordination mechanisms, design decisions, or practices disappear while new ones emerge.