41 results for information processing model
Abstract:
The good news with regard to this (or any) chapter on the future of leadership is that there is one. There was a time when researchers called for a moratorium on new leadership theory and research (e.g., Miner, 1975), citing the uncertain future of the field. Then for a time there was a popular academic perspective that leadership did not really matter when it came to shaping organizational outcomes (Meindl & Ehrlich, 1987; Meindl, Ehrlich, & Dukerich, 1985; Pfeffer, 1977). That perspective was laid to rest by "realists" in the field (Day & Antonakis, 2012a) by means of empirical re-interpretation of the results used to support the position that leadership does not matter (Lieberson & O'Connor, 1972; Salancik & Pfeffer, 1977). Specifically, Day and Lord (1988) showed that when proper methodological concerns were addressed (e.g., controlling for industry and company size effects; incorporating appropriate time lags), the impact of top-level leadership was considerable - explaining as much as 45% of the variance in measures of organizational performance. Despite some recent pessimistic sentiments about the "curiously unformed" state of leadership research and theory (Hackman & Wageman, 2007), others have argued that the field has continued to evolve and is potentially on the threshold of some significant breakthroughs (Day & Antonakis, 2012a). Leadership scholars have been re-energized by new directions in the field, and research efforts have revitalized areas previously abandoned for apparent lack of consistency in findings (e.g., leadership trait theory). Our accumulated knowledge now allows us to explain the nature of leadership, including its biological bases and other antecedents as well as its consequences, with some degree of confidence. Other comprehensive sources review the extensive theoretical and empirical foundation of leadership (Bass, 2008; Day & Antonakis, 2012b), so that will not be the focus of the present chapter. Instead, we will take a future-oriented perspective in identifying particular areas within the leadership field that we believe offer promising perspectives on the future of leadership. Nonetheless, it is worthwhile as background to first provide an overview of how we see the leadership field having changed over the past decade or so. This short chronicle will set the stage for a keener understanding of where the future contributions are likely to emerge. Overall, across nine major schools of leadership - trait, behavioural, contingency, contextual, relational, sceptics, information processing, New Leadership, and biological/evolutionary - researchers have seen a resurgence of interest in one area, a high level of activity in at least four other areas, inactivity in three areas, and one area that was modestly active in the previous decade but that we think holds strong promise for the future (Gardner, Lowe, Moss, Mahoney, & Cogliser, 2010). We will next provide brief overviews of these nine schools and their respective levels of research activity (see Figure 1).
Abstract:
The combination of advanced neuroimaging techniques and major developments in complex network science has given birth to a new framework for studying the brain: "connectomics." This framework provides the ability to describe and study the brain as a dynamic network and to explore how the coordination and integration of information processing may occur. In recent years this framework has been used to investigate the developing brain and has shed light on many dynamic changes occurring from infancy through adulthood. The aim of this article is to review this work and to discuss what we have learned from it. We will also use this body of work to highlight key technical aspects that are necessary in general for successful connectome analysis using today's advanced neuroimaging techniques. We look to identify current limitations of such approaches, what can be improved, and how these points generalize to other topics in connectome research.
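To make the idea of "describing the brain as a network" more concrete, here is a minimal sketch of a typical connectome-style analysis: a region-by-region connectivity matrix is thresholded into a graph, and a few standard graph measures are computed. The matrix, the 90-region atlas size, and the 75th-percentile threshold are illustrative assumptions, not values taken from the article.

```python
import numpy as np
import networkx as nx

# Hypothetical structural connectivity matrix for 90 brain regions,
# e.g. streamline counts from diffusion-MRI tractography.
rng = np.random.default_rng(42)
conn = rng.poisson(2.0, size=(90, 90)).astype(float)
conn = np.triu(conn, 1) + np.triu(conn, 1).T        # symmetric, zero diagonal

# Keep only the strongest connections (a common, if debated, preprocessing step).
threshold = np.percentile(conn[conn > 0], 75)
adj = (conn >= threshold).astype(int)

# Describe the resulting network with a few standard graph measures.
G = nx.from_numpy_array(adj)
print("nodes:", G.number_of_nodes())
print("density:", round(nx.density(G), 3))
print("average clustering:", round(nx.average_clustering(G), 3))
print("mean degree:", round(sum(dict(G.degree()).values()) / G.number_of_nodes(), 1))
```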
Abstract:
Acquisition of a mature dendritic morphology is critical for neural information processing. In particular, hepatocyte growth factor (HGF) controls dendritic arborization during brain development. However, the cellular mechanisms underlying the effects of HGF on dendritic growth remain elusive. Here, we show that HGF increases dendritic length and branching of rat cortical neurons through activation of the mitogen-activated protein kinase (MAPK) signaling pathway. Activation of MAPK by HGF leads to the rapid and transient phosphorylation of cAMP response element-binding protein (CREB), a key step necessary for the control of dendritic development by HGF. In addition to CREB phosphorylation, regulation of dendritic growth by HGF requires the interaction between CREB and CREB-regulated transcription coactivator 1 (CRTC1), as expression of a mutated form of CREB unable to bind CRTC1 completely abolished the effects of HGF on dendritic morphology. Treatment of cortical neurons with HGF in combination with brain-derived neurotrophic factor (BDNF), a member of the neurotrophin family that regulates dendritic development via similar mechanisms, showed additive effects on MAPK activation, CREB phosphorylation and dendritic growth. Collectively, these results support the conclusion that regulation of cortical dendritic morphology by HGF is mediated by activation of the MAPK pathway, phosphorylation of CREB and interaction of CREB with CRTC1.
Abstract:
In this article we propose a novel method for calculating cardiac 3-D strain. The method requires the acquisition of myocardial short-axis (SA) slices only and produces the 3-D strain tensor at every point within every pair of slices. Three-dimensional displacement is calculated from SA slices using zHARP, which is then used for calculating the local displacement gradient and thus the local strain tensor. There are three main advantages of this method. First, the 3-D strain tensor is calculated for every pixel without interpolation; this is unprecedented in cardiac MR imaging. Second, this method is fast, in part because there is no need to acquire long-axis (LA) slices. Third, the method is accurate because the 3-D displacement components are acquired simultaneously, thereby reducing motion artifacts without the need for registration. This article presents the theory of computing 3-D strain from two slices using zHARP, the imaging protocol, and both phantom and in vivo validation.
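The final step described here (displacement gradient, then strain tensor) can be sketched in a few lines. The following is a minimal illustration, assuming a 3-D displacement field is already available on a regular voxel grid; the zHARP acquisition and phase processing that actually produce that field are not shown, and the array names and grid spacing are hypothetical.

```python
import numpy as np

def local_strain_tensor(u, spacing=(1.0, 1.0, 1.0)):
    """Green-Lagrange strain tensor at every voxel of a 3-D displacement field.

    u       : array of shape (3, nx, ny, nz) with the x, y, z displacement components.
    spacing : voxel size along x, y, z in the same units as u.
    Returns E with shape (nx, ny, nz, 3, 3).
    """
    # Local displacement gradient du_i/dx_j, estimated with central differences.
    grad = np.empty(u.shape[1:] + (3, 3))
    for i in range(3):          # displacement component
        for j in range(3):      # differentiation axis
            grad[..., i, j] = np.gradient(u[i], spacing[j], axis=j)

    # Deformation gradient F = I + grad(u), then E = (F^T F - I) / 2.
    F = grad + np.eye(3)
    E = 0.5 * (np.einsum('...ki,...kj->...ij', F, F) - np.eye(3))
    return E

# Hypothetical usage with a synthetic displacement field on a 32^3 grid
# (e.g. 1.5 mm in-plane resolution and 8 mm slice separation).
u = np.random.normal(scale=0.01, size=(3, 32, 32, 32))
E = local_strain_tensor(u, spacing=(1.5, 1.5, 8.0))
print(E.shape)  # (32, 32, 32, 3, 3)
```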
Abstract:
Graph theory has provided a key mathematical framework to analyse the architecture of human brain networks. This architecture embodies an inherently complex relationship between connection topology, the spatial arrangement of network elements, and the resulting network cost and functional performance. An exploration of these interacting factors and driving forces may reveal salient network features that are critically important for shaping and constraining the brain's topological organization and its evolvability. Several studies have pointed to an economic balance between network cost and network efficiency, with networks organized as an 'economical' small-world favouring high communication efficiency at low wiring cost. In this study, we define and explore a network morphospace in order to characterize different aspects of communication efficiency in human brain networks. Using a multi-objective evolutionary approach that approximates a Pareto-optimal set within the morphospace, we investigate the capacity of anatomical brain networks to evolve towards topologies that exhibit optimal information processing features while preserving network cost. This approach allows us to investigate network topologies that emerge under specific selection pressures, thus providing some insight into the selectional forces that may have shaped the network architecture of existing human brains.
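As a rough illustration of the cost-efficiency trade-off explored in such a morphospace, the sketch below scores a set of candidate network topologies by total wiring length (to be minimized) and global efficiency (to be maximized) and keeps the Pareto-optimal ones. The random candidate graphs simply stand in for the topologies generated by the multi-objective evolutionary search; node count, edge density, and coordinates are arbitrary assumptions.

```python
import numpy as np

def global_efficiency(adj):
    """Average inverse shortest-path length of a binary undirected network."""
    n = adj.shape[0]
    dist = np.where(adj > 0, 1.0, np.inf)
    np.fill_diagonal(dist, 0.0)
    for k in range(n):                                  # Floyd-Warshall shortest paths
        dist = np.minimum(dist, dist[:, k, None] + dist[None, k, :])
    off_diag = ~np.eye(n, dtype=bool)
    return np.mean(1.0 / dist[off_diag])                # unreachable pairs contribute 0

def wiring_cost(adj, coords):
    """Total Euclidean length of all edges, a simple proxy for wiring cost."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    return (adj * d).sum() / 2.0                        # each undirected edge counted once

def pareto_front(scores):
    """Indices of candidates not dominated on (cost to minimize, efficiency to maximize)."""
    front = []
    for i, (c_i, e_i) in enumerate(scores):
        dominated = any(c <= c_i and e >= e_i and (c, e) != (c_i, e_i)
                        for j, (c, e) in enumerate(scores) if j != i)
        if not dominated:
            front.append(i)
    return front

# Hypothetical candidate topologies: random graphs over fixed node positions.
rng = np.random.default_rng(0)
coords = rng.uniform(size=(30, 3))                      # 30 nodes in a unit cube
scores = []
for _ in range(200):
    adj = (rng.uniform(size=(30, 30)) < 0.15).astype(float)
    adj = np.triu(adj, 1)
    adj = adj + adj.T                                   # symmetric, no self-connections
    scores.append((wiring_cost(adj, coords), global_efficiency(adj)))

front = pareto_front(scores)
print(f"{len(front)} of {len(scores)} candidate networks are Pareto-optimal")
```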
Abstract:
Following recent calls to position forensic scientists within the criminal justice system, as well as within policing and intelligence missions, this paper emphasizes the need for the development of educational and training programs in the area of forensic intelligence. It is argued that an imbalance exists between perceived and actual understanding of forensic intelligence by police and forensic science managers, and that this imbalance can only be overcome through education. The challenge for forensic intelligence education and training is therefore to devise programs that increase forensic intelligence awareness, first among managers, to help prevent poor decisions about how to develop information processing. Two recent European courses are presented as examples of education offerings, along with lessons learned and suggested paths forward. It is concluded that the new focus on forensic intelligence could restore a proactive approach to forensic science, better quantify its efficiency, and allow it to become more involved in investigative and managerial decisions. A new educational challenge is opened to forensic science university programs around the world: to refocus criminal trace analysis on a more holistic security problem-solving approach.
Abstract:
The key information processing units within gene regulatory networks are enhancers. Enhancer activity is associated with the production of tissue-specific noncoding RNAs, yet the existence of such transcripts during cardiac development has not been established. Using an integrated genomic approach, we demonstrate that fetal cardiac enhancers generate long noncoding RNAs (lncRNAs) during cardiac differentiation and morphogenesis. Enhancer expression correlates with the emergence of active enhancer chromatin states, the initiation of RNA polymerase II at enhancer loci and expression of target genes. Orthologous human sequences are also transcribed in fetal human hearts and cardiac progenitor cells. Through a systematic bioinformatic analysis, we identified and characterized, for the first time, a catalog of lncRNAs that are expressed during embryonic stem cell differentiation into cardiomyocytes and associated with active cardiac enhancer sequences. RNA-sequencing demonstrates that many of these transcripts are polyadenylated, multi-exonic long noncoding RNAs. Moreover, knockdown of two enhancer-associated lncRNAs resulted in the specific downregulation of their predicted target genes. Interestingly, the reactivation of the fetal gene program, a hallmark of the stress response in the adult heart, is accompanied by increased expression of fetal cardiac enhancer transcripts. Altogether, these findings demonstrate that the activity of cardiac enhancers and expression of their target genes are associated with the production of enhancer-derived lncRNAs.
Abstract:
BACKGROUND: In a high proportion of patients with favorable outcome after aneurysmal subarachnoid hemorrhage (aSAH), neuropsychological deficits, depression, anxiety, and fatigue are responsible for the inability to return to their regular premorbid life and pursue their professional careers. These problems often remain unrecognized, as no recommendations concerning a standardized comprehensive assessment have yet found entry into clinical routines. METHODS: To establish a nationwide standard for comprehensive assessment after aSAH, representatives of all neuropsychological and neurosurgical departments of the eight Swiss centers treating acute aSAH agreed on a common protocol. In addition, a battery of questionnaires and neuropsychological tests was selected that is optimally suited to the deficits found to be most prevalent in aSAH patients and that is standardized and available in different languages. RESULTS: We propose a baseline inpatient neuropsychological screening using the Montreal Cognitive Assessment (MoCA) between days 14 and 28 after aSAH. In an outpatient setting at 3 and 12 months after the bleeding, we recommend a neuropsychological examination testing all relevant domains, including attention, speed of information processing, executive functions, verbal and visual learning/memory, language, visuo-perceptual abilities, and premorbid intelligence. In addition, a detailed assessment capturing anxiety, depression, fatigue, symptoms of frontal lobe affection, and quality of life should be performed. CONCLUSIONS: This standardized neuropsychological assessment will lead to a more comprehensive evaluation of the patient, facilitate the detection and subsequent treatment of previously unrecognized but relevant impairments, and help to determine the incidence, characteristics, modifiable risk factors, and clinical course of these impairments after aSAH.
Abstract:
Regulation has in many cases been delegated to independent agencies, which has led to the question of how the democratic accountability of these agencies is ensured. There are few empirical approaches to agency accountability. We offer such an approach, resting upon three propositions. First, we scrutinize agency accountability both de jure (accountability is ensured by formal rights of accountability 'fora' to receive information and impose consequences) and de facto (the capability of fora to use these rights depends on resources and decision costs that affect the credibility of their sanctioning capacity). Second, accountability must be evaluated separately at the political, operational, and managerial levels. Third, at each level accountability is enacted by a system of several (partially) interdependent fora, which together form an accountability regime. The proposed framework is applied to the case of the German Bundesnetzagentur's accountability regime, which demonstrates its suitability for empirical purposes. Regulatory agencies are often considered independent, yet accountable. This article provides a realistic framework for the study of accountability 'regimes' in which they are embedded. It emphasizes the need to identify the various actors (accountability fora) to which agencies are formally accountable (parliamentary committees, auditing bodies, courts, and so on) and to consider possible relationships between them. It argues that formal accountability 'on paper', as defined in official documents, does not fully account for de facto accountability, which depends on the resources possessed by the fora (mainly information-processing and decision-making capacities) and the credibility of their sanctioning capacities. The article applies this framework to the German Bundesnetzagentur.
Abstract:
Characterizing the geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Indeed, detailed 3D data, such as LiDAR point clouds, allow the hazard processes and the structure of geological features to be studied accurately, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential for application to a wide range of geological investigations, both in research and in applied projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral/hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies as well as failure mechanisms and stability conditions by integrating detailed remotely sensed data. During the past ten years several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea-to-Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to improve the forecasting of potential future landslides, it is crucial to understand the evolution of rock slope stability. Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and generating hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute the failure mechanisms on terrestrial point clouds in order to assess the susceptibility to rockfalls at the cliff scale (an illustrative sketch of this kind of computation follows this abstract). Similar procedures were already available to evaluate the susceptibility to rockfalls based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The computed most probable rockfall source areas in the granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared to inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly intense rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its significant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods.
Limit equilibrium models have been applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling. In particular, I conducted a back-analysis of the large rockfall event of 2005 (265'000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique to obtain vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of unstable rock compartments. Finally, the following points summarize the main outputs of my research. The new model to compute failure mechanisms and rockfall susceptibility with 3D point clouds makes it possible to accurately define the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed fracture measurements into geomechanical models of rock mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then to use this information to model complex geological structures. The integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
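To give a concrete flavour of the kind of point-cloud computation described above (a generic illustration, not the thesis's actual model), the sketch below estimates the local slope orientation at each point of a terrestrial point cloud by plane fitting (PCA over nearest neighbours) and then applies a simple kinematic test for planar sliding on a single discontinuity set: the joint must dip less steeply than the local slope, more steeply than the friction angle, and roughly in the slope's dip direction. The synthetic cliff, the joint set orientation, and the friction angle are all assumed values.

```python
import numpy as np
from scipy.spatial import cKDTree

def local_slope_orientation(points, k=20):
    """Dip and dip direction (degrees) of the local surface around each point,
    estimated by fitting a plane (PCA) to the k nearest neighbours.
    Coordinate convention: x = East, y = North, z = up."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    dip = np.empty(len(points))
    dipdir = np.empty(len(points))
    for i, nb in enumerate(idx):
        centred = points[nb] - points[nb].mean(axis=0)
        # The local normal is the eigenvector with the smallest eigenvalue.
        _, vecs = np.linalg.eigh(centred.T @ centred)
        n = vecs[:, 0]
        if n[2] < 0:                                   # orient the normal upwards
            n = -n
        dip[i] = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
        dipdir[i] = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip, dipdir

def planar_sliding_possible(slope_dip, slope_dipdir, joint_dip, joint_dipdir,
                            friction_angle=30.0, tol=30.0):
    """Markland-style kinematic test for planar sliding on one discontinuity set."""
    daylights = joint_dip < slope_dip                  # joint emerges from the face
    steep_enough = joint_dip > friction_angle
    aligned = np.abs((slope_dipdir - joint_dipdir + 180.0) % 360.0 - 180.0) < tol
    return daylights & steep_enough & aligned

# Hypothetical usage: a synthetic, roughly planar cliff face dipping ~70 deg toward SE.
rng = np.random.default_rng(1)
uv = rng.uniform(0.0, 50.0, size=(5000, 2))            # in-plane coordinates (m)
a, d = np.radians(135.0), np.radians(70.0)             # dip direction and dip of the face
dip_vec = np.array([np.sin(a) * np.cos(d), np.cos(a) * np.cos(d), -np.sin(d)])
strike = np.array([np.cos(a), -np.sin(a), 0.0])
pts = uv[:, :1] * strike + uv[:, 1:] * dip_vec + rng.normal(scale=0.05, size=(5000, 3))

slope_dip, slope_dipdir = local_slope_orientation(pts)
flags = planar_sliding_possible(slope_dip, slope_dipdir, joint_dip=55.0, joint_dipdir=135.0)
print(f"{flags.mean():.1%} of points flagged as kinematically susceptible to planar sliding")
```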
Abstract:
The adult hippocampus generates functional dentate granule cells (GCs) that release glutamate onto target cells in the hilus and cornu ammonis 3 (CA3) region, and receive glutamatergic and γ-aminobutyric acid (GABA)ergic inputs that tightly control their spiking activity. The slow and sequential development of their excitatory and inhibitory inputs makes them particularly relevant for information processing. Although they are still immature, new neurons are recruited by afferent activity and display increased excitability, enhanced activity-dependent plasticity of their input and output connections, and a high rate of synaptogenesis. Once fully mature, new GCs show all the hallmarks of neurons generated during development. In this review, we focus on how developing neurons remodel the adult dentate gyrus and discuss key aspects that illustrate the potential of neurogenesis as a mechanism for circuit plasticity and function.