908 results for ACTOR-NETWORK THEORY
Abstract:
The thesis investigates whether, given free news production, people who post information on collaborative content sites, known as interactors, tend to reproduce information that was already scheduled for TV news. This study compares the collaborative content vehicles Vc reporter, Vc no G1 and Eu reporter with the TV newscasts SBT Brasil, Jornal Nacional, Jornal da Record and Jornal da Band. We sought to determine whether those newscasts set the agenda for the collaborative platforms. The hypothesis assumes that Brazilian TV newscasts have built a relationship of credibility with the viewer over time, so it is plausible that interactors apply the same selection criteria as the broadcasts and reproduce similar information on collaborative content sites. The method used was content analysis, based on the work of Laurence Bardin, and the research was quantitative. The research concluded that, within the small portion of the universe surveyed, the agenda of television newscasts does carry over into the collaborative content.
Abstract:
The Biomolecular Interaction Network Database (BIND; http://binddb.org) is a database designed to store full descriptions of interactions, molecular complexes and pathways. Development of the BIND 2.0 data model has led to the incorporation of virtually all components of molecular mechanisms including interactions between any two molecules composed of proteins, nucleic acids and small molecules. Chemical reactions, photochemical activation and conformational changes can also be described. Everything from small molecule biochemistry to signal transduction is abstracted in such a way that graph theory methods may be applied for data mining. The database can be used to study networks of interactions, to map pathways across taxonomic branches and to generate information for kinetic simulations. BIND anticipates the coming large influx of interaction information from high-throughput proteomics efforts including detailed information about post-translational modifications from mass spectrometry. Version 2.0 of the BIND data model is discussed as well as implementation, content and the open nature of the BIND project. The BIND data specification is available as ASN.1 and XML DTD.
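The abstract's point that interaction data is abstracted so that graph-theory methods can be applied for data mining can be illustrated with a minimal sketch: molecules as nodes, binary interactions as edges, and breadth-first search to recover a pathway between two molecules. The interaction records below are hypothetical toy data for illustration, not actual BIND entries or the BIND API.

```python
from collections import defaultdict, deque

# Hypothetical toy interaction records (NOT actual BIND entries): each edge
# links two molecules (protein, nucleic acid, or small molecule).
interactions = [
    ("EGF", "EGFR"), ("EGFR", "GRB2"), ("GRB2", "SOS"),
    ("SOS", "RAS"), ("RAS", "RAF"), ("RAF", "MEK"), ("MEK", "ERK"),
]

# Build an undirected interaction graph.
graph = defaultdict(set)
for a, b in interactions:
    graph[a].add(b)
    graph[b].add(a)

def shortest_path(src, dst):
    """Breadth-first search: one simple graph-theory method for pathway mining."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nxt in graph[node]:
            if nxt not in prev:
                prev[nxt] = node
                queue.append(nxt)
    return None

# Recover the (toy) signalling pathway from ligand to kinase.
print(shortest_path("EGF", "ERK"))
# → ['EGF', 'EGFR', 'GRB2', 'SOS', 'RAS', 'RAF', 'MEK', 'ERK']
```

The same graph structure supports the uses the abstract lists, such as mapping pathways across taxonomic branches, by attaching taxon or modification metadata to nodes and edges.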
Abstract:
Visual classification is the way we relate to different images in our environment as if they were the same, while relating differently to other collections of stimuli (e.g., human vs. animal faces). It is still not clear, however, how the brain forms such classes, especially when introduced to new or changing environments. To isolate a perception-based mechanism underlying class representation, we studied unsupervised classification of an incoming stream of simple images. Classification patterns were clearly affected by the stimulus frequency distribution, although subjects were unaware of this distribution. There was a common bias to locate class centers near the most frequent stimuli and class boundaries near the least frequent stimuli. Responses were also faster for more frequent stimuli. Using a minimal, biologically based neural-network model, we demonstrate that a simple, self-organizing representation mechanism based on overlapping tuning curves and slow Hebbian learning suffices to ensure classification. Combined behavioral and theoretical results predict large tuning overlap, implicating posterior infero-temporal cortex as a possible site of classification.
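The kind of mechanism described above can be sketched minimally (all parameter values assumed; a winner-take-all competitive drift is used here as a stand-in for the paper's slow Hebbian rule): units with broad, overlapping Gaussian tuning curves classify an unsupervised stimulus stream, and the class centers migrate toward the most frequent stimuli.

```python
import numpy as np

rng = np.random.default_rng(0)

# Five units with overlapping Gaussian tuning curves on a 1-D stimulus axis.
centers = np.linspace(0.0, 1.0, 5)   # initial preferred stimuli (assumed layout)
width = 0.3                          # broad curves -> large tuning overlap
eta = 0.02                           # slow learning rate

def responses(s):
    """Graded population response to stimulus s."""
    return np.exp(-((s - centers) ** 2) / (2 * width ** 2))

# Skewed, unsupervised stimulus stream: values near 0.2 are most frequent.
stream = rng.beta(2, 5, size=5000)

for s in stream:
    winner = int(np.argmax(responses(s)))            # unit that classifies s
    centers[winner] += eta * (s - centers[winner])   # slow drift toward s

# Class centers end up concentrated near the most frequent stimuli.
print(np.round(np.sort(centers), 2))
```

Because updates pull each winning unit toward the stimuli it wins, the density of class centers comes to reflect the stimulus frequency distribution, reproducing the behavioral bias reported in the abstract.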
Abstract:
The role of intrinsic cortical connections in processing sensory input and in generating behavioral output is poorly understood. We have examined this issue in the context of the tuning of neuronal responses in cortex to the orientation of a visual stimulus. We analytically study a simple network model that incorporates both orientation-selective input from the lateral geniculate nucleus and orientation-specific cortical interactions. Depending on the model parameters, the network exhibits orientation selectivity that originates from within the cortex, by a symmetry-breaking mechanism. In this case, the width of the orientation tuning can be sharp even if the lateral geniculate nucleus inputs are only weakly anisotropic. By using our model, several experimental consequences of this cortical mechanism of orientation tuning are derived. The tuning width is relatively independent of the contrast and angular anisotropy of the visual stimulus. The transient population response to changing of the stimulus orientation exhibits a slow "virtual rotation." Neuronal cross-correlations exhibit long time tails, the sign of which depends on the preferred orientations of the cells and the stimulus orientation.
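The network analysed above can be caricatured numerically as a rate model on a ring of orientation-preferring units (parameter values assumed for illustration; this is not the paper's exact formulation): weakly anisotropic feedforward input combined with orientation-specific cortical interactions produces a response far sharper than the input, via the symmetry-breaking mechanism the abstract describes.

```python
import numpy as np

N = 180
theta = np.linspace(-np.pi / 2, np.pi / 2, N, endpoint=False)  # preferred orientations
theta0 = 0.0      # stimulus orientation
eps = 0.1         # weak anisotropy of the geniculate input
J0, J2 = 0.5, 3.0  # uniform inhibition and orientation-specific excitation (assumed)

# Weakly tuned feedforward input: never zero, only 10% modulated.
h = 1.0 - eps + eps * np.cos(2 * (theta - theta0))

# Cortical interactions depend on the difference in preferred orientation.
J = (-J0 + J2 * np.cos(2 * (theta[:, None] - theta[None, :]))) / N

# Rectified rate dynamics, integrated to a fixed point with forward Euler.
r = np.zeros(N)
dt = 0.1
for _ in range(3000):
    r += dt * (-r + np.maximum(0.0, h + J @ r))

# The network response peaks at theta0 and silences the flanks entirely,
# even though the input is broadly positive everywhere.
silent = int(np.sum(r < 1e-6))
print(theta[np.argmax(r)], silent)
```

The sharp bump despite the weakly anisotropic input is the signature of the cortical (rather than feedforward) origin of orientation selectivity discussed in the abstract.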
Abstract:
This project attempts to answer the question "What holds the construction of money together?" by asserting that it is money's religious nature which provides the moral compulsion for people to use, and continue to uphold, money as a socially constructed concept. This project is primarily descriptive and focuses on the religious nature of money by employing a sociological theory of religion in viewing money as a technical concept. This is an interdisciplinary work between religious studies, economics, and sociology and draws heavily from Emile Durkheim's 'The Elementary Forms of Religious Life' as well as work related to heterodox theories of money developed by Geoffrey Ingham, A. Mitchell Innes, and David Graeber. Two new concepts are developed: the idea of monetary sacrality and monetary effervescence, both of which serve to recharge the religious saliency of money. By developing the concept of monetary sacrality, this project shows how money acts to interpret our economic relations while also obfuscating complex power dynamics in society, making them seem naturally occurring and unchangeable. The project also shows how our contemporary fractional reserve banking system contributes to money's collective effervescence and serves to animate economic acting within a monetary network. The project concludes by outlining multiple implications for religious studies, economics, sociology, and central banking.
Abstract:
The groundbreaking scope of the Economic Partnership Agreement (EPA) between the European Union (EU) and Cariforum (CF) irrefutably marks a substantive shift in trade relations between the regions and also has far-reaching implications across several sectors and levels. Supplementing the framework of analysis of Structural Foreign Policy (SFP) with neo-Gramscian theory allows for a thorough investigation into the details of structural embeddedness based on the EU's historic directionality towards the Caribbean region; notably, encouraging integration into the global capitalist economy by adapting to and adopting the ideals of neoliberal economics. Whilst the Caribbean – as the first and only signatory of a ‘full’ EPA – may be considered the case par excellence of the success of the EPAs, this paper demonstrates that there is no cause-effect relationship between the singular case of the ‘full’ CF-EU EPA and the success of the EPA policy towards the ACP in general. The research detailed throughout this paper responds to two SFP-based questions: (1) To what extent is the EPA a SFP tool aimed at influencing and shaping the structures in the Caribbean? (2) To what extent is the internalisation of this process reflective of the EU as a hegemonic SFP actor vis-à-vis the Caribbean? This paper affirms both the role of the EU as a hegemonic SFP actor and the EPA as a hegemonic SFP tool. Research into the negotiation, agreement and controversy that surrounds every stage of the EPA confirmed that through modern diplomacy and an evolution in relations, consensus is at the fore of contemporary EU-Caribbean relations. Whilst at once dealing with the singular case of the Caribbean, the author offers a nuanced approach beyond 'EU navel-gazing' by incorporating an ‘outside-in’ perspective, which thereafter could be applied to EU-ACP relations and the North-South dialogue in general.
Abstract:
Network governance of collective learning processes is an essential approach to sustainable development. The first section of the article briefly refers to recent theories about both market and government failures that express scepticism about the way framework conditions for market actors are set. For this reason, the development of networks for collective learning processes seems advantageous if new solutions are to be developed in policy areas concerned with long-term changes and a stepwise internalisation of externalities. With regard to corporate actors’ interests, the article draws on recent insights from theories of the knowledge-based firm, where the creation of new knowledge is based on the absorption of societal views. This concept shifts the focus towards knowledge generation as an essential element in the evolution of sustainable markets, which in turn involves the development of new policies. In this context innovation-inducing regulation is suggested and discussed. The evolution of the Swedish, German and Dutch wind turbine industries is analysed based on the approach to governance put forward in this article. We conclude that these coevolutionary mechanisms may take for granted some of the stabilising and orientating functions previously exercised by basic regulatory activities of the state. In this context, the main function of governments is to facilitate learning processes, a departure from the government functions suggested by welfare economics.
Abstract:
Vita.
Abstract:
"Grant no. US NSF MCS75-21758."
Abstract:
Bibliography: p. 25-28.
Abstract:
"Supported in part by the National Science Foundation under grant no. NSF GJ 28289."
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
In this paper, we present the results of the prediction of the high-pressure adsorption equilibrium of supercritical gases (Ar, N-2, CH4, and CO2) on various activated carbons (BPL, PCB, and Norit R1 extra) at various temperatures using a density-functional-theory-based finite wall thickness (FWT) model. Pore size distribution results for the carbons are taken from our previous work,1,2 which used this approach for characterization. To validate the model, isotherms calculated from the density functional theory (DFT) approach are comprehensively verified against those determined by grand canonical Monte Carlo (GCMC) simulation, before the theoretical adsorption isotherms of the investigated carbons calculated by the model are compared with the experimental adsorption measurements. We illustrate the accuracy and consistency of the FWT model for the prediction of adsorption isotherms of all the investigated gases. The pore network connectivity problem occurring in the examined carbons is also discussed, and on the basis of the success of the predictions assuming a similar pore size distribution for accessible and inaccessible regions, it is suggested that this is largely related to the disordered nature of the carbon.
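The prediction step described above rests on the adsorption integral: the composite isotherm of a carbon is the pore-size-distribution-weighted sum of single-pore (local) isotherms. The sketch below uses a made-up Langmuir-like kernel in place of a real DFT kernel, and hypothetical pore widths and fractions, purely to show the weighting.

```python
import numpy as np

# Toy stand-in for a DFT kernel rho(P, H): adsorbed density in a slit pore of
# width H at pressure P. A real FWT calculation derives this kernel from
# density functional theory; the Langmuir-like form here is only illustrative.
def local_isotherm(P, H):
    K = np.exp(2.0 / H)     # stronger adsorption in narrower pores (assumed)
    rho_max = 30.0 / H      # hypothetical per-width saturation density
    return rho_max * K * P / (1.0 + K * P)

# Hypothetical pore size distribution from characterization:
# slit widths (nm) and their pore-volume fractions.
pore_widths = np.array([0.7, 1.0, 1.5, 2.0, 3.0])
fractions = np.array([0.1, 0.3, 0.3, 0.2, 0.1])

def predicted_isotherm(P):
    """Adsorption integral: PSD-weighted sum of single-pore isotherms."""
    return float(np.sum(fractions * local_isotherm(P, pore_widths)))

# Composite isotherm at a few pressures (arbitrary units).
pressures = [0.1, 1.0, 10.0]
amounts = [predicted_isotherm(P) for P in pressures]
print(amounts)
```

With a validated kernel, the same weighted sum is what lets a pore size distribution fitted to one gas predict the isotherms of the others, which is the consistency test the abstract reports.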
Abstract:
The theoretical impacts of anthropogenic habitat degradation on genetic resources have been well articulated. Here we use a simulation approach to assess the magnitude of expected genetic change, and review 31 studies of 23 neotropical tree species to assess whether empirical case studies conform to theory. Major differences in the sensitivity of measures to detect the genetic health of degraded populations were obvious. Most studies employing genetic diversity (nine out of 13) found no significant consequences, yet most that assessed progeny inbreeding (six out of eight), reproductive output (seven out of 10) and fitness (all six) highlighted significant impacts. These observations are in line with theory, where inbreeding is observed immediately following impact, but genetic diversity is lost slowly over subsequent generations, which for trees may take decades. Studies also highlight the ecological, not just genetic, consequences of habitat degradation, which can reduce seed set and progeny fitness. Unexpectedly, two studies examining pollen flow using paternity analysis highlight an extensive network of gene flow at smaller spatial scales (less than 10 km). Gene flow can thus mitigate the loss of genetic diversity and assist long-term population viability, even in degraded landscapes. Unfortunately, the surveyed studies were too few and heterogeneous to examine concepts of population size thresholds and genetic resilience in relation to life history. Future suggested research priorities include undertaking integrated studies on a range of species in the same landscapes; better documentation of the extent and duration of impact; and most importantly, combining neutral marker, pollination dynamics, ecological consequences, and progeny fitness assessment within single studies.
Abstract:
Consider a network of unreliable links, modelling, for example, a communication network. Estimating the reliability of the network (expressed as the probability that certain nodes in the network are connected) is a computationally difficult task. In this paper we study how the Cross-Entropy method can be used to obtain more efficient network reliability estimation procedures. Three techniques of estimation are considered: Crude Monte Carlo and the more sophisticated Permutation Monte Carlo and Merge Process. We show that the Cross-Entropy method yields a speed-up over all three techniques.
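The Crude Monte Carlo baseline that the Cross-Entropy method accelerates can be sketched directly (the five-link "bridge" topology below is an assumed example, not one of the paper's test networks): sample each link's up/down state independently, then check whether the two terminal nodes remain connected.

```python
import random

def crude_mc_reliability(edges, p_up, s, t, n_samples=20000, seed=1):
    """Crude Monte Carlo estimate of P(s and t are connected),
    with each link independently up with probability p_up."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        # Sample the state of every link.
        up = [e for e in edges if rng.random() < p_up]
        # Depth-first search over the surviving links.
        adj = {}
        for a, b in up:
            adj.setdefault(a, []).append(b)
            adj.setdefault(b, []).append(a)
        seen, stack = {s}, [s]
        while stack:
            v = stack.pop()
            for w in adj.get(v, []):
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        hits += t in seen
    return hits / n_samples

# Assumed example: the classic 4-node, 5-link bridge network, terminals 0 and 3.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
est = crude_mc_reliability(edges, p_up=0.9, s=0, t=3)
print(est)
```

With highly reliable links, network failure is a rare event and the crude estimator needs very many samples to see one; importance-sampling schemes such as the Cross-Entropy method tilt the sampling distribution toward failures, which is the source of the speed-up the abstract reports.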