862 results for Information and communication
Abstract:
Brain function is critically dependent on ionic homeostasis in both the extra- and intracellular compartments. The regulation of brain extracellular ionic composition relies mainly on active transport at the blood-brain and blood-cerebrospinal fluid interfaces, whereas intracellular ion regulation is based on the plasmalemmal transporters of neurons and glia. In addition, the latter mechanisms can generate physiologically as well as pathophysiologically significant extracellular ion transients. In this work I have studied the molecular mechanisms and development of ion regulation, and how these factors alter neuronal excitability and affect synaptic and non-synaptic transmission, with a particular emphasis on intracellular pH and chloride (Cl-) regulation. Why is the regulation of acid-base equivalents (H+ and HCO3-) and Cl- of such interest and importance? First of all, GABAA receptors are permeable to both HCO3- and Cl-. In the adult mammalian central nervous system (CNS), fast postsynaptic inhibition relies on GABAA-receptor mediated transmission. Today, excitatory effects of GABAA receptors, both in mature neurons and during early development, have been recognized, and the significance of the dual actions of GABA on neuronal communication has become an interesting field of research. The transmembrane gradients of Cl- and HCO3- determine the reversal potential of GABAA-receptor mediated postsynaptic potentials, and hence the function of pH and Cl- regulatory proteins has profound consequences for GABAergic signaling and neuronal excitability. Secondly, perturbations in pH can cause a variety of changes in cellular function, many of them resulting from the interaction of protons with the ionizable side chains of proteins. pH-mediated alterations of protein conformation in e.g. ion channels, transporters, and enzymes can powerfully modulate neurotransmission. In the context of pH homeostasis, the enzyme carbonic anhydrase (CA) needs to be taken into account in parallel with ion transporters: for CO2/HCO3- buffering to act in a fast manner, CO2 (de)hydration must be catalyzed by this enzyme. The acid-base equivalents that serve as substrates in the CO2 dehydration-hydration reaction are also engaged in many carrier- and channel-mediated ion movements. In such processes, CA activity is in a key position to modulate transmembrane solute fluxes and their consequences. The bicarbonate transporters (BTs; SLC4) and the electroneutral cation-chloride cotransporters (CCCs; SLC12) belong to the large gene family of solute carriers (SLCs). In my work I have studied the physiological roles of the K+-Cl- cotransporter KCC2 (Slc12a5) and the Na+-driven Cl--HCO3- exchanger NCBE (Slc4a10), and the roles of these two ion transporters in the modulation of neuronal communication and excitability in the rodent hippocampus. I have also examined the cellular localization and molecular basis of the intracellular CA that has been shown to be essential for the generation of prolonged GABAergic excitation in the mature hippocampus. The results in my thesis provide direct evidence for the view that the postnatal up-regulation of KCC2 accounts for the developmental shift from depolarizing to hyperpolarizing postsynaptic EGABA-A responses in rat hippocampal pyramidal neurons. The results also indicate that, after the onset of KCC2 expression, the developmental emergence of excitatory GABAergic transmission upon intense GABAA-receptor stimulation depends on the expression of intrapyramidal CA, identified as the isoform CA VII.
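To make the reversal-potential point concrete, the dependence of EGABA-A on the two anion gradients can be written in a standard Goldman-Hodgkin-Katz form for a channel permeable to Cl- and HCO3- (a textbook expression, not quoted from the thesis itself):

```latex
% Reversal potential of a GABA_A receptor permeable to Cl- and HCO3-.
% For a channel permeable only to anions, the GHK voltage equation has
% the in/out concentrations inverted relative to the cation form:
\[
  E_{\mathrm{GABA_A}} \;=\; \frac{RT}{F}\,
  \ln\!\left(
    \frac{P_{\mathrm{Cl}}\,[\mathrm{Cl^-}]_i + P_{\mathrm{HCO_3}}\,[\mathrm{HCO_3^-}]_i}
         {P_{\mathrm{Cl}}\,[\mathrm{Cl^-}]_o + P_{\mathrm{HCO_3}}\,[\mathrm{HCO_3^-}]_o}
  \right)
\]
% Because intracellular HCO3- is kept relatively high by pH regulation,
% the HCO3- term pulls E_GABA-A toward depolarized values, while the
% low [Cl-]_i maintained by KCC2 pulls it toward hyperpolarized values.
```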
Studies on mice with targeted Slc4a10 gene disruption revealed an important role for NCBE in neuronal pH regulation and in the pH-dependent modulation of neuronal excitability. Furthermore, this ion transporter is involved in basolateral Na+ and HCO3- uptake in choroid plexus epithelial cells, and is thus likely to contribute to cerebrospinal fluid production.
Abstract:
With the level of digital disruption affecting businesses around the globe, you might expect high levels of Governance of Enterprise Information and Technology (GEIT) capability within boards. Boards and their senior executives know technology is important: more than 90% of boards and senior executives currently identify technology as essential to their current businesses and to their organization's future. But as few as 16% have sufficient GEIT capability. The Global Centre for Digital Business Transformation's recent research contains strong indicators of the need for change. Despite board awareness of both the likelihood and impact of digital disruption, things digital are still not viewed as a board-level matter in 45% of companies. And it's not just the board: the lack of board attention to technology can be mirrored at the senior executive level as well. When asked about their organization's attitude towards digital disruption, 43% of executives said their business either did not recognise it as a priority or was not responding appropriately. A further 32% were taking a "follower" approach, a potentially risky move as we will explain. Given all the evidence that boards know information and technology (I&T) is vital, and that they understand the inevitability, impact and speed of digital change and disruption, why are so many boards dragging their heels? Ignoring I&T disruption and refusing to build capability at board level is nothing short of negligence. Too many boards risk flying blind without GEIT capability [2]. To help build decision quality and I&T governance capability, this research:
• Confirms a pressing need to build individual competency and cumulative, across-board capability in governing I&T
• Identifies six factors that have rapidly increased the need, risk and urgency
• Finds that boards may risk not meeting their duty-of-care responsibilities when it comes to I&T oversight
• Highlights barriers to building capability
• Details three GEIT competencies that boards and executives can use for evaluation, selection, recruitment and professional development.
Abstract:
This study offers a reconstruction and critical evaluation of globalization theory, a perspective that has been central to sociology and cultural studies in recent decades, from the viewpoint of media and communications. As the study shows, sociological and cultural globalization theorists rely heavily on arguments concerning media and communications, especially the so-called new information and communication technologies, in the construction of their frameworks. Together with deepening the understanding of globalization theory, the study gives new critical knowledge of the problematic consequences that follow from such strong investment in media and communications in contemporary theory. The book is divided into four parts. The first part presents the research problem, the approach and the theoretical contexts of the study. Following the introduction in Chapter 1, I identify the core elements of globalization theory in Chapter 2. At the heart of globalization theory is the claim that recent decades have witnessed massive changes in the spatio-temporal constitution of society, caused by new media and communications in particular, and that these changes necessitate the rethinking of the foundations of social theory as a whole. Chapter 3 introduces three paradigms of media research (the political economy of media, cultural studies and medium theory), the discussion of which will make it easier to understand the key issues and controversies that emerge in academic globalization theorists' treatment of media and communications. The next two parts offer a close reading of four theorists whose works I use as entry points into academic debates on globalization. I argue that we can make sense of mainstream positions on globalization by dividing them into two paradigms: on the one hand, media-technological explanations of globalization and, on the other, cultural globalization theory. As examples of the former, I discuss the works of Manuel Castells (Chapter 4) and Scott Lash (Chapter 5). I maintain that their analyses of globalization processes are overtly media-centric and result in an unhistorical and uncritical understanding of social power in an era of capitalist globalization. A related evaluation of the second paradigm (cultural globalization theory), as exemplified by Arjun Appadurai and John Tomlinson, is presented in Chapter 6. I argue that, owing to their rejection of the importance of nation states and of the notion of cultural imperialism for cultural analysis, and their replacement with a framework of media-generated deterritorializations and flows, these theorists underplay the importance of the neoliberalization of cultures throughout the world. The fourth part (Chapter 7) presents a central research finding of this study, namely that the media-centrism of globalization theory can be understood in the context of the emergence of neoliberalism. I find it problematic that, at the same time as capitalist dynamics have been strengthened in social and cultural life, advocates of globalization theory have directed attention to media-technological changes and their sweeping socio-cultural consequences, instead of analyzing the powerful material forces that shape society and culture. I further argue that this shift serves not only analytical but also utopian functions, that is, the longing for a better world in times when such longing is otherwise considered impracticable.
Abstract:
The thesis examines the intensification and characteristics of a policy that emphasises economic competitiveness in Finland during the 1990s and early 2000s. This accentuation of economic objectives is studied at the level of national policy-making as well as at the regional level, through the policies and strategies of cities and three universities in the Helsinki region. By combining the analysis of state policies, urban strategies and university activities, the study illustrates the pervasiveness of the objective of economic competitiveness and growth across these levels and sheds light on the features and contradictions of these policies on a broad scale. The thesis is composed of five research articles and a summary article. At the level of national policies, the central focus of the thesis is on the growing role of science and technology policy as a state means to promote structural economic change, and on its transformation towards a broader, yet ambivalent, concept of innovation policy. This shift brings forward a tension between an increasing emphasis on economic aspects – innovations and competitiveness – and the expanding scope of issues across a wide range of policy sectors that are being subsumed under this market- and economy-oriented framework. Related to science and technology policy, attention is paid to adjustments in university policy, in which there has been increasing pressure for efficiency, rationalisation and commercialisation of academic activities. Furthermore, political efforts to build an information society through the application of information and communication technologies are analysed, with particular attention to the balance between economic and social objectives. Finally, changes in state regional policy priorities and the tendency towards competitiveness are addressed. At the regional level, the focus of the thesis is on the policies of the cities in Finland's capital region as well as the strategies of three universities operating in the region, namely the University of Helsinki, Helsinki University of Technology and the Helsinki School of Economics. As regards the urban level, the main focus is on the changes and characteristics of the urban economic development policy of the City of Helsinki. With respect to the universities, the thesis examines their attempts to commercialise research and thus bring academic research closer to economic interests, and pays particular attention to the contradictions of commercialisation. Related to the universities, the activities of three intermediary organisations that the universities have established in order to increase cooperation with industry are analysed. These organisations are the Helsinki Science Park, the Otaniemi International Innovation Centre and LTT Research Ltd. The summary article provides a synthesis of the material presented in the five original articles and relates the results of the articles to a broader discussion concerning the emergence of competition states and entrepreneurial cities and regions. The main points of reference are Bob Jessop's and Neil Brenner's theses on state and urban-regional restructuring. The empirical results and considerations from Finland and the Helsinki region are used to comment on, specify and criticise specific parts of the two theses.
Abstract:
This report derives from the EU-funded research project "Key Factors Influencing Economic Relationships and Communication in European Food Chains" (FOODCOMM). The research consortium consisted of the following organisations: University of Bonn (UNI BONN), Department of Agricultural and Food Marketing Research (overall project co-ordination); Institute of Agricultural Development in Central and Eastern Europe (IAMO), Department for Agricultural Markets, Marketing and World Agricultural Trade, Halle (Saale), Germany; University of Helsinki, Ruralia Institute Seinäjoki Unit, Finland; Scottish Agricultural College (SAC), Food Marketing Research Team - Land Economy Research Group, Edinburgh and Aberdeen; Ashtown Food Research Centre (AFRC), Teagasc, Food Marketing Unit, Dublin; Institute of Agricultural & Food Economics (IAFE), Department of Market Analysis and Food Processing, Warsaw; and Government of Aragon, Center for Agro-Food Research and Technology (CITA), Zaragoza, Spain. The aim of the FOODCOMM project was to examine the role (prevalence, necessity and significance) of economic relationships in selected European food chains and to identify the economic, social and cultural factors which influence co-ordination within these chains. The research project considered meat and cereal commodities in six different European countries (Finland, Germany, Ireland, Poland, Spain, UK/Scotland) and was commissioned against a background of changing European food markets. The research project as a whole consisted of seven different work packages. This report presents the results of qualitative research conducted for work package 5 (WP5) in the pig meat and rye bread chains in Finland. The Ruralia Institute would like to give special thanks to all the individuals and companies who kindly gave up their time to take part in the study. Their input has been invaluable to the project. Research assistant Sanna-Helena Rantala made a significant contribution to the data gathering. The FOODCOMM project was coordinated by the University of Bonn, Department of Agricultural and Food Market Research. Special thanks to Professor Monika Hartmann for acting as the project leader of FOODCOMM.
Abstract:
This thesis explores the relationship between humans and ICTs (information and communication technologies). As ICTs increasingly penetrate all spheres of social life, their role as mediators – between people, between people and information, and even between people and the natural world – is expanding, and they are increasingly shaping social life. Yet we still know little about how our lives are affected by their growing role. Our understanding of the actors and forces driving the accelerating adoption of new ICTs in all areas of life is also fairly limited. This thesis addresses these problems by interpretively exploring the link between ICTs and the shaping of society at home, in the office, and in the community. The thesis builds on empirical material gathered in three research projects, presented in four separate essays. The first project explores computerized office work through a case study. The second is a regional development project aiming at increasing ICT knowledge and use in 50 small-town families. In the third, the second project is compared to three other longitudinal development projects funded by the European Union. Using theories that consider the human-ICT relationship as intertwined, the thesis provides a multifaceted description of life with ICTs in the contemporary information society. By oscillating between empirical and theoretical investigations, and by balancing between determinist and constructivist conceptualisations of the human-ICT relationship, I construct a dialectical theoretical framework that can be used for studying socio-technical contexts in society. This framework helps us see how societal change stems from the complex social processes that surround routine everyday actions. For example, interacting with and through ICTs may change individuals' perceptions of time and space, social roles, and the proper ways to communicate – changes which at some point in time result in societal change in terms of, for example, new ways of acting and knowing things.
Abstract:
Production scheduling in a flexible manufacturing system (FMS) is a real-time combinatorial optimization problem that has been proved to be NP-complete. Solving this problem requires on-line monitoring of plan execution and real-time decision-making in selecting alternative routings, assigning required resources, and rescheduling when failures occur in the system. Expert systems provide a natural framework for solving this kind of NP-complete problem. In this paper an expert system with a novel parallel heuristic approach is implemented for automatic short-term dynamic scheduling of an FMS. The principal features of the expert system presented in this paper include easy rescheduling, on-line plan execution, load balancing, an on-line garbage collection process, and the use of advanced knowledge representation schemes. Its effectiveness is demonstrated with two examples.
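As a rough illustration of the kind of dispatching and rescheduling rules such a system must encode, consider the minimal sketch below. The machine and job structures are invented for illustration; this is not the paper's expert system, which handles far richer knowledge representations.

```python
# Minimal sketch (illustrative only) of rule-based dynamic dispatch for
# an FMS: alternative routings, a load-balancing rule, and rescheduling
# when a machine fails. All names and structures here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Machine:
    name: str
    queue: list = field(default_factory=list)  # pending operations
    up: bool = True

def dispatch(operation, candidates):
    """Assign an operation to the least-loaded working machine among
    its alternative routings (a simple load-balancing heuristic)."""
    working = [m for m in candidates if m.up]
    if not working:
        raise RuntimeError("no routing available for " + operation)
    best = min(working, key=lambda m: len(m.queue))
    best.queue.append(operation)
    return best

def reschedule_on_failure(failed, routings, machines):
    """On machine failure, re-dispatch its queued operations via
    their alternative routings."""
    failed.up = False
    stranded, failed.queue = failed.queue, []
    for op in stranded:
        dispatch(op, [machines[n] for n in routings[op]])

# Usage: two machines that can both perform a drilling operation.
machines = {"M1": Machine("M1"), "M2": Machine("M2")}
routings = {"drill-42": ["M1", "M2"]}
dispatch("drill-42", [machines["M1"], machines["M2"]])
reschedule_on_failure(machines["M1"], routings, machines)
print(machines["M2"].queue)   # ['drill-42'] after rescheduling
```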
Abstract:
The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced, in which the noise-free sensor readings associated with intruder and clutter appear as surfaces f(s) and f(g), and the problem reduces to one of determining, in distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating f(s) and f(g) is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings are on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived, based on communication-complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. Extension to the multi-party case is straightforward and is briefly discussed. The average-case CC of the relevant greater-than (GT) function is characterized within two bits. Under the second approach, each sensor node broadcasts a single bit arising from appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented that include intruder tracking using a naive polynomial-regression algorithm.
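The second approach lends itself to a compact illustration. The sketch below is a minimal stand-in assuming Gaussian sensor noise and arbitrary threshold and fusion parameters; it shows the structure (one-bit threshold quantization per sensor, a counting rule at the fusion center), not the paper's actual algorithm.

```python
# Minimal sketch (assumed parameters, not the paper's algorithm): each
# sensor quantizes its noisy reading to one bit with a threshold test,
# and a fusion center applies a k-out-of-n counting rule.
import random

def sensor_bit(reading, threshold=0.5):
    """Two-level quantization: fire iff the local reading exceeds a
    threshold chosen offline with the fusion rule in mind."""
    return 1 if reading > threshold else 0

def fusion_center(bits, k=3):
    """Counting rule: declare 'intruder' if at least k sensors fired."""
    return sum(bits) >= k

# Usage: 10 sensors; the intruder raises the mean reading only for the
# sensors in its immediate vicinity (sensing as a local phenomenon).
random.seed(1)
intruder_near = [i < 4 for i in range(10)]          # 4 nearby sensors
readings = [(1.0 if near else 0.0) + random.gauss(0, 0.3)
            for near in intruder_near]
bits = [sensor_bit(r) for r in readings]
print(fusion_center(bits))   # likely True under these made-up numbers
```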
Abstract:
Current scientific research is characterized by increasing specialization, accumulating knowledge at high speed owing to parallel advances in a multitude of sub-disciplines. Recent estimates suggest that human knowledge doubles every two to three years – and with the advances in information and communication technologies, this wide body of scientific knowledge is available to anyone, anywhere, anytime. This may also be referred to as ambient intelligence – an environment characterized by plentiful and available knowledge. The bottleneck in utilizing this knowledge for specific applications is not accessing but assimilating the information and transforming it to suit the needs of a specific application. The increasingly specialized areas of scientific research often share the common goal of converting data into insight, allowing the identification of solutions to scientific problems. Because of this common goal, there are strong parallels between different areas of application that can be exploited to cross-fertilize different disciplines. For example, the same fundamental statistical methods are used extensively in speech and language processing, in materials science applications, in visual processing and in biomedicine. Each sub-discipline has found its own specialized methodologies that make these statistical methods successful for the given application. The unification of specialized areas is possible because many different problems share strong analogies, making the theories developed for one problem applicable to other areas of research. It is the goal of this paper to demonstrate the utility of merging two disparate areas of application to advance scientific research. The merging process requires cross-disciplinary collaboration to allow maximal exploitation of advances in one sub-discipline for the benefit of another. We demonstrate this general concept with the specific example of merging language technologies and computational biology.
Abstract:
This paper considers the problem of identifying the footprints of communication of multiple transmitters in a given geographical area. To do this, a number of sensors are deployed at arbitrary but known locations in the area, and their individual decisions regarding the presence or absence of the transmitters' signal are combined at a fusion center to reconstruct the spatial spectral usage map. One straightforward scheme for constructing this map is to query each of the sensors and cluster the sensors that detect the primary's signal. However, exploiting the fact that a typical transmitter footprint map is a sparse image, two novel compressive-sensing-based schemes are proposed which require significantly fewer transmissions than the querying scheme. A key feature of the proposed schemes is that the measurement matrix is constructed from a pseudo-random binary phase shift applied to the decision of each sensor prior to transmission. The measurement matrix is thus a binary ensemble which satisfies the restricted isometry property. The number of measurements needed for accurate footprint reconstruction is determined using compressive sampling theory. The three schemes are compared through simulations in terms of a performance measure that quantifies the accuracy of the reconstructed spatial spectral usage map. It is found that the proposed sparse-reconstruction-based schemes significantly outperform the round-robin querying scheme.
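The core idea admits a short sketch. The sizes, sparsity level, and the use of a generic orthogonal matching pursuit recovery step below are assumptions for illustration; the paper's own reconstruction method and parameters may differ.

```python
# Minimal sketch (illustrative, not the paper's scheme): a +/-1 binary
# measurement matrix, standing in for pseudo-random phase flips of each
# sensor's one-bit decision, and OMP recovery of a sparse footprint map.
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 64, 5                  # sensors, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = 1.0     # sparse 0/1 footprint map

Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)  # binary ensemble
y = Phi @ x                                              # m measurements

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then least-squares on the support."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
# Expected True with high probability for these sizes:
print(np.array_equal(np.flatnonzero(x), np.flatnonzero(x_hat > 0.5)))
```

The ±1 entries play the role of the pseudo-random phase flips applied at each sensor, and scaling by 1/sqrt(m) keeps the columns near unit norm, which is what RIP-based recovery guarantees rely on.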
Abstract:
Assembly is an important part of the product development process. To avoid potential issues during assembly in specialized domains such as aircraft assembly, expert knowledge that can predict such issues is helpful. Knowledge-based systems can act as virtual experts to provide assistance. Knowledge acquisition for such systems, however, is a challenge, and this paper describes one part of ongoing research to acquire knowledge through a dialog between an expert and a knowledge acquisition system. In particular, this paper discusses the use of a situation model for assemblies to present experts with a virtual assembly and help them locate the specific context of the knowledge they provide to the system.
Abstract:
In systems biology, questions concerning the molecular and cellular makeup of an organism are of utmost importance, especially when trying to understand how unreliable components (like genetic circuits, biochemical cascades, and ion channels, among others) enable reliable and adaptive behaviour. The repertoire and speed of biological computations are limited by thermodynamic or metabolic constraints: an example can be found in neurons, where fluctuations in biophysical states limit the information they can encode, with an estimated 20-60% of the brain's total energy budget used for signalling purposes, either via action potentials or by synaptic transmission. Here, we consider the imperatives for neurons to optimise computational and metabolic efficiency, wherein benefits and costs trade off against each other in the context of self-organised and adaptive behaviour. In particular, we try to link the information-theoretic (variational) and thermodynamic (Helmholtz) free-energy formulations of neuronal processing and show how they are related in a fundamental way through a complexity-minimisation lemma.
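For reference, the variational free energy invoked above is commonly decomposed into complexity and accuracy terms. This is the textbook form; the paper's own notation may differ.

```latex
% Standard variational free-energy decomposition. For sensory data y,
% hidden states \vartheta, generative model p, approximate posterior q:
\[
  F \;=\;
  \underbrace{D_{\mathrm{KL}}\!\big[\,q(\vartheta)\,\|\,p(\vartheta)\,\big]}_{\text{complexity}}
  \;-\;
  \underbrace{\mathbb{E}_{q}\!\big[\ln p(y \mid \vartheta)\big]}_{\text{accuracy}}
\]
% Minimising F penalises complexity (the divergence between posterior
% and prior beliefs) while rewarding accuracy. It is this complexity
% term that a complexity-minimisation argument can relate to
% thermodynamic (Helmholtz) free energy and hence to metabolic cost.
```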
Abstract:
In this paper, we present a machine learning approach for subject-independent human action recognition using a depth camera, emphasizing the importance of depth in the recognition of actions. The proposed approach uses the flow information in all three dimensions to classify an action. In our approach, we obtain the 2-D optical flow and use it along with the depth image to obtain the depth flow (Z motion vectors). The obtained flow captures the dynamics of the actions in space-time. Feature vectors are obtained by averaging the 3-D motion over a grid laid over the silhouette in a hierarchical fashion. These hierarchical fine-to-coarse windows capture the motion dynamics of the object at various scales. The extracted features are used to train a Meta-cognitive Radial Basis Function Network (McRBFN) that uses a Projection Based Learning (PBL) algorithm, referred to henceforth as PBL-McRBFN. PBL-McRBFN begins with zero hidden neurons and builds the network based on the best human learning strategy, namely self-regulated learning in a meta-cognitive environment. When a sample is used for learning, PBL-McRBFN uses the sample overlapping conditions and a projection-based learning algorithm to estimate the parameters of the network. The performance of PBL-McRBFN is compared to that of Support Vector Machine (SVM) and Extreme Learning Machine (ELM) classifiers, with representation of every person and action in the training and testing datasets. The performance study shows that PBL-McRBFN outperforms these classifiers in recognizing actions in 3-D. Further, a subject-independent study is conducted using a leave-one-subject-out strategy and its generalization performance is tested. It is observed from the subject-independent study that McRBFN is capable of generalizing actions accurately. The performance of the proposed approach is benchmarked on the Video Analytics Lab (VAL) dataset and the Berkeley Multimodal Human Action Database (MHAD).
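A rough sketch of the flow computation described above is given below, assuming aligned grayscale and depth frames and using OpenCV's dense Farneback flow. The parameter values, the simple depth-difference stand-in for Z motion, and the single grid level are illustrative assumptions; the paper's exact depth-flow and hierarchical windowing details differ.

```python
# Minimal sketch (assumptions throughout, not the paper's pipeline):
# combine 2-D optical flow with a depth-difference estimate of Z motion,
# then average the 3-D motion over a coarse grid to form features.
# Requires opencv-python and numpy.
import cv2
import numpy as np

def depth_flow_features(prev_gray, gray, prev_depth, depth, grid=4):
    # Dense 2-D optical flow (Farneback) between consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # Crude Z motion: frame-to-frame depth difference (a stand-in for
    # the paper's depth-flow computation).
    dz = depth.astype(np.float32) - prev_depth.astype(np.float32)
    motion = np.dstack([flow, dz])            # H x W x 3 motion field
    h, w = gray.shape
    feats = []
    # Average motion in each grid cell: one level of the hierarchical
    # fine-to-coarse windows described in the abstract.
    for i in range(grid):
        for j in range(grid):
            cell = motion[i*h//grid:(i+1)*h//grid,
                          j*w//grid:(j+1)*w//grid]
            feats.extend(cell.reshape(-1, 3).mean(axis=0))
    return np.array(feats)                    # length 3 * grid * grid

# Synthetic usage with stand-in frames (not real camera data):
g0 = np.zeros((64, 64), np.uint8)
g1 = g0.copy(); g1[20:40, 20:40] = 255        # a moving bright patch
d0 = np.full((64, 64), 100, np.uint16); d1 = d0 - 5
print(depth_flow_features(g0, g1, d0, d1).shape)   # (48,) for grid=4
```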