898 results for disordered systems (theory)
Abstract:
This thesis presents the results of an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study of both the methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which produce the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract cognitive brain state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators.
One of the main objectives of this thesis is to show that a much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines. As a result, the use of MEG has, hitherto, been restricted to large institutions able to afford the high costs associated with procuring and maintaining these machines. In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data.
It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
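The dynamical-systems view of a single unaveraged channel typically starts from a delay-coordinate (state-space) reconstruction of the unobservable underlying state. A minimal sketch follows; the function name, embedding dimension, and delay are illustrative choices, and the noisy sine stands in for a recorded channel rather than real MEG data:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Stack delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] row by row."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# toy stand-in for one unaveraged channel: a noisy oscillation
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t) + 0.1 * np.random.default_rng(0).normal(size=t.size)

emb = delay_embed(x, dim=3, tau=25)
print(emb.shape)  # (1950, 3): each row is one reconstructed state vector
```

Each row of `emb` is one point in the reconstructed state space, on which nonlinear measures (dimension estimates, prediction error, etc.) can then be computed.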
Abstract:
We introduce a general matrix formulation for multiuser channels and analyse the special cases of Multiple-Input Multiple-Output channels, channels with interference and relay arrays under LDPC coding using methods developed for the statistical mechanics of disordered systems. We use the replica method to provide results for the typical overlaps of the original and recovered messages and discuss their implications. The results obtained are consistent with belief propagation and density evolution results but also complement them giving additional insights into the information dynamics of these channels with unexpected effects in some cases.
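The overlap order parameter referred to here can be illustrated with a toy simulation; this is purely a sketch of the quantity itself on a binary symmetric channel with no LDPC coding, and the flip probability is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000
p_flip = 0.1  # illustrative flip probability of a binary symmetric channel

msg = rng.choice([-1, 1], size=N)                # original +/-1 message bits
flips = np.where(rng.random(N) < p_flip, -1, 1)  # -1 marks a flipped bit
received = msg * flips

# overlap m = (1/N) * sum_i sigma_i * tau_i; expectation is 1 - 2*p_flip = 0.8
overlap = float(np.mean(msg * received))
print(overlap)
```

Perfect recovery gives m = 1 and random guessing gives m = 0; the replica analysis in the paper computes the typical value of this quantity for the coded channels.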
Abstract:
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly complex, the ever-increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve both for gaining a better understanding of the properties of networking systems at the macroscopic level and for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches, as they have been developed specifically to deal with nonlinear large-scale systems. This review aims to present an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. © 2013 IOP Publishing Ltd.
Abstract:
Optimizing paths on networks is crucial for many applications, ranging from subway traffic to Internet communication. Because global path optimization that takes account of all path choices simultaneously is computationally hard, most existing routing algorithms optimize paths individually, thus providing suboptimal solutions. We use the physics of interacting polymers and disordered systems to analyze macroscopic properties of generic path optimization problems and derive a simple, principled, generic, and distributed routing algorithm capable of considering all individual path choices simultaneously. We demonstrate the efficacy of the algorithm by applying it to: (i) random graphs resembling Internet overlay networks, (ii) travel on the London Underground network based on Oyster card data, and (iii) the global airport network. Analytically derived macroscopic properties give rise to insightful new routing phenomena, including phase transitions and scaling laws, that facilitate better understanding of the appropriate operational regimes and their limitations, which are difficult to obtain otherwise.
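As a baseline for what "optimizing paths individually" means, the sketch below routes each source-destination pair along its own shortest path on a toy graph and measures the resulting edge congestion. The graph, node names, and demand pairs are invented for illustration, and plain BFS stands in for any per-pair router; this is the suboptimal baseline the abstract contrasts with, not the polymer-physics algorithm itself:

```python
from collections import defaultdict, deque

# toy undirected graph; edges are invented for illustration
edges = [("a", "b"), ("b", "c"), ("c", "d"), ("a", "e"),
         ("e", "d"), ("b", "e"), ("c", "e")]
adj = defaultdict(list)
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

def bfs_path(src, dst):
    """Shortest hop-count path for one pair, ignoring all other pairs."""
    prev, seen, q = {}, {src}, deque([src])
    while q:
        u = q.popleft()
        if u == dst:
            break
        for w in adj[u]:
            if w not in seen:
                seen.add(w)
                prev[w] = u
                q.append(w)
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return path[::-1]

# routing each pair on its own piles two paths onto edge (e, d)
pairs = [("a", "d"), ("b", "d"), ("a", "c"), ("e", "d")]
load = defaultdict(int)
for s, t in pairs:
    p = bfs_path(s, t)
    for u, v in zip(p, p[1:]):
        load[frozenset((u, v))] += 1

print(max(load.values()))  # congestion on the busiest edge
```

A globally coordinated router could spread these demands over disjoint routes; minimizing such overlaps across all pairs at once is exactly the hard joint optimization the paper addresses.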
Abstract:
Risk and knowledge are two concepts and components of business management which have so far been studied almost independently. This is especially true where risk management is conceived mainly in financial terms, as, for example, in the banking sector. The banking sector has sophisticated methodologies for managing risk, such as mathematical risk modeling. However, the methodologies for analyzing risk do not explicitly include knowledge management for risk knowledge creation and risk knowledge transfer. Banks are affected by internal and external changes, with the consequent accommodation to new business models, new regulations and the competition of big players around the world. Thus, banks have different levels of risk appetite and different policies in risk management. This paper takes into consideration that business models are changing and that management is looking across the organization to identify the influence of strategic planning, information systems theory, risk management and knowledge management. These disciplines can handle the risks affecting banking that arise from different areas, but only if they work together, which creates a need to view them in an integrated way. This article sees enterprise risk management as a specific application of knowledge to control deviation from strategic objectives, shareholders' values and stakeholder relationships. Before and after a modeling process it is necessary to find insights into how the application of knowledge management processes can improve the understanding of risk and the implementation of enterprise risk management. The article proposes a methodology intended to provide a guide for developing risk modeling knowledge and reducing knowledge silos, in order to improve the quality and quantity of solutions related to risk inquiries across the organization.
Abstract:
The problem of strongly correlated electrons in one dimension has attracted the attention of condensed matter physicists since the early 1950s. After the seminal paper of Tomonaga [1], who suggested the first soluble model in 1950, there were essential achievements reflected in the papers by Luttinger [2] (1963) and Mattis and Lieb [3] (1963). A considerable contribution to the understanding of the generic properties of the 1D electron liquid was made by Dzyaloshinskii and Larkin [4] (1973) and Efetov and Larkin [5] (1976). Despite the fact that the main features of the 1D electron liquid were captured and described by the end of the 1970s, investigators felt dissatisfied with the rigour of the theoretical description. The most famous example is the paper by Haldane [6] (1981), in which the author developed the fundamentals of the modern bosonisation technique known as the operator approach. This paper became famous because the author rigorously showed how to construct the Fermi creation/annihilation operators out of the Bose ones. The most recent example of such dissatisfaction is the review by von Delft and Schoeller [7] (1998), who revised the approach to bosonisation and came up with what they called constructive bosonisation.
Abstract:
There is a great variety of approaches in artificial intelligence, some of them originating in biology and neurophysiology. In this paper we review and discuss many of them, arranging our discussion around autonomous agent research. We highlight three aspects in our classification: the type of abstraction applied for representing agent knowledge, the implementation of the hypothesis-processing mechanism, and the allowed degree of freedom in behaviour and self-organization. Using this classification, many approaches in artificial intelligence are evaluated. We then summarize all the ideas discussed and propose a series of general principles for building an autonomous adaptive agent.
Abstract:
Sustainable development support, balanced scorecard development and business process modeling are viewed from the position of systemology. Extensional, intentional and potential properties of a system are considered as necessary to satisfy the functional requirements of a meta-system. The correspondence between the extensional, intentional and potential properties of a system and its sustainable, unsustainable, crisis and catastrophic states is determined. The cause of the inaccessibility of the system mission is uncovered. The correspondence between the extensional, intentional and potential properties of a system and the balanced scorecard perspectives is shown. The IDEF0 function modeling method is checked against the balanced scorecard perspectives. The correspondence between balanced scorecard perspectives and IDEF0 notations is considered.
Abstract:
Mathematics Subject Classification: 26A33, 93C83, 93C85, 68T40
Abstract:
Mid-Sized Businesses (MSBs) are defined by the Department for Business, Innovation and Skills (BIS) as having a sales turnover of between £25 million and £500 million. A key gap in family firm/business research and literature to date is understanding the role and importance of non-financial objectives (such as family harmony, tradition and business longevity), and the role the family plays in creating a wide set of business performance objectives (both financial and non-financial) in these businesses. This dissertation contributes to filling this knowledge gap by drawing on Family Systems Theory applied in a business context, within an overarching Resource Based View (RBV) of the firm.
Abstract:
The current U.S. health care system faces numerous environmental challenges. To compete and survive, health care organizations are developing strategies to lower costs and increase efficiency and quality. All of these strategies require rapid and precise decision making by top-level managers. The purpose of this study is to determine the relationship between the environment, made up of unfavorable market conditions and limited resources, and the work roles of top-level managers, specifically in the setting of academic medical centers. Managerial work roles are based on the ten work roles developed by Henry Mintzberg in his book The Nature of Managerial Work (1973). This research utilized an integrated conceptual framework made up of systems theory in conjunction with role, attribution and contingency theories to illustrate that the four most frequently performed of Mintzberg's work roles are affected by the two environment dimensions. The study sample consisted of 108 chief executive officers in academic medical centers throughout the United States. The methods included qualitative methods in the form of key informants and case studies, and quantitative methods in the form of a survey questionnaire. Research analysis involved descriptive statistics, reliability tests, correlation, principal component and multivariate analyses. Results indicated that under the market condition of increased revenue based on capitation, the work roles increased. In addition, under the environment dimension of limited resources, the work roles increased when uncompensated care increased while Medicare and non-government funding decreased. Based on these results, a typology of health care managers in academic medical centers was created. Managers could be typed as a strategy-formulator, a relationship-builder or a task delegator. Therefore, managers who ascertained their types would be able to use this knowledge to build on their strengths and develop their weaknesses.
Furthermore, organizations could use the typology to identify appropriate roles and responsibilities of managers for their specific needs. Consequently, this research is a valuable tool for understanding health care managerial behaviors that lead to improved decision making. At the same time, this could enhance satisfaction and performance and enable organizations to gain the competitive edge.
Abstract:
This dissertation examines the sociological process of conflict resolution and consensus building in South Florida Everglades Ecosystem Restoration through what I define as a Network Management Coordinative Interstitial Group (NetMIG). The process of conflict resolution can be summarized as the participation of interested and affected parties (stakeholders) in a forum of negotiation. I study the case of the Governor's Commission for a Sustainable South Florida (GCSSF), which was established to reduce social conflict. Such conflict originated from environmental disputes about the Everglades and was manifested in the form of gridlock among regulatory (government) agencies, Indian tribes, as well as agricultural, environmental conservationist and urban development interests. The purpose of the participatory forum is to reduce conflicts of interest and to achieve consensus, with the ultimate goal of restoration of the original Everglades ecosystem, while cultivating the economic and cultural bases of the communities in the area. Further, the forum aims to formulate consensus by envisioning a common sustainable community and by providing means to achieve a balance between human and natural systems. Data were gathered using participant observation and document analysis techniques to conduct a theoretically based analysis of the role of the NetMIG. I use conflict resolution theory, environmental conflict theory, stakeholder analysis, systems theory, differentiation and social change theory, and strategic management and planning theory. The purpose of this study is to substantiate the role of the GCSSF as a consortium of organizations in an effort to resolve conflict, rather than to present an ethnographic study of this organization.
Environmental restoration of the Everglades is a vehicle for recognizing the significance of a Network Management Coordinative Interstitial Group (NetMIG), namely the Governor's Commission for a Sustainable South Florida (GCSSF), as a structural mechanism for stakeholder participation in the process of social conflict resolution through the creation of new cultural paradigms for a sustainable community.
Abstract:
The purpose of this inquiry was to investigate the perceptions of former service personnel, students and their parents about the organizational effectiveness of the Ghana National Service Scheme (GNSS). The inquiry addressed the following questions: How do the participants perceive the effectiveness of the national service program on Ghanaian society? What are the perceptions of school administrators who worked with service personnel, parents and students vis-à-vis the overall impact of the GNSS on the educational system? What are the perceptions of former service personnel, students and their parents in regard to the impact of the GNSS on them? The GNSS is a part of the Ministry of Education, which is in turn part of the Ghanaian social structure. Hence, a systems theory approach, which asks "how and why a system as a whole functions as it does" (Patton, 1990), was utilized in the study. Methodologies included purposive sampling, interviews, participant observation, and follow-up interviews. The study was conducted over a six-month period. A cross-sectional survey design was used to generate data. The survey was followed up with an ethnographic study in which in-depth, face-to-face interviews were conducted together with observations. The results were described and interpreted. The summary of findings concludes that perceptual determinants of the effectiveness of the GNSS were biased by age and zone of origin but not by gender. This partially agrees with Marenin (1990), except for gender. A significant difference was detected between the responses of officials of the National Service Secretariat and those of former service personnel. The administrators defended the status quo and demonstrated deeper knowledge of the scheme. The former personnel and parents freely criticized the program when necessary and did not seem to know much about the GNSS.
Respondents mostly stressed the need for the secretariat to focus on the following areas: (1) involvement in the agricultural sector of the economy, (2) involvement in rural mass, civic and health education, (3) adequacy of remuneration and personnel welfare, and (4) ensuring the posting of personnel to areas of their expertise. A recommendation for further research concluded the study.
Abstract:
The Ellison Executive Mentoring Inclusive Community Building (ICB) Model is a paradigm for initiating and implementing projects utilizing executives and professionals from a variety of fields and industries, university students, and pre-college students. The model emphasizes adherence to ethical values and promotes inclusiveness in community development. It is a hierarchical model in which actors at each succeeding level of operation serve as mentors to the next. Through a three-step process--content, process, and product--participants are trained with this mentoring and apprenticeship paradigm in conflict resolution, and they receive sensitivity and diversity training through an interactive and dramatic exposition. The content phase introduces participants to the model's philosophy, ethics, values and methods of operation. The process used to teach and reinforce its precepts is the set of mentoring and apprenticeship activities and projects in which the participants engage, and whose end product demonstrates their knowledge and understanding of the model's concepts. This study sought to ascertain from the participants' perspectives whether the model's mentoring approach is an effective means of fostering inclusiveness, based upon their own experiences in using it. The research utilized a qualitative approach and included data from field observations, individual and group interviews, and written accounts of participants' attitudes. Participants complete ICB projects utilizing the Ellison Model as a method of development and implementation. They generally perceive the model to be a viable tool for dealing with diversity issues, whether at work, at school, or at home. The projects are also instructional in that, whether participants are mentored or serve as apprentices, they gain useful skills and knowledge relevant to their careers.
Since the model is relatively new, there is ample room for research in a variety of areas, including organizational studies to determine its effectiveness in combating problems related to various kinds of discrimination.