20 results for Phenomenological theory (Physics)
in Aston University Research Archive
Abstract:
OBJECTIVES: To understand older adults' experiences of moving into extra care housing which offers enrichment activities alongside social and healthcare support. DESIGN: A longitudinal study was conducted which adopted a phenomenological approach to data generation and analysis. METHODS: Semi-structured interviews were conducted during the first 18 months of living in extra care housing. Interpretative phenomenological analysis was used because its commitment to idiography enabled an in-depth analysis of the subjective lived experience of moving into extra care housing. Themes generated inductively were examined against an existential-phenomenological theory of well-being. RESULTS: 'Learning to live in an extra care community' showed that negotiating new relationships was not straightforward; maintaining friendships outside the community became more difficult as capacity declined. In 'Springboard for opportunity/confinement', living in extra care provided new opportunities for social engagement and a restored sense of self, though over time horizons began to shrink as incapacities grew. 'Seeking care' illustrated residents' reticence to ask for care, owing to embarrassment and a sense of duty to one's partner. 'Becoming aged' presented an ontological challenge; nevertheless, some residents showed a readiness for death, a sense of homecoming. CONCLUSIONS: An authentic later life was possible, but residents required emotional and social support to live through the transition and challenges of becoming aged. Enrichment activities boosted residents' quality of life, but the range of activities could be extended to cater better for quieter, smaller-scale events within the community; volunteer activity facilitators could be used here. Peer mentoring may help build new relationships and opportunities for interactive stimulation. Acknowledging the importance of feeling (empathic imagination) in caregiving may help staff and residents relate better to each other, thus helping individuals to become ontologically secure and live well to the end.
Abstract:
We investigated family members' lived experience of Parkinson's disease (PD), aiming to identify opportunities for well-being. A lifeworld-led approach to healthcare was adopted. Interpretative phenomenological analysis was used to explore in-depth interviews with people living with PD and their partners. The analysis generated four themes: 'It's more than just an illness' revealed the existential challenge of diagnosis; 'Like a bird with a broken wing' emphasized the need to adapt to increasing immobility through embodied agency; 'Being together with PD' explored the kinship within couples and the belonging experienced through support groups; and 'Carpe diem!' illuminated the significance of time and the fractured future orientation created by diagnosis. Findings were interpreted using an existential-phenomenological theory of well-being. We highlight how partners shared the impact of PD through their own ontological challenges. Further research with different types of families and in different situations is required to identify the services needed to facilitate the process of learning to live with PD. Care and support for the family unit need to provide emotional support to manage threats to identity and agency, alongside problem-solving for bodily changes. Adopting a lifeworld-led healthcare approach would increase opportunities for well-being within the PD illness journey.
Abstract:
We present a review of the latest developments in one-dimensional (1D) optical wave turbulence (OWT). Based on an original experimental setup that allows for the implementation of 1D OWT, we are able to show that an inverse cascade occurs through the spontaneous evolution of the nonlinear field up to the point when modulational instability (MI) leads to soliton formation. After solitons are formed, further interaction of the solitons among themselves and with incoherent waves leads to a final condensate state dominated by a single strong soliton. Motivated by the observations, we develop a theoretical description, showing that the inverse cascade develops through six-wave interaction, and that this is the basic mechanism of nonlinear wave coupling for 1D OWT. We describe the theory, numerics and experimental observations, incorporating the different aspects into a consistent context. The experimental system is described by two coupled nonlinear equations, which we explore within two wave limits that allow the evolution of the complex amplitude to be expressed in a single dynamical equation. The long-wave limit corresponds to waves with wave numbers smaller than the inverse electrical coherence length of the liquid crystal; the opposite, short-wave limit corresponds to wave numbers larger than this. We show that both of these systems are of a dual cascade type, analogous to two-dimensional (2D) turbulence, which can be described by wave turbulence (WT) theory, and conclude that the cascades are induced by a six-wave resonant interaction process. WT theory predicts several stationary solutions (non-equilibrium and thermodynamic) to both the long- and short-wave systems, and we investigate the necessary conditions required for their realization. Interestingly, the long-wave system is close to the integrable 1D nonlinear Schrödinger equation (NLSE) (which contains exact nonlinear soliton solutions), and as a result, during the inverse cascade, the nonlinearity of the system at low wave numbers becomes strong. Subsequently, due to the focusing nature of the nonlinearity, this leads to MI of the condensate and the formation of solitons. Finally, with the aid of the probability density function (PDF) description of WT theory, we explain the coexistence and mutual interactions between solitons and the weakly nonlinear random wave background in the form of a wave turbulence life cycle (WTLC).
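For reference, the integrable focusing 1D NLSE mentioned above can be written in one common dimensionless convention (the normalisation below is our assumption, not necessarily the paper's):

```latex
% Focusing 1D NLSE in a common dimensionless form:
i\,\partial_t \psi + \partial_{xx}\psi + 2\,|\psi|^2 \psi = 0,
% with an exact stationary one-soliton solution (amplitude a):
\psi(x,t) = a\,\operatorname{sech}(a x)\, e^{\,i a^2 t}.
```

The sech-shaped soliton is the coherent structure that emerges from the condensate once MI sets in; substituting the solution into the equation verifies it directly.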
Abstract:
In this paper we review recent theoretical approaches for analysing the dynamics of on-line learning in multilayer neural networks using methods adopted from statistical physics. The analysis is based on monitoring a set of macroscopic variables from which the generalisation error can be calculated. A closed set of dynamical equations for the macroscopic variables is derived analytically and solved numerically. The theoretical framework is then employed for defining optimal learning parameters and for analysing the incorporation of second order information into the learning process using natural gradient descent and matrix-momentum based methods. We will also briefly explain an extension of the original framework for analysing the case where training examples are sampled with repetition.
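As a minimal numerical counterpart to this framework, the sketch below simulates the teacher-student on-line learning scenario and estimates the generalisation error by Monte Carlo; the tanh activation, the dimensions, and the eta/N learning-rate scaling are illustrative assumptions, not the paper's analytical solution.

```python
# Teacher-student on-line learning in a soft committee machine (a sketch).
import numpy as np

rng = np.random.default_rng(0)
N = 500                      # input dimension
K, M = 3, 3                  # hidden units in student and teacher
eta = 0.1                    # learning rate

B = rng.standard_normal((M, N)) / np.sqrt(N)   # fixed teacher weights
W = rng.standard_normal((K, N)) * 0.01         # small student initialisation

def committee(V, x):
    # Soft committee machine: sum of tanh hidden-unit activations.
    return np.tanh(V @ x).sum()

def gen_error(W, n_test=2000):
    # Monte Carlo estimate of the generalisation error.
    X = rng.standard_normal((n_test, N))
    return 0.5 * np.mean([(committee(W, x) - committee(B, x)) ** 2 for x in X])

for step in range(20000):
    x = rng.standard_normal(N)       # a fresh example each step (on-line)
    delta = committee(W, x) - committee(B, x)
    h = W @ x                        # student pre-activations
    # Stochastic gradient step on the quadratic loss, with 1/N scaling.
    W -= eta / N * delta * (1 - np.tanh(h) ** 2)[:, None] * x[None, :]
    if step % 5000 == 0:
        print(step, gen_error(W))
```

The analytical theory replaces such simulations with closed dynamical equations for macroscopic overlaps of W and B, from which the generalisation error follows without sampling.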
Abstract:
Book review
Abstract:
A major problem in modern probabilistic modeling is the huge computational complexity involved in typical calculations with multivariate probability distributions when the number of random variables is large. Because exact computations are infeasible in such cases and Monte Carlo sampling techniques may reach their limits, there is a need for methods that allow for efficient approximate computations. One of the simplest approximations is based on the mean field method, which has a long history in statistical physics. The method is widely used, particularly in the growing field of graphical models. Researchers from disciplines such as statistical physics, computer science, and mathematical statistics are studying ways to improve this and related methods and are exploring novel application areas. Leading approaches include the variational approach, which goes beyond factorizable distributions to achieve systematic improvements; the TAP (Thouless-Anderson-Palmer) approach, which incorporates correlations by including effective reaction terms in the mean field theory; and the more general methods of graphical models. Bringing together ideas and techniques from these diverse disciplines, this book covers the theoretical foundations of advanced mean field methods, explores the relation between the different approaches, examines the quality of the approximation obtained, and demonstrates their application to various areas of probabilistic modeling.
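The simplest variant discussed above can be sketched in a few lines: the naive mean field fixed-point iteration for an Ising model, where the couplings, field, temperature, and damping below are illustrative choices.

```python
# Naive mean field for an Ising model: iterate the self-consistency
# equations m_i = tanh(beta * (sum_j J_ij m_j + h_i)).
import numpy as np

rng = np.random.default_rng(1)
n = 20
J = rng.standard_normal((n, n)) / np.sqrt(n)
J = (J + J.T) / 2            # symmetric couplings
np.fill_diagonal(J, 0.0)     # no self-coupling
h = 0.1 * rng.standard_normal(n)
beta = 0.5                   # inverse temperature

m = np.zeros(n)              # mean field magnetisations
for _ in range(500):
    m_new = np.tanh(beta * (J @ m + h))
    if np.max(np.abs(m_new - m)) < 1e-10:
        break
    m = 0.5 * m + 0.5 * m_new   # damping for stable convergence
print("mean field magnetisations:", m)
```

The TAP approach mentioned above corrects each argument with an Onsager reaction term, while graphical-model methods replace the single-site magnetisations with messages passed along edges.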
Abstract:
Modern digital communication systems achieve reliable transmission by employing error-correction techniques that add redundancy. Low-density parity-check codes work along the principles of the Hamming code, but their parity-check matrix is very sparse and multiple errors can be corrected. The sparseness of the matrix allows the decoding process to be carried out by probability propagation methods similar to those employed in Turbo codes. The relation between spin systems in statistical physics and digital error-correcting codes is based on the existence of a simple isomorphism between the additive Boolean group and the multiplicative binary group. Shannon proved general results on the natural limits of compression and error-correction by setting up the framework known as information theory. Error-correction codes are based on mapping the original space of words onto a higher-dimensional space in such a way that the typical distance between encoded words increases.
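The isomorphism in question can be checked exhaustively in a few lines: mapping a bit x in {0, 1} to a spin s = (-1)^x sends addition mod 2 (XOR) to multiplication in {+1, -1}.

```python
# Verify that (-1)^(x XOR y) == (-1)^x * (-1)^y for all bit pairs.
for x in (0, 1):
    for y in (0, 1):
        xor = (x + y) % 2
        product = (-1) ** x * (-1) ** y
        assert (-1) ** xor == product
        print(f"x={x} y={y}  x XOR y = {xor}  s_x * s_y = {product}")
```

This is what lets parity checks over bits be rewritten as products of spins, and hence decoding as a spin-system problem.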
Abstract:
We propose a method based on the magnetization enumerator to determine the critical noise level for Gallager-type low-density parity-check (LDPC) error-correcting codes. Our method provides an appealingly simple interpretation of the relation between different decoding schemes, and yields more optimistic critical noise levels than those reported in the information theory literature.
Abstract:
Properties of Boolean circuits composed of noisy logical gates are studied using statistical physics methodology. A formula-growth model that gives rise to random Boolean functions is mapped onto a spin system, which facilitates the study of their typical behavior in the presence of noise. Bounds on their performance, derived in the information theory literature for specific gates, are straightforwardly retrieved, generalized and identified as the corresponding macroscopic phase transitions. The framework is employed to derive results on error-rates at various function-depths and on function sensitivity, and their dependence on the gate type and noise model used. These results are difficult to obtain via the traditional methods used in this field.
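A Monte Carlo toy version of the setting conveys the idea of error-rates at various function-depths (a generic illustration, not the paper's spin-system mapping): a balanced formula of NAND gates is evaluated with and without gate noise, and the output disagreement is estimated.

```python
# Estimate the output error-rate of a noisy balanced NAND formula vs depth.
import random

def eval_formula(depth, eps, rng):
    """Return (noiseless, noisy) outputs of a random balanced NAND formula."""
    if depth == 0:
        bit = rng.random() < 0.5        # random input literal
        return bit, bit
    l_clean, l_noisy = eval_formula(depth - 1, eps, rng)
    r_clean, r_noisy = eval_formula(depth - 1, eps, rng)
    clean = not (l_clean and r_clean)
    noisy = not (l_noisy and r_noisy)
    if rng.random() < eps:              # gate output flips with prob. eps
        noisy = not noisy
    return clean, noisy

rng = random.Random(0)
for depth in (2, 4, 6, 8, 10):
    trials = 2000
    errors = sum(c != n for c, n in (eval_formula(depth, 0.02, rng)
                                     for _ in range(trials)))
    print(f"depth={depth:2d}  error-rate ~ {errors / trials:.3f}")
```

The statistical physics treatment replaces such sampling with a macroscopic description, where the depth-dependence of the error-rate appears as the approach to a phase transition.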
Abstract:
The introduction situates the 'hard problem' in its historical context and argues that the problem has two sides: the output side (the Kant-Eccles problem of the freedom of the Will) and the input side (the problem of qualia). The output side ultimately reduces to whether quantum mechanics can affect the operation of synapses. A discussion of the detailed molecular biology of synaptic transmission as presently understood suggests that such effects are unlikely. Instead an evolutionary argument is presented which suggests that our conviction of free agency is an evolutionarily induced illusion and hence that the Kant-Eccles problem is itself illusory. This conclusion is supported by well-known neurophysiology. The input side, the problem of qualia, of subjectivity, is not so easily outflanked. After a brief review of the neurophysiological correlates of consciousness (NCC) and of the Penrose-Hameroff microtubular neuroquantology it is again concluded that the molecular neurobiology makes quantum wave-mechanics an unlikely explanation. Instead recourse is made to an evolutionarily- and neurobiologically-informed panpsychism. The notion of an 'emergent' property is carefully distinguished from the more usual 'system' property used by most dual-aspect theorists (and the majority of neuroscientists), and is used to support Llinás' concept of an 'oneiric' consciousness continuously modified by sensory input. I conclude that a panpsychist theory such as this, coupled with the non-classical understanding of matter flowing from quantum physics (both epistemological and scientific), may be the default and only solution to the problem posed by the presence of mind in a world of things.
Abstract:
Gain insight into crucial British mental health approaches for LGB individuals. There is very little collaborative literature between LGB-affirmative psychologists and psychotherapists in the United States and the United Kingdom. British Lesbian, Gay, and Bisexual Psychologies: Theory, Research, and Practice may well be a crucial first step in building dialogue between these two countries on important LGB psychotherapy developments. Leading authorities comprehensively examine the latest studies and effective therapies for LGB individuals in the United Kingdom. Practitioners will discover an extensive survey of the most current developments to supplement their own work, while educators and students will find diverse expert perspectives with which to consider and broaden their own viewpoints. This unique book offers an informative introduction to British psychosocial perspectives on theory, research, and practice. British Lesbian, Gay, and Bisexual Psychologies provides a critical exploration of the recent history of LGB psychology and psychotherapy in the United Kingdom, focusing on key publications and outlining the current terrain. Other chapters are organized into two thematic sections. The first section explores theoretical frameworks in United Kingdom therapeutic practice, while the second section examines sexual minority identities and their needs for support and community. Topics in British Lesbian, Gay, and Bisexual Psychologies include:
- similarities and differences between LGBT psychology and psychotherapy in the United States and United Kingdom
- gay affirmative therapy (GAT) as a positive framework
- the existential-phenomenological approach to psychotherapy
- core issues in the anxiety about whether or not to "come out"
- object relations theory
- exploring homo-negativity in the therapeutic process
- aspects of psychotherapy that lesbians and gay men find helpful
- research into how the mainstreaming of lesbian and gay culture has affected the lives of LGB individuals
- a study of LGB youth issues
- the difficulties of gay men with learning disabilities, with suggestions on how to offer the best psychological service
- a study of gay athletes' experiences of coming out in a heterosexist world
British Lesbian, Gay, and Bisexual Psychologies takes a needed step toward sharing valuable psychosocial perspectives between countries. This useful, enlightening text is perfect for educators, students, psychologists, psychotherapists, and counselors working in the field of sexuality.
Abstract:
Inference and optimization of real-value edge variables in sparse graphs are studied using the Bethe approximation and replica method of statistical physics. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained in various cases. When applied to the representative problem of network resource allocation, efficient distributed algorithms are also devised. Scaling properties with respect to the network connectivity and the resource availability are found, and links to probabilistic Bayesian approximation methods are established. Different cost measures are considered and algorithmic solutions in the various cases are devised and examined numerically. Simulation results are in full agreement with the theory. © 2007 The American Physical Society.
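As a toy illustration of the kind of distributed computation involved (a generic sketch with quadratic edge costs and exact node balance, not the paper's Bethe/replica-derived message-passing algorithm): with quadratic costs the optimal edge currents are potential differences, and the node potentials solve a graph-Laplacian system that each node can relax using only its neighbours' values.

```python
# Quadratic-cost resource allocation on a sparse graph via damped Jacobi.
import numpy as np

rng = np.random.default_rng(2)
n = 30
A = np.zeros((n, n))
for i in range(n):                       # ring guarantees connectivity
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
for i, j in rng.integers(0, n, size=(20, 2)):
    if i != j:                           # a few random chords
        A[i, j] = A[j, i] = 1.0
deg = A.sum(axis=1)

Lam = rng.standard_normal(n)             # node resources (+) and demands (-)
Lam -= Lam.mean()                        # total supply must match total demand

# Damped Jacobi for (D - A) p = Lam: each node updates its "potential"
# from its neighbours' values only, hence the algorithm is distributed.
p = np.zeros(n)
for _ in range(20000):
    p = 0.5 * p + 0.5 * (A @ p + Lam) / deg
    p -= p.mean()                        # project out the Laplacian zero mode

Y = (p[:, None] - p[None, :]) * A        # optimal currents y_ij = p_i - p_j
print("max node imbalance:", np.abs(Y.sum(axis=1) - Lam).max())
```

The paper's message-passing algorithms play the analogous role for general cost functions and inequality constraints, where no closed linear-algebraic shortcut exists.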
Abstract:
We present a mean field theory of code-division multiple access (CDMA) systems with error-control coding. On the basis of the relation between the free energy and mutual information, we obtain an analytical expression of the maximum spectral efficiency of the coded CDMA system, from which a mean field description of the coded CDMA system is provided in terms of a bank of scalar Gaussian channels whose variances in general vary at different code symbol positions. Regular low-density parity-check (LDPC)-coded CDMA systems are also discussed as an example of the coded CDMA systems.
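The decomposition can be sketched as follows (an illustrative form; the actual symbol-dependent variances come from the mean field analysis):

```latex
% Equivalent scalar Gaussian channel at code symbol position k:
y_k = x_k + n_k, \qquad n_k \sim \mathcal{N}(0, \sigma_k^2),
% so that, e.g., for unit-power Gaussian inputs the per-symbol
% mutual information is
I_k = \tfrac{1}{2}\log_2\!\left(1 + \tfrac{1}{\sigma_k^2}\right)
\ \text{bits per channel use.}
```

The spectral efficiency of the coded system then follows from averaging such per-symbol quantities over the positions, with the variances determined self-consistently.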
Abstract:
In this thesis we use statistical physics techniques to study the typical performance of four families of error-correcting codes based on very sparse linear transformations: Sourlas codes, Gallager codes, MacKay-Neal codes and Kanter-Saad codes. We map the decoding problem onto an Ising spin system with many-spin interactions. We then employ the replica method to calculate averages over the quenched disorder represented by the code constructions, the arbitrary messages and the random noise vectors. We find, as the noise level increases, a phase transition between successful decoding and failure phases. This phase transition coincides with upper bounds derived in the information theory literature in most of the cases. We connect the practical decoding algorithm known as probability propagation with the task of finding local minima of the related Bethe free-energy. We show that the practical decoding thresholds correspond to noise levels where suboptimal minima of the free-energy emerge. Simulations of practical decoding scenarios using probability propagation agree with theoretical predictions of the replica symmetric theory. The typical performance predicted by the thermodynamic phase transitions is shown to be attainable in computation times that grow exponentially with the system size. We use the insights obtained to design a method to calculate the performance and optimise parameters of the high performance codes proposed by Kanter and Saad.
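The mapping onto a spin system can be written compactly (a standard sketch; the notation here is ours rather than the thesis's):

```latex
% Map bits to spins: x_i \in \{0,1\} \mapsto \sigma_i = (-1)^{x_i} \in \{+1,-1\}.
% A parity check over a subset L(\mu) of bits becomes a spin product:
\sum_{i \in L(\mu)} x_i \equiv 0 \pmod 2
\quad\Longleftrightarrow\quad
\prod_{i \in L(\mu)} \sigma_i = +1,
% so decoding becomes a ground-state problem of a many-spin Hamiltonian:
\mathcal{H}(\sigma) = -\sum_{\mu} J_{\mu} \prod_{i \in L(\mu)} \sigma_i .
```

Quenched averages over code construction and noise are then handled by the replica method, as described above.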
Abstract:
By using an alternative setup for photorefractive parametric oscillation in which wave mixing between the recording beams is avoided, it has become possible to make more detailed comparisons with space-charge wave theory. In the present paper we compare the experimental features of longitudinal parametric oscillation observed in a crystal of Bi12SiO20 with the theoretical predictions.