12 results for Mind-body problem
in Aston University Research Archive
Abstract:
The revolution in the foundations of physics at the beginning of the twentieth century suggested to several of its most prominent workers that biology was ripe for something similar. In consequence, a number of physicists moved into biology. They were highly influential in initiating molecular biology in the 1950s. Two decades later it seemed to several of these migrants, and those they had influenced, that the major problems in molecular biology had been solved, and that it was time to move on to what seemed to them the final problem: the nervous system, consciousness, and the age-old mind-body problem. This paper reviews this "double migration" and shows how the hopes of the first generation of physicist-biologists were both realized and dashed. No new physical principles were discovered at work in the foundations of biology or neuroscience. On the other hand, the mind-set of those trained in physics proved immensely valuable in analyzing fundamental issues in both biology and neuroscience. It has been argued that the outcome of the molecular biology of the 1950s was a change in the concept of the gene from that of "a mysterious entity into that of a real molecular object" (Watson, 1965, p. 6); the gates and channels which play such crucial roles in the functioning of nervous systems have been transformed in a similar way. Studies on highly simplified systems have also opened the prospect of finding the neural correlates of numerous behaviors and neuropathologies. This increasing understanding at the molecular level is invaluable not only in devising rational therapies but also, by defining the material substrate of consciousness, in bringing the mind-body problem into sharper focus. Copyright © Taylor & Francis Inc.
Abstract:
The key to the correct application of ANOVA is careful experimental design and matching the correct analysis to that design. The following points should therefore be considered before designing any experiment:
1. In a single-factor design, ensure that the factor is identified as a 'fixed' or 'random effect' factor.
2. In more complex designs with more than one factor, there may be a mixture of fixed and random effect factors present, so ensure that each factor is clearly identified.
3. Where replicates can be grouped or blocked, the advantages of a randomised blocks design should be considered. There should be evidence, however, that blocking can sufficiently reduce the error variation to counter the loss of DF compared with a fully randomised design.
4. Where different treatments are applied sequentially to a patient, the advantages of a three-way design in which the different orders of the treatments are included as an 'effect' should be considered.
5. Combining different factors to make a more efficient experiment, and to measure possible factor interactions, should always be considered.
6. The effect of 'internal replication' should be taken into account in a factorial design when deciding the number of replicates to be used. Where possible, each error term of the ANOVA should have at least 15 DF.
7. Consider carefully whether a particular factorial design should be treated as a split-plot or a repeated-measures design. If such a design is appropriate, consider how to continue the analysis, bearing in mind the problem of using post hoc tests in this situation.
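Point 6 above can be made concrete with a minimal sketch (the helper functions are illustrative, not from the paper): in a fully replicated two-factor factorial design with a and b factor levels and r replicates, the error term has a*b*(r-1) degrees of freedom, which can be checked against the suggested minimum of 15 DF before any data are collected.

```python
# Hypothetical helpers applying the abstract's DF guideline to a
# two-factor factorial design with full replication.

def factorial_error_df(levels_a: int, levels_b: int, replicates: int) -> int:
    """Error DF for a fully replicated two-factor factorial design.

    Total DF = a*b*r - 1; subtracting (a-1), (b-1) and (a-1)(b-1)
    for the main effects and interaction leaves a*b*(r-1) for error.
    """
    return levels_a * levels_b * (replicates - 1)

def meets_df_guideline(levels_a: int, levels_b: int, replicates: int,
                       minimum_df: int = 15) -> bool:
    """Check the '>= 15 DF per error term' rule of thumb."""
    return factorial_error_df(levels_a, levels_b, replicates) >= minimum_df

# A 3x2 factorial: duplicates give 6 error DF, triplicates 12,
# and only 4 replicates per cell (18 DF) satisfy the guideline.
for r in (2, 3, 4):
    print(r, factorial_error_df(3, 2, r), meets_df_guideline(3, 2, r))
```

Note how the factorial structure supplies "internal replication": every extra replicate adds a*b error DF at once, which is why combined designs are more efficient than separate single-factor experiments.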
Abstract:
This thesis is concerned with establishing where the Buddhist tradition, founded in India some 2500 years ago, can make a contribution to the new and growing discipline of business ethics. Part One: From the growing body of business ethics literature it seems that business managers increasingly have a problem of learning how to respond to public and political pressure on business to behave more ethically while, at the same time, continuing to run their affairs profitably in an increasingly complex and uncertain business environment. Part One first looks at the evidence for this growing interest and at the nature of the 'business ethics problem', and then reviews the contribution of Western theory to solving it. Part Two: In Part Two a possible solution which overcomes some of the limitations of Western theory is described. This is based on a Buddhist analysis of individual morality, and of the moral relationship between the individual and the group. From this a general theoretical framework is proposed. To show how it can be practically applied to the needs of business, a description is then given of how the framework was used to design and test a pilot 'moral audit' of Windhorse Trading, a Buddhist company based in Cambridge, England. From the results of this pilot study it is concluded that, given some additional research, it would be possible to take the theoretical framework further and use it as the basis for developing operational guidelines to help businesses to make detailed ethical decisions.
Abstract:
All four of the most important figures in the early twentieth-century development of quantum physics (Niels Bohr, Erwin Schroedinger, Werner Heisenberg and Wolfgang Pauli) had strong interests in the traditional mind-brain, or 'hard', problem. This paper reviews their approach to this problem, showing the influence of Bohr's complementarity thesis, the significance of Schroedinger's small book 'What is life?', the updated Platonism of Heisenberg and, perhaps most interesting of all, the interaction of Carl Jung and Wolfgang Pauli in the latter's search for a unification of mind and matter. © 2005 Elsevier Inc. All rights reserved.
Abstract:
This is the second part of a review of the work of quantum physicists on the 'hard part' of the problem of mind. After an introduction which sets the scene and a brief review of contemporary work on the neural correlates of consciousness (NCC), the work of four prominent modern investigators is examined: J.C. Eccles/Friedrich Beck; Henry Stapp; Stuart Hameroff/Roger Penrose; David Bohm. With the exception of David Bohm, all attempt to show where in the brain's microstructure quantum effects could make themselves felt. It is reluctantly concluded that none has neurobiological plausibility. They are all instances, to paraphrase T.H. Huxley, of a beautiful hypothesis destroyed by ugly facts. David Bohm does not attempt to fit his new quantum physics to contemporary neurobiology but instead asks for a radical rethink of our conventional scientific paradigm. He suggests that we should look towards developing a 'pan-experientialism' or 'dual-aspect monism' where consciousness goes 'all the way down', and that the 'hard problem' is not soluble within the framework of ideas provided by 'classical' natural science.
Abstract:
Background: Electrosurgery units are widely employed in modern surgery. Advances in technology have enhanced the safety of these devices; nevertheless, accidental burns are still regularly reported. This study focuses on possible causes of sacral burns as a complication of the use of electrosurgery. Burns are caused by local densifications of the current, but the actual pathway of the current within the patient's body is unknown. Numerical electromagnetic analysis can help in understanding the issue. Methods: To this aim, an accurate heterogeneous model of the human body (including seventy-seven different tissues), electrosurgery electrodes, operating table and mattress was built to resemble a typical surgery condition. The patient lies supine on the mattress with the active electrode placed on the thorax and the return electrode on the back. Common operating frequencies of electrosurgery units were considered. Finite Difference Time Domain electromagnetic analysis was carried out to compute the spatial distribution of current density within the patient's body. A differential analysis, changing the electrical properties of the operating table from a conductor to an insulator, was also performed. Results: Results revealed that distributed capacitive coupling between the patient's body and the conductive operating table offers an alternative path to the electrosurgery current. The patient's anatomy, the positioning and the different electromagnetic properties of tissues promote a densification of the current at the head and sacral region. In particular, high values of current density were located behind the sacral bone and beneath the skin. This did not occur in the case of a non-conductive operating table. Conclusion: Results of the simulation highlight the role played by capacitive coupling between the return electrode and the conductive operating table.
The concentration of current density may result in an undesired rise in temperature, originating burns in body regions far from the electrodes. This outcome is concordant with the type of surgery-related sacral burns reported in the literature. Such burns cannot be immediately detected after surgery, but appear later and can be confused with bedsores. In addition, the dosimetric analysis suggests that reducing the capacitive coupling between the return electrode and the operating table can decrease or avoid this problem. © 2013 Bifulco et al.; licensee BioMed Central Ltd.
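Why stray capacitive coupling matters specifically at electrosurgery frequencies can be sketched with the textbook reactance formula |Z| = 1/(2*pi*f*C). The coupling capacitance value below is an assumed, illustrative figure, not one taken from the study; the point is only the frequency scaling.

```python
import math

def capacitive_reactance(frequency_hz: float, capacitance_f: float) -> float:
    """Impedance magnitude of a purely capacitive coupling: |Z| = 1/(2*pi*f*C)."""
    return 1.0 / (2.0 * math.pi * frequency_hz * capacitance_f)

# Assumed stray capacitance between the patient and a conductive table.
C_COUPLING = 200e-12  # 200 pF, illustrative only

# At mains frequency the stray path is effectively open (megaohms);
# at typical electrosurgery frequencies (hundreds of kHz) its impedance
# drops to the kiloohm range and can carry appreciable current.
for f in (50.0, 100e3, 500e3):
    print(f"{f:>9.0f} Hz -> {capacitive_reactance(f, C_COUPLING):,.0f} ohm")
```

This 1/f scaling is consistent with the paper's finding that a conductive table provides an alternative current path at operating frequencies, while an insulating table suppresses it.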
Abstract:
From Platonic and Galenic roots, the first well-developed ventricular theory of brain function is due to Bishop Nemesius, fourth century C.E. Although more interested in the Christian concept of the soul, St. Augustine, too, addressed the question of the location of the soul, a problem that has endured in various guises to the present day. Other notable contributions to ventricular psychology come from the ninth-century C.E. Arabic writer Qusta ibn Lūqā and an early European medical text written by the twelfth-century C.E. author Nicolai the Physician. By the time of Albertus Magnus, the so-called medieval cell doctrine was a well-developed model of brain function. By the sixteenth century, Vesalius no longer understood the ventricles to be imaginary cavities designed to provide a physical basis for faculty psychology, but as fluid-filled spaces in the brain whose function was yet to be determined.
Abstract:
This thesis addressed the problem of risk analysis in mental healthcare, with respect to the GRiST project at Aston University. That project provides a risk-screening tool based on the knowledge of 46 experts, captured as mind maps that describe relationships between risks and patterns of behavioural cues. Mind mapping, though, fails to impose control over content, and is not considered to formally represent knowledge. In contrast, this thesis treated GRiST's mind maps as a rich knowledge base in need of refinement; that process drew on existing techniques for designing databases and knowledge bases. Identifying well-defined mind map concepts, though, was hindered by spelling mistakes, and by ambiguity and lack of coverage in the tools used for researching words. A novel use of the Edit Distance overcame those problems, by assessing similarities between mind map texts, and between spelling mistakes and suggested corrections. That algorithm further identified stems, the shortest text string found in related word-forms. As opposed to existing approaches' reliance on built-in linguistic knowledge, this thesis devised a novel, more flexible text-based technique. An additional tool, Correspondence Analysis, found patterns in word usage that allowed machines to determine likely intended meanings for ambiguous words. Correspondence Analysis further produced clusters of related concepts, which in turn drove the automatic generation of novel mind maps. Such maps underpinned adjuncts to the mind mapping software used by GRiST; one such new facility generated novel mind maps, to reflect the collected expert knowledge on any specified concept. Mind maps from GRiST are stored as XML, which suggested storing them in an XML database. In fact, the entire approach here is "XML-centric", in that all stages rely on XML as far as possible. An XML-based query language allows users to retrieve information from the mind map knowledge base.
The approach, it was concluded, will prove valuable to mind mapping in general, and to detecting patterns in any type of digital information.
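The Edit Distance technique the thesis builds on can be sketched in a few lines: a standard Levenshtein distance, used here to rank candidate corrections for a misspelled term. The vocabulary and misspelling below are illustrative examples, not taken from the GRiST knowledge base.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum number of single-character
    insertions, deletions and substitutions turning a into b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # delete from a
                            curr[j - 1] + 1,             # insert into a
                            prev[j - 1] + (ca != cb)))   # substitute
        prev = curr
    return prev[-1]

def best_correction(word: str, vocabulary: list[str]) -> str:
    """Suggest the vocabulary entry closest to a misspelled word."""
    return min(vocabulary, key=lambda v: edit_distance(word, v))

# Illustrative vocabulary of mind-map terms; "anxeity" is a misspelling.
vocab = ["insomnia", "self-neglect", "anxiety", "depression"]
print(best_correction("anxeity", vocab))
```

The same distance computation, applied between related word-forms rather than between a misspelling and a dictionary, underlies the stem identification the thesis describes: the shared low-distance prefix of the word-forms is their stem.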