22 results for Finite Volume Methods
Abstract:
Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited to this task. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional that is minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
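To make the connection concrete, here is a minimal sketch of one of the solvers named above, iterated running medians, applied to a synthetic noisy step signal. The window half-width, iteration cap and test signal are illustrative choices, not values from the paper.

```python
# Minimal sketch: PWC denoising by iterated running medians.
import numpy as np

def iterated_running_median(x, half_width=3, max_iter=50):
    """Repeatedly apply a sliding-window median until the signal stops changing."""
    y = np.asarray(x, dtype=float).copy()
    n = len(y)
    for _ in range(max_iter):
        z = np.empty_like(y)
        for i in range(n):
            lo, hi = max(0, i - half_width), min(n, i + half_width + 1)
            z[i] = np.median(y[lo:hi])
        if np.allclose(z, y):
            break
        y = z
    return y

# Example: a noisy two-level (piecewise constant) signal
rng = np.random.default_rng(0)
clean = np.r_[np.zeros(50), np.ones(50)]
noisy = clean + 0.2 * rng.standard_normal(clean.size)
denoised = iterated_running_median(noisy)
```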
Abstract:
Background: There is substantial evidence that cognitive deficits and brain structural abnormalities are present in patients with Bipolar Disorder (BD) and in their first-degree relatives. Previous studies have demonstrated associations between cognition and functional outcome in BD patients but have not examined the role of brain morphological changes. Similarly, the functional impact of either cognition or brain morphology in relatives remains unknown. Therefore we focused on delineating the relationship between psychosocial functioning, cognition and brain structure, in relation to disease expression and genetic risk for BD. Methods: Clinical, cognitive and brain structural measures were obtained from 41 euthymic BD patients and 50 of their unaffected first-degree relatives. Psychosocial function was evaluated using the Global Assessment of Functioning (GAF) scale. We examined the relationship between level of functioning and general intellectual ability (IQ), memory, attention, executive functioning, symptomatology, illness course and total gray matter, white matter and cerebrospinal fluid volumes. Limitations: Cross-sectional design. Results: Multiple regression analyses revealed that IQ, total white matter volume and a predominantly depressive illness course were independently associated with functional outcome in BD patients, but not in their relatives, and accounted for a substantial proportion (53%) of the variance in patients' GAF scores. There were no significant domain-specific associations between cognition and outcome after consideration of IQ. Conclusions: Our results emphasise the role of IQ and white matter integrity in relation to outcome in BD and carry significant implications for treatment interventions. © 2010 Elsevier B.V.
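As a purely illustrative aside, the kind of analysis described above (a multiple regression of a functioning score on IQ, white matter volume and illness course, summarised by the proportion of variance explained) can be sketched on synthetic data as follows; all variable names, units and coefficients are hypothetical and unrelated to the study's results.

```python
# Illustrative multiple regression on synthetic data; R^2 plays the role of
# the reported "proportion of variance explained". No values from the study.
import numpy as np

rng = np.random.default_rng(1)
n = 41                                   # nominally matches the patient sample size only
iq = rng.normal(100, 15, n)
wm_volume = rng.normal(500, 40, n)       # hypothetical units (cm^3)
depressive_course = rng.integers(0, 2, n).astype(float)
gaf = 0.3 * iq + 0.05 * wm_volume - 8 * depressive_course + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), iq, wm_volume, depressive_course])
beta, *_ = np.linalg.lstsq(X, gaf, rcond=None)
residuals = gaf - X @ beta
r_squared = 1 - residuals.var() / gaf.var()
print("coefficients:", beta, "R^2:", round(r_squared, 2))
```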
Abstract:
The human accommodation system has been extensively examined for over a century, with a particular focus on trying to understand the mechanisms that lead to the loss of accommodative ability with age (presbyopia). The accommodative process, along with the potential causes of presbyopia, remains disputed, hindering efforts to develop methods of restoring accommodation in the presbyopic eye. One method that can be used to provide insight into this complex area is Finite Element Analysis (FEA). The effectiveness of FEA in modelling the accommodative process has been illustrated by a number of accommodative FEA models developed to date. However, these previous models have had limitations, principally due to the variation in data on the geometry of the accommodative components, combined with sparse measurements of their material properties. Despite advances in available data, continued oversimplification has occurred in the modelling of the crystalline lens structure and the zonular fibres that surround the lens. A new accommodation model is proposed by the author that aims to overcome these limitations. A novel representation of the zonular structure was developed, combined with updated lens and capsule modelling methods. The model has been designed to be adaptable so that a range of different age accommodation systems can be modelled, allowing the age-related changes that occur to be simulated. The new modelling methods were validated by comparing the changes induced within the model to available in vivo data, leading to the definition of three different age models. These were used in an extended sensitivity study on age-related changes, where individual parameters were altered to investigate their effect on the accommodative process. The material properties were found to have the largest impact on the decline in accommodative ability, in particular compared to changes in ciliary body movement or zonular structure. Novel data on the importance of capsule stiffness and thickness were also established. The new model detailed within this thesis provides further insight into the accommodation mechanism, as well as a foundation for future, more detailed investigations into accommodation, presbyopia and accommodative restoration techniques.
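For readers unfamiliar with FEA, the following is a deliberately generic sketch of the assemble-constrain-solve workflow on a 1D linear-elastic bar; it is not the thesis model, and all geometry, stiffness and load values are arbitrary.

```python
# Generic 1D FEA sketch: assemble element stiffness matrices, apply a
# boundary condition and a load, then solve for nodal displacements.
import numpy as np

n_elem = 10                      # number of 2-node bar elements
length = 1.0                     # total bar length (arbitrary units)
EA = 1.0                         # axial stiffness E*A (arbitrary units)
le = length / n_elem             # element length

# Assemble the global stiffness matrix from identical elements
K = np.zeros((n_elem + 1, n_elem + 1))
ke = (EA / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += ke

# Load: unit axial force at the free end; constraint: fixed at node 0
f = np.zeros(n_elem + 1)
f[-1] = 1.0
free = np.arange(1, n_elem + 1)

u = np.zeros(n_elem + 1)
u[free] = np.linalg.solve(K[np.ix_(free, free)], f[free])
print("tip displacement:", u[-1])   # analytic value is F*L/(E*A) = 1.0
```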
Abstract:
Respiratory-volume monitoring is an indispensable part of mechanical ventilation. Here we present a new method of respiratory-volume measurement based on a single fibre-optic long-period bending sensor and the correlation between torso curvature and lung volume. Unlike the commonly used airflow-based measurement methods, the proposed sensor is drift-free and immune to air leaks. In the paper, we explain the working principle of the sensor and a two-step calibration-test measurement procedure, and present results that establish a linear correlation between the change in the local thorax curvature and the change in the lung volume. We also discuss the advantages and limitations of these sensors with respect to the current standards. © 2013 IEEE.
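A hedged sketch of the calibration idea, fitting a linear map from the change in local thorax curvature to the change in lung volume and then applying it to new sensor readings, might look as follows; the data, units and noise level are synthetic stand-ins, not values from the paper.

```python
# Two-step sketch: (1) calibrate a linear curvature-to-volume relation against
# a reference measurement, (2) estimate volume from new sensor readings.
import numpy as np

rng = np.random.default_rng(2)

# Step 1: calibration against a reference (e.g. a spirometer)
curvature_change = np.linspace(0.0, 1.0, 20)                          # arbitrary sensor units
reference_volume = 2.5 * curvature_change + rng.normal(0, 0.05, 20)   # litres (synthetic)
slope, intercept = np.polyfit(curvature_change, reference_volume, deg=1)

# Step 2: apply the fitted linear relation to new sensor readings
new_readings = np.array([0.1, 0.4, 0.8])
estimated_volume = slope * new_readings + intercept
print(estimated_volume)
```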
Abstract:
The computational mechanics approach has been applied to the orientational behavior of water molecules in a molecular dynamics simulated water–Na+ system. The distinctively different statistical complexity of water molecules in the bulk and in the first solvation shell of the ion is demonstrated. It is shown that the molecules undergo more complex orientational motion when surrounded by other water molecules compared to those constrained by the electric field of the ion. However, the spatial coordinates of the oxygen atom show the opposite complexity behavior, in that complexity is higher for the solvation-shell molecules. New information is provided about the dynamics of water molecules in the solvation shell that is additional to that given by traditional methods of analysis.
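As a rough illustration of the computational-mechanics idea (not the estimator used in the study), one can symbolise a trajectory, group length-L histories with approximately the same next-symbol distribution into candidate causal states, and take the Shannon entropy of the resulting state distribution as a statistical-complexity estimate:

```python
# Toy statistical-complexity estimate from a symbolic sequence.
import numpy as np
from collections import defaultdict

def statistical_complexity(symbols, L=3, tol=0.1):
    # Empirical next-symbol distribution conditioned on each length-L history
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(symbols) - L):
        past = tuple(symbols[i:i + L])
        counts[past][symbols[i + L]] += 1

    alphabet = sorted(set(symbols))
    hist_probs, hist_weights = {}, {}
    for past, nxt in counts.items():
        total = sum(nxt.values())
        hist_probs[past] = np.array([nxt.get(a, 0) / total for a in alphabet])
        hist_weights[past] = total

    # Greedily merge histories whose predictive distributions agree within tol
    states = []          # list of [representative distribution, total weight]
    for past, p in hist_probs.items():
        for s in states:
            if np.abs(s[0] - p).max() < tol:
                s[1] += hist_weights[past]
                break
        else:
            states.append([p, hist_weights[past]])

    w = np.array([s[1] for s in states], dtype=float)
    w /= w.sum()
    return -(w * np.log2(w)).sum()

# Example with a toy binary sequence standing in for a discretised trajectory
rng = np.random.default_rng(3)
seq = list(rng.integers(0, 2, 2000))
print(statistical_complexity(seq))
```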
Abstract:
One of the most pressing demands on electrophysiology applied to the diagnosis of epilepsy is the non-invasive localization of the neuronal generators responsible for brain electrical and magnetic fields (the so-called inverse problem). These neuronal generators produce primary currents in the brain, which together with passive currents give rise to the EEG signal. Unfortunately, the signal we measure on the scalp surface does not directly indicate the location of the active neuronal assemblies. This reflects the ambiguity of the underlying static electromagnetic inverse problem, partly due to the relatively limited number of independent measures available. A given electric potential distribution recorded at the scalp can be explained by the activity of infinitely many different configurations of intracranial sources. In contrast, the forward problem, which consists of computing the potential field at the scalp from known source locations and strengths with known geometry and conductivity properties of the brain and its layers (CSF/meninges, skin and skull), i.e. the head model, has a unique solution. The head models vary from the computationally simpler spherical models (three or four concentric spheres) to the realistic models based on the segmentation of anatomical images obtained using magnetic resonance imaging (MRI). Realistic models – computationally intensive and difficult to implement – can separate different tissues of the head and account for the convoluted geometry of the brain and the significant inter-individual variability. In real-life applications, if the assumptions of the statistical, anatomical or functional properties of the signal and the volume in which it is generated are meaningful, a true three-dimensional tomographic representation of sources of brain electrical activity is possible in spite of the ‘ill-posed’ nature of the inverse problem (Michel et al., 2004). The techniques used to achieve this are now referred to as electrical source imaging (ESI) or magnetic source imaging (MSI). The first issue to influence reconstruction accuracy is spatial sampling, i.e. the number of EEG electrodes. It has been shown that this relationship is not linear, reaching a plateau at about 128 electrodes, provided the spatial distribution is uniform. The second factor is related to the different properties of the source localization strategies used with respect to the hypothesized source configuration.
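To illustrate why the forward problem is linear and the inverse problem under-determined, here is a minimal sketch using the potential of a current dipole in an infinite homogeneous conductor, a deliberately crude stand-in for the spherical or realistic head models discussed above; the electrode positions, source locations and dipole moments are arbitrary.

```python
# Forward-problem sketch: scalp-like potentials from known dipolar sources.
# With the forward operator in hand, potentials from any set of sources are a
# linear superposition, while a handful of measurements cannot uniquely
# determine the sources (the inverse problem).
import numpy as np

SIGMA = 0.33  # conductivity in S/m, a commonly used brain-tissue value

def dipole_potential(r_sensor, r_dipole, moment):
    """Potential of a current dipole in an unbounded homogeneous conductor."""
    d = r_sensor - r_dipole
    return (moment @ d) / (4 * np.pi * SIGMA * np.linalg.norm(d) ** 3)

# A few 'electrode' positions (m) and two hypothetical dipolar sources
electrodes = np.array([[0.0, 0.0, 0.10], [0.05, 0.0, 0.09], [0.0, 0.05, 0.09]])
sources = np.array([[0.0, 0.0, 0.06], [0.02, 0.0, 0.05]])
moments = np.array([[0.0, 0.0, 1e-8], [1e-8, 0.0, 0.0]])   # A·m

potentials = np.array([
    sum(dipole_potential(e, s, m) for s, m in zip(sources, moments))
    for e in electrodes
])
print(potentials)   # three measurements, six unknown moment components
```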
Abstract:
Book review: Organizations in Time, edited by R. Daniel Wadhwani and Marcelo Bucheli, Oxford University Press, 2014. The title of this edited volume is slightly misleading, as its various contributions explore the potential for more historical analysis in organization studies rather than addressing issues associated with time and organizing. Hopefully this will not distract from the important achievement of this volume – important especially for business historians – in further expanding and integrating business history into management and organization studies. The various contributions, elegantly tied together by R. Daniel Wadhwani and Marcelo Bucheli in their substantial introduction (which, by the way, presents a significant contribution in its own right), open up new sets of questions, especially in terms of future methodological and theoretical developments in the field. This book also reflects the changing institutional location of business historians, who increasingly make their careers in business schools rather than history departments, especially in Europe, reopening old questions of history as a social science. There have been several calls to teach more history in business education, such as the Carnegie Foundation report (2011) that found undergraduate business education too narrow in focus and highlighted the need to integrate more liberal arts teaching into the curriculum. However, in the contemporary research-driven environment of business and management schools, historical understanding is unlikely to permeate the curriculum if historical analysis cannot first deliver significant theoretical contributions. This is the central theme around which this edited volume revolves, and it marks a milestone in this ongoing debate. (In the spirit of full disclosure, I should add that even though I did not contribute to this volume, I have coauthored with several of its contributors and view this book as central to my current research practice.)