945 results for Mathematical Techniques - Integration


Relevance:

30.00%

Publisher:

Abstract:

Neoplastic tissue is typically highly vascularized, contains abnormal concentrations of extracellular proteins (e.g. collagen, proteoglycans) and has a high interstitial fluid pressure compared to most normal tissues. These changes result in an overall stiffening typical of most solid tumors. Elasticity Imaging (EI) is a technique which uses imaging systems to measure relative tissue deformation and thus noninvasively infer its mechanical stiffness. Stiffness is recovered from measured deformation by using an appropriate mathematical model and solving an inverse problem. The integration of EI with existing imaging modalities can improve their diagnostic and research capabilities. The aim of this work is to develop and evaluate techniques to image and quantify the mechanical properties of soft tissues in three dimensions (3D). To that end, this thesis presents and validates a method by which three dimensional ultrasound images can be used to image and quantify the shear modulus distribution of tissue mimicking phantoms. This work is presented to motivate and justify the use of this elasticity imaging technique in a clinical breast cancer screening study. The imaging methodologies discussed are intended to improve the specificity of mammography practices in general. During the development of these techniques, several issues concerning the accuracy and uniqueness of the result were elucidated. Two new algorithms for 3D EI are designed and characterized in this thesis. The first provides three dimensional motion estimates from ultrasound images of the deforming material. Its novel features include finite element interpolation of the displacement field, inclusion of prior information and the ability to enforce physical constraints. The roles of regularization, mesh resolution and an incompressibility constraint in the accuracy of the measured deformation are quantified. The estimated signal-to-noise ratios of the measured displacement fields are approximately 1800, 21 and 41 for the axial, lateral and elevational components, respectively. The second algorithm recovers the shear elastic modulus distribution of the deforming material by efficiently solving the three dimensional inverse problem as an optimization problem. This method utilizes finite element interpolations, the adjoint method to evaluate the gradient and a quasi-Newton BFGS method for optimization. Its novel features include the use of the adjoint method and TVD regularization with piecewise-constant interpolation. A source of non-uniqueness in this inverse problem is identified theoretically, demonstrated computationally, explained physically and overcome practically. Both algorithms were tested on ultrasound data of independently characterized tissue mimicking phantoms. The recovered elastic modulus was in all cases within 35% of the reference elastic contrast. Finally, the preliminary application of these techniques to tomosynthesis images showed the feasibility of imaging an elastic inclusion.
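
A minimal sketch of the general approach described here: recovering a stiffness distribution from measured displacements by minimising a regularised misfit with a quasi-Newton (L-BFGS) optimiser and an analytically derived, adjoint-style gradient. The toy 1-D spring-chain forward model and all parameter values are illustrative assumptions, not the thesis's finite element formulation.

```python
# Minimal sketch (not the thesis algorithm): gradient-based recovery of a
# stiffness distribution from measured displacements, using L-BFGS as the
# quasi-Newton optimizer.  The forward model is a toy 1-D spring chain;
# all names and parameter values are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 20
k_true = 1.0 + 2.0 * (np.arange(n) >= 10)      # stiff "inclusion" in the second half
u_meas = np.cumsum(1.0 / k_true)               # unit force: u_i = sum_{j<=i} 1/k_j
u_meas += 1e-3 * rng.standard_normal(n)        # measurement noise

def forward(k):
    return np.cumsum(1.0 / k)

def objective(log_k):
    """Data misfit + Tikhonov smoothness penalty, with its analytic gradient."""
    k = np.exp(log_k)                          # log-parametrisation keeps k > 0
    r = forward(k) - u_meas
    reg = 1e-4 * np.sum(np.diff(k) ** 2)
    f = 0.5 * np.sum(r ** 2) + reg
    # dJ/dk_j = -(1/k_j^2) * sum_{i>=j} r_i  (adjoint-like reverse accumulation)
    g_k = -np.cumsum(r[::-1])[::-1] / k ** 2
    g_k[:-1] += 2e-4 * -np.diff(k)
    g_k[1:] += 2e-4 * np.diff(k)
    return f, g_k * k                          # chain rule for log_k

res = minimize(objective, np.zeros(n), jac=True, method="L-BFGS-B")
print("recovered stiffness:", np.round(np.exp(res.x), 2))
```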

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Serotonin is a neurotransmitter that has been linked to a wide variety of behaviors including feeding and body-weight regulation, social hierarchies, aggression and suicidality, obsessive compulsive disorder, alcoholism, anxiety, and affective disorders. Full understanding of serotonergic systems in the central nervous system involves genomics, neurochemistry, electrophysiology, and behavior. Though associations have been found between functions at these different levels, in most cases the causal mechanisms are unknown. The scientific issues are daunting but important for human health because of the use of selective serotonin reuptake inhibitors and other pharmacological agents to treat disorders in the serotonergic signaling system. METHODS: We construct a mathematical model of serotonin synthesis, release, and reuptake in a single serotonergic neuron terminal. The model includes the effects of autoreceptors, the transport of tryptophan into the terminal, and the metabolism of serotonin, as well as the dependence of release on the firing rate. The model is based on real physiology determined experimentally and is compared to experimental data. RESULTS: We compare the variations in serotonin and dopamine synthesis due to meals and find that dopamine synthesis is insensitive to the availability of tyrosine but serotonin synthesis is sensitive to the availability of tryptophan. We conduct in silico experiments on the clearance of extracellular serotonin, normally and in the presence of fluoxetine, and compare to experimental data. We study the effects of various polymorphisms in the genes for the serotonin transporter and for tryptophan hydroxylase on synthesis, release, and reuptake. We find that, because of the homeostatic feedback mechanisms of the autoreceptors, the polymorphisms have smaller effects than one might expect. We compute the expected steady-state concentrations in serotonin transporter knockout mice and compare to experimental data. Finally, we study how the properties of the serotonin transporter and the autoreceptors give rise to the time courses of extracellular serotonin in various projection regions after a dose of fluoxetine. CONCLUSIONS: Serotonergic systems must respond robustly to important biological signals, while at the same time maintaining homeostasis in the face of normal biological fluctuations in inputs, expression levels, and firing rates. This is accomplished through the cooperative effect of many different homeostatic mechanisms, including special properties of the serotonin transporters and the serotonin autoreceptors. Many difficult questions remain in order to fully understand how serotonin biochemistry affects serotonin electrophysiology and vice versa, and how both are changed in the presence of selective serotonin reuptake inhibitors. Mathematical models are useful tools for investigating some of these questions.
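
An illustrative sketch of the kind of terminal model described, not the authors' published equations: two coupled ODEs for vesicular and extracellular serotonin with Michaelis-Menten synthesis and reuptake, firing-rate-dependent release and a simple autoreceptor feedback. All rate constants are invented placeholders.

```python
# Illustrative sketch only: a two-compartment ODE in the spirit of the model
# described (vesicular and extracellular serotonin), with Michaelis-Menten
# synthesis and reuptake and a simple autoreceptor feedback.  The equations
# and parameter values are placeholders, not the published model.
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, firing_rate=1.0, trp=100.0):
    s_ves, s_ext = y                      # vesicular and extracellular 5-HT (a.u.)
    autoreceptor = 1.0 / (1.0 + s_ext)    # inhibitory feedback on synthesis/release
    synthesis = 2.0 * trp / (50.0 + trp) * autoreceptor    # TPH, tryptophan-limited
    release = 0.5 * firing_rate * s_ves * autoreceptor     # firing-dependent release
    reuptake = 3.0 * s_ext / (0.2 + s_ext)                 # SERT, Michaelis-Menten
    return [synthesis - release, release - reuptake]

sol = solve_ivp(rhs, (0.0, 50.0), [1.0, 0.1], max_step=0.1)
print("steady extracellular 5-HT ~", round(sol.y[1, -1], 3))
```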

Relevance:

30.00%

Publisher:

Abstract:

Ethnomathematical research, together with digital technologies (WebQuest) and Drama-in-Education (DiE) techniques, can create a fruitful learning environment in a mathematics classroom—a hybrid/third space—enabling increased student participation and higher levels of cognitive engagement. This article examines how ethnomathematical ideas processed within the experiential environment established by the Drama-in-Education techniques challenged students' conceptions of the nature of mathematics, the ways in which students engaged with mathematics learning using mind and body, and the 'dialogue' that developed between the Discourse situated in a particular practice and the classroom Discourse of mathematics teaching. The analysis focuses on an interdisciplinary project based on an ethnomathematical study of a designing tradition carried out by the researchers themselves, involving a search for informal mathematics and its connections with context and culture. 10th grade students in a public school in Athens were introduced to the mathematics content via an original WebQuest based on this previous ethnomathematical study; Geometry content was further introduced and mediated using the Drama-in-Education (DiE) techniques. Students contributed to an unfolding dialogue between formal and informal knowledge, renegotiating both mathematical concepts and their perception of mathematics as a discipline.

Relevance:

30.00%

Publisher:

Abstract:

For sensitive optoelectronic components, traditional soldering techniques cannot be used because of the components' inherent sensitivity to thermal stresses. One such component is the Optoelectronic Butterfly Package, which houses a laser diode chip aligned to a fibre-optic cable. Even sub-micron misalignment of the fibre optic and laser diode chip can significantly reduce the performance of the device. The high cost of each unit requires that the number of components damaged via the laser soldering process is kept to a minimum. Mathematical modelling is undertaken to better understand the laser soldering process and to optimize operational parameters such as solder paste volume, copper pad dimensions, laser solder times for each joint, laser intensity and absorption coefficient. Validation of the model against experimental data will be completed and will lead to an optimization of the assembly process through an iterative modelling cycle. This will ultimately reduce costs, improve the process development time and increase consistency in the laser soldering process.
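
A generic sketch, under stated assumptions, of how laser intensity and absorption coefficient enter a transient heat-conduction model of the kind described. It is not the authors' model; the material properties, dimensions and laser settings are placeholder values.

```python
# Generic sketch (not the authors' model): explicit finite-difference solution of
# 1-D transient heat conduction in a solder joint heated by a laser, showing how
# laser intensity and absorption coefficient enter the surface boundary condition.
# Material properties and dimensions are illustrative placeholders.
import numpy as np

k, rho, cp = 50.0, 7400.0, 220.0        # W/m.K, kg/m^3, J/kg.K (solder-like values)
alpha = k / (rho * cp)                  # thermal diffusivity
L, n = 1e-3, 50                         # 1 mm deep domain, 50 nodes
dx = L / (n - 1)
dt = 0.4 * dx**2 / alpha                # explicit stability limit (<= 0.5)
laser_intensity = 1e7                   # W/m^2 (illustrative)
absorption = 0.35                       # fraction of laser power absorbed

T = np.full(n, 25.0)                    # initial temperature, deg C
for _ in range(int(0.5 / dt)):          # simulate 0.5 s of heating
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # absorbed laser flux applied at the top surface (node 0)
    Tn[0] = T[0] + dt / (rho * cp * dx) * (
        absorption * laser_intensity - k * (T[0] - T[1]) / dx)
    Tn[-1] = 25.0                       # far boundary held at ambient
    T = Tn

print(f"surface temperature after 0.5 s: {T[0]:.1f} C")
```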

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a method for the integration of several numerical analytical techniques used in microsystems design and failure analysis is presented. The analytical techniques are categorized into four groups: high-fidelity analytical tools, i.e. the finite element (FE) method; fast analytical tools, referring to reduced order modeling (ROM); optimization tools; and probability-based analytical tools. The characteristics of these four groups of tools are investigated. The interactions between the four tools are discussed and a methodology for coupling them is offered. This methodology consists of three stages, namely reduced order modeling, deterministic optimization and probabilistic optimization. Using this methodology, a case study on the optimization of a solder joint is conducted. It is shown that these analysis techniques have a mutual relationship of interaction and complementation, and that their combined application can fully exploit their advantages and satisfy various design requirements. The case study shows that the coupling method for the different tools provided by this paper is effective and efficient, and that it is highly relevant to the design and reliability analysis of microsystems.
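
A hedged sketch of the three-stage methodology (reduced order modeling, deterministic optimization, probabilistic optimization): the expensive_model function stands in for a high-fidelity FE analysis, and the geometry variable, process tolerance and damage threshold are illustrative assumptions.

```python
# Hedged sketch of the three-stage coupling described: (1) build a reduced-order
# surrogate from a few expensive "high-fidelity" runs, (2) optimise on the
# surrogate, (3) propagate input uncertainty around the optimum by Monte Carlo.
# expensive_model() stands in for an FE analysis; everything here is illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

def expensive_model(fillet_height):
    """Placeholder for an FE run, e.g. solder-joint fatigue damage vs geometry."""
    return (fillet_height - 0.6) ** 2 + 0.05 * np.sin(8 * fillet_height)

# Stage 1: reduced-order model -- quadratic response surface from 7 FE samples
x_doe = np.linspace(0.2, 1.0, 7)
y_doe = np.array([expensive_model(x) for x in x_doe])
rom = np.polynomial.Polynomial.fit(x_doe, y_doe, deg=2)

# Stage 2: deterministic optimisation on the cheap surrogate
opt = minimize_scalar(rom, bounds=(0.2, 1.0), method="bounded")
x_star = opt.x

# Stage 3: probabilistic analysis -- manufacturing scatter around the optimum
rng = np.random.default_rng(1)
samples = rng.normal(x_star, 0.05, 10_000)          # assumed process tolerance
damage = rom(samples)
print(f"optimum fillet height ~ {x_star:.3f}")
print(f"P(damage > 0.02) ~ {np.mean(damage > 0.02):.3f}")
```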

Relevance:

30.00%

Publisher:

Abstract:

A design methodology based on numerical modelling, integrated with optimisation techniques and statistical methods, to aid the process control of micro- and nano-electronics based manufacturing processes is presented in this paper. The design methodology is demonstrated for a micro-machining process called Focused Ion Beam (FIB). This process has been modelled to help understand how a pre-defined geometry of micro- and nano-structures can be achieved using this technology. The process performance is characterised on the basis of Reduced Order Models (ROMs), which are generated using results from a mathematical model of the Focused Ion Beam and Design of Experiments (DoE) methods. Two ion beam sources, argon and gallium ions, have been used to compare and quantify the process variable uncertainties that can be observed during the milling process. The evaluations of the process performance take into account the uncertainties and variations of the process variables and are used to identify their impact on the reliability and quality of the fabricated structure. An optimisation-based design task is then defined to identify the optimal process conditions, by varying the process variables, so that certain quality objectives and requirements are achieved and imposed constraints are satisfied. The software tools used and developed to demonstrate the design methodology are also presented.
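
An illustrative sketch of the DoE-plus-ROM workflow described: a full-factorial design over two process variables, a least-squares reduced-order model, and uncertainty propagation through it. The milling response function and all numbers are invented stand-ins, not the paper's FIB model.

```python
# Illustrative sketch of the DoE-plus-ROM idea: a two-factor full-factorial
# design for an FIB milling response (milled depth), a linear reduced-order
# model fitted by least squares, and propagation of process-variable
# uncertainty through it.  The response function and numbers are invented.
import itertools
import numpy as np

def fib_milling_depth(beam_current_nA, dwell_time_us):
    """Stand-in for the physical FIB model (depth in nm)."""
    return 4.0 * beam_current_nA * dwell_time_us + 0.3 * beam_current_nA ** 2

# Full-factorial DoE over the two process variables
levels_I = [0.5, 1.0, 1.5]                     # beam current, nA
levels_t = [1.0, 2.0, 3.0]                     # dwell time, us
runs = np.array(list(itertools.product(levels_I, levels_t)))
depth = np.array([fib_milling_depth(I, t) for I, t in runs])

# Reduced-order model: depth ~ b0 + b1*I + b2*t + b3*I*t
X = np.column_stack([np.ones(len(runs)), runs[:, 0], runs[:, 1],
                     runs[:, 0] * runs[:, 1]])
coef, *_ = np.linalg.lstsq(X, depth, rcond=None)

# Propagate an assumed +/-5% beam-current uncertainty through the ROM
rng = np.random.default_rng(2)
I_samples = rng.normal(1.0, 0.05, 5000)
pred = coef[0] + coef[1] * I_samples + coef[2] * 2.0 + coef[3] * I_samples * 2.0
print("ROM coefficients:", np.round(coef, 3))
print(f"depth spread (std) at nominal point: {pred.std():.2f} nm")
```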

Relevance:

30.00%

Publisher:

Abstract:

This article studies the gender values that are promoted both in the literacy courses for Gypsy women who are beneficiaries of the Social Integration Revenue policy of the Region of Madrid and in the events organized for this group by public institutions and NGOs. The process of "socialization" that occurs in the educational groups for Gypsy women is focused on constructing an image of what it is to be a "modern Gypsy woman". Through multiple mechanisms and discursive techniques, a specific conception of gender equality is transmitted in these educational spaces. In addition, Gypsy women are continually urged to assume certain values and social practices (of gender identity, of "citizenship", of parenting, etc.), while an archetype of the "Gypsy Woman" is constructed which condenses powerful stereotypes and prejudices about "Gypsy culture" and the gender relations characteristic of this group.

Relevance:

30.00%

Publisher:

Abstract:

Artifact removal from physiological signals is an essential component of the biosignal processing pipeline. The need for powerful and robust methods for this process has become particularly acute as healthcare technology deployment undergoes a transition from the current hospital-centric setting toward a wearable and ubiquitous monitoring environment. Currently, determining the relative efficacy and performance of the multiple artifact removal techniques available on real-world data can be problematic, due to incomplete information on the uncorrupted desired signal. The majority of techniques are presently evaluated using simulated data, and therefore the quality of the conclusions is contingent on the fidelity of the model used. Consequently, in the biomedical signal processing community, there is considerable focus on the generation and validation of appropriate signal models for use in artifact suppression. Most approaches rely on mathematical models which capture suitable approximations to the signal dynamics or underlying physiology and therefore introduce some uncertainty into subsequent predictions of algorithm performance. This paper describes a more empirical approach to modeling the desired signal, demonstrated here for functional brain monitoring tasks, which allows the procurement of a ground-truth signal that is highly correlated with the true desired signal contaminated by artifacts. The availability of this ground truth, together with the corrupted signal, can then aid in determining the efficacy of selected artifact removal techniques. A number of commonly implemented artifact removal techniques were evaluated using the described methodology to validate the proposed novel test platform. © 2012 IEEE.
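
A minimal sketch of the evaluation idea: with a ground-truth signal in hand, the benefit of an artifact-removal step can be quantified directly, for example as an SNR improvement. The synthetic signal, artifact and low-pass filter below are illustrative stand-ins for the paper's brain-monitoring data and candidate techniques.

```python
# Minimal sketch: when a ground-truth signal is available, the benefit of an
# artifact-removal step can be quantified directly, e.g. as an SNR improvement.
# The "desired" signal, artifact and low-pass filter are illustrative stand-ins.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                                           # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ground_truth = np.sin(2 * np.pi * 1.0 * t)           # slow "physiological" component
artifact = 0.8 * np.sin(2 * np.pi * 50.0 * t)        # mains-like contamination
corrupted = ground_truth + artifact

# Candidate artifact-removal technique: zero-phase low-pass filter at 10 Hz
b, a = butter(4, 10.0 / (fs / 2), btype="low")
cleaned = filtfilt(b, a, corrupted)

def snr_db(reference, estimate):
    noise = estimate - reference
    return 10 * np.log10(np.sum(reference ** 2) / np.sum(noise ** 2))

print(f"SNR before removal: {snr_db(ground_truth, corrupted):5.1f} dB")
print(f"SNR after removal:  {snr_db(ground_truth, cleaned):5.1f} dB")
```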

Relevance:

30.00%

Publisher:

Abstract:

Over the last decade there has been a rapid global increase in wind power, stimulated by energy and climate policies. However, as wind power is inherently variable and stochastic over a range of time scales, additional system balancing is required to ensure system reliability and stability. This paper reviews the technical, policy and market challenges to achieving ambitious wind power penetration targets in Ireland's All-Island Grid and examines a number of measures proposed to address these challenges. Current government policy in Ireland is to address these challenges with additional grid reinforcement, interconnection and open-cycle gas plant. More recently, smart grids combined with demand-side management and electric vehicles have also been presented as options to mitigate the variability of wind power. In addition, the transmission system operators have developed wind-farm-specific grid codes requiring improved turbine controls and wind power forecasting techniques.

Relevance:

30.00%

Publisher:

Abstract:

The ability of building information modeling (BIM) to positively impact projects in the AEC sector through greater collaboration and integration is widely acknowledged. This paper aims to examine the development of BIM and how it can contribute to the cold-formed steel (CFS) building industry. This is achieved through the adoption of a qualitative methodology encompassing a literature review and exploratory interviews with industry experts, culminating in the development of e-learning material for the sector. In doing so, the research team has collaborated with one of the United Kingdom's largest cold-formed steel designer/fabricators. By demonstrating the capabilities of BIM software and producing technical and informative videos, this project delivers two key outcomes. The first is to provide invaluable assistance in the transition from traditional processes to fully collaborative 3D BIM, as required of all public sector projects by 2016 under the UK Government's "Government Construction Strategy". The second is to demonstrate BIM's potential not only within CFS companies, but also within the AEC sector as a whole. Given the flexibility, adaptability and interoperability of BIM software, the results indicate that BIM and its underlying ethos are a key tool in the development of the industry as a whole.

Relevance:

30.00%

Publisher:

Abstract:

Chili powder is a globally traded commodity which has been found to be adulterated with Sudan dyes from 2003 onwards. In this study, chili powders were adulterated with varying quantities of Sudan I dye (0.1-5%) and spectra were generated using near infrared reflectance spectroscopy (NIRS) and Raman spectroscopy (on a spectrometer with a sample compartment modified as part of the study). Chemometrics were applied to the spectral data to produce quantitative and qualitative calibration models and prediction statistics. For the quantitative models, coefficients of determination (R²) were found to be 0.891-0.994, depending on which spectral data (NIRS/Raman) were processed, the mathematical algorithm used and the data pre-processing applied. The corresponding values for the root mean square error of calibration (RMSEC) and root mean square error of prediction (RMSEP) were found to be 0.208-0.851% and 0.141-0.831%, respectively, once again depending on the spectral data and the chemometric treatment applied to the data. A comparison of the chemometric parameters indicates that the NIR spectroscopy based models are superior to the models produced from the Raman spectral data. The limit of detection (LOD), based on analysis of 20 blank chili powders against each calibration model, was 0.25% and 0.88% for the NIR and Raman data, respectively. In addition, adopting a qualitative approach with the spectral data and applying PCA or PLS-DA, it was possible to discriminate adulterated chili powders from non-adulterated chili powders.
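
A hedged sketch of the quantitative chemometric workflow (PLS regression with RMSEC and RMSEP), run on synthetic spectra rather than the study's NIR/Raman data; the spectral model, adulteration levels and number of latent variables are illustrative choices only.

```python
# Hedged sketch of the quantitative chemometric workflow described (PLS
# regression with RMSEC/RMSEP), run on synthetic "spectra" rather than the
# study's NIR/Raman data; the spectral model and adulteration levels are
# illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 80, 200
sudan_pct = rng.uniform(0.1, 5.0, n_samples)                   # adulteration level, %
dye_band = np.exp(-0.5 * ((np.arange(n_wavelengths) - 120) / 8.0) ** 2)
spectra = (np.outer(sudan_pct, dye_band)                       # dye contribution
           + rng.normal(0, 0.05, (n_samples, n_wavelengths)))  # baseline noise

X_cal, X_val, y_cal, y_val = train_test_split(
    spectra, sudan_pct, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_cal, y_cal)

rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
print(f"RMSEC = {rmsec:.3f}%  RMSEP = {rmsep:.3f}%")
```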

Relevance:

30.00%

Publisher:

Abstract:

This study applies spatial statistical techniques, including cokriging, to integrate airborne geophysical (radiometric) data with ground-based measurements of peat depth and soil organic carbon (SOC) to monitor change in peat cover for carbon stock calculations. The research is part of the EU funded Tellus Border project and is supported by the INTERREG IVA development programme of the European Regional Development Fund, which is managed by the Special EU Programmes Body (SEUPB). The premise is that saturated peat attenuates the radiometric signal from underlying soils and rocks. Contemporaneous ground-based measurements were collected to corroborate mapped estimates and to develop a statistical model for volumetric carbon content (VCC) to 0.5 metres. Field measurements included ground penetrating radar, gamma ray spectrometry and a soil sampling methodology which measured bulk density and soil moisture to determine VCC. One aim of the study was to explore whether airborne radiometric survey data can be used to establish VCC across a region. To account for the footprint of the airborne radiometric data, five cores were obtained at each soil sampling location: one at the centre of the ground radiometric equivalent sample location and one at each of the four corners, 20 metres apart. This soil sampling strategy replicated the methodology deployed for the Tellus Border geochemistry survey. Two key issues arising from this work will be discussed: the first addresses the integration of different sampling supports for the airborne and ground measured data, and the second concerns the compositional nature of the VCC data.
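
A simplified sketch of integrating an airborne covariate with sparse ground measurements. Full cokriging models the cross-covariance of both variables; shown here instead is a common stand-in, regression kriging, with a Gaussian process for the spatial residuals. All coordinates, radiometric values and VCC values are synthetic.

```python
# Simplified sketch of integrating airborne radiometric data with sparse ground
# measurements.  True cokriging models the cross-covariance of both variables;
# here a common simplification (regression kriging) is shown: a trend on the
# airborne covariate plus a Gaussian-process model of the spatial residuals.
# All coordinates and values are synthetic placeholders.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_ground = 40
coords = rng.uniform(0, 10_000, (n_ground, 2))                # ground sample locations, m
radiometric = rng.uniform(5, 40, n_ground)                    # airborne signal at samples
vcc = 30.0 - 0.5 * radiometric + rng.normal(0, 1.5, n_ground) # volumetric C content (synthetic)

# Trend: attenuation of the radiometric signal explains part of the VCC pattern
trend = LinearRegression().fit(radiometric[:, None], vcc)
residuals = vcc - trend.predict(radiometric[:, None])

# Spatial model of the residuals (kriging analogue)
gp = GaussianProcessRegressor(
    kernel=1.0 * RBF(length_scale=2000.0) + WhiteKernel(noise_level=1.0),
    normalize_y=True)
gp.fit(coords, residuals)

# Prediction at an unsampled location with a known airborne value
new_xy, new_rad = np.array([[5000.0, 5000.0]]), np.array([[20.0]])
pred = trend.predict(new_rad) + gp.predict(new_xy)
print(f"estimated VCC at new location: {pred[0]:.1f}")
```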

Relevance:

30.00%

Publisher:

Abstract:

This thesis falls within the field of RF and microwave circuit simulation and aims to study innovative computational tools capable of efficiently simulating highly heterogeneous nonlinear circuits that combine analogue RF and baseband blocks with digital blocks operating on multiple time scales. The numerical methods proposed in this thesis are based on multidimensional strategies, which use multiple time variables defined on warped and unwarped time domains to deal effectively with the disparities between the different time scales. In order to exploit the different rates of temporal evolution of rapidly varying currents and voltages (active state variables) and slowly varying currents and voltages (latent state variables), advanced numerical techniques are used to operate within the multidimensional spaces, such as multirate Runge-Kutta algorithms or the method of lines. Circuit partitioning strategies are also presented, which allow a circuit to be split into sub-circuits in a fully automatic way according to the evolution rates of its state variables. For strongly nonlinear problems, several innovative simulation methods operating strictly in the time domain are proposed. For problems with moderate nonlinearities, a new hybrid frequency-time method is proposed, based on a combination of one-dimensional step-by-step integration and the envelope-following method with harmonic balance. The performance of the methods is tested by simulating several illustrative examples, with very promising results. A comparative analysis between the methods proposed here and currently available RF simulation methods reveals considerable gains in computation speed.
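
An illustrative sketch of the basic multirate idea only, not the thesis's algorithms: the fast "active" state variable is refined with a small time step while the slow "latent" variable advances with a much larger one. A linear two-state system and forward Euler are used for clarity; the thesis employs multirate Runge-Kutta schemes on multidimensional (warped and unwarped) time domains.

```python
# Illustrative sketch only (not the thesis algorithms): multirate integration,
# where a fast "active" state variable is stepped with a small time step while
# a slow "latent" variable advances with a much larger one.  A coupled linear
# two-state system and forward Euler are used for clarity.
import numpy as np

def f_fast(x_fast, x_slow):          # e.g. an RF-rate node voltage
    return -1000.0 * x_fast + 10.0 * x_slow

def f_slow(x_fast, x_slow):          # e.g. a baseband/bias node voltage
    return -1.0 * x_slow + 0.1 * x_fast

H = 1e-3                             # macro step for the latent (slow) variable
m = 100                              # micro steps per macro step for the active one
h = H / m
x_fast, x_slow = 1.0, 1.0

for _ in range(int(0.05 / H)):       # simulate 50 ms
    slope_slow = f_slow(x_fast, x_slow)
    for _ in range(m):               # refine only the fast variable
        x_fast += h * f_fast(x_fast, x_slow)
    x_slow += H * slope_slow         # one cheap step for the slow variable

print(f"x_fast = {x_fast:.4e}, x_slow = {x_slow:.4f}")
```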