951 results for Point-charge Model
Abstract:
The coupling of electrically active, living cells to extracellular sensor systems opens up a wide range of possibilities in the field of biosensing. This work contributes to a deeper understanding of the electrical coupling mechanisms between the biological and electronic parts of such hybrid systems. Three main areas were addressed: A system for extracellular signal recording from living cells, consisting of a sensor chip, a preamplifier head, and a main amplifier, was further developed. Either metal-microelectrode chips with 64 channels or field-effect transistor (FET) chips with 16 channels were used as sensors. In addition, special FET sensors with back-side contacts were fabricated and employed. The electrical coupling of individual nerve cells of the neuronal cell lines SH-SY5Y and TR14, or of primary cultured neurons from the brainstem or hippocampus of embryonic rats, with the extracellular sensors was investigated. Using the whole-cell patch-clamp technique, the contributions of the voltage-gated Na+ and K+ ion channels to the extracellular signal shape were identified. Simulation of the signals with an equivalent circuit (point-contact model) implemented in PSPICE indicates a strong dependence of the signal shapes on concentration changes of Na+ and K+ ions in the volume between the cell and the ion-sensitive transistors. An empirically extended point-contact model was subsequently proposed. In the third part of the work, cell layers of cardiomyocytes from embryonic rats were cultured on the extracellular sensors. The suitability of such a hybrid sensor as a model heart for pharmaceutical screening was confirmed by measurements with cardiac stimulants and relaxants.
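To first order, the point-contact model mentioned above treats the junction voltage seen by the transistor as the voltage drop of the total membrane current across the seal resistance of the cell-sensor cleft. A minimal numerical sketch in Python; the seal resistance, time base, and the Gaussian current waveform are illustrative assumptions, not values or code from the thesis:

```python
# Minimal point-contact model sketch: the extracellular (junction) voltage
# V_J recorded by the transistor is approximated as the voltage drop of the
# total membrane current I_M across the seal resistance R_seal of the
# cell-sensor cleft. All parameter values are illustrative assumptions.
import math

R_SEAL = 5e6      # seal resistance of the cleft [Ohm] (assumed)
DT = 1e-5         # time step [s]
N_STEPS = 200

def membrane_current(t):
    """Toy Na+/K+ current waveform [A]: a fast inward (Na+) pulse
    followed by a slower outward (K+) pulse."""
    i_na = -1e-9 * math.exp(-((t - 5e-4) / 1e-4) ** 2)   # inward spike
    i_k = 0.6e-9 * math.exp(-((t - 1e-3) / 3e-4) ** 2)   # outward tail
    return i_na + i_k

# Junction voltage trace: V_J(t) = R_seal * I_M(t)
trace = [R_SEAL * membrane_current(k * DT) for k in range(N_STEPS)]
peak_uV = min(trace) * 1e6  # most negative excursion, in microvolts
print(f"peak junction voltage: {peak_uV:.1f} uV")
```

The biphasic shape of `trace` mirrors why the extracellular signal form depends so strongly on which ion channels carry the current: the inward and outward phases leave opposite-signed signatures in the cleft.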
Abstract:
A patient-specific surface model of the proximal femur plays an important role in planning and supporting various computer-assisted surgical procedures, including total hip replacement, hip resurfacing, and osteotomy of the proximal femur. The common approach to deriving 3D models of the proximal femur is to use imaging techniques such as computed tomography (CT) or magnetic resonance imaging (MRI). However, the high logistical effort, the extra radiation (CT imaging), and the large quantity of data to be acquired and processed make them less practical. In this paper, we present an integrated approach using a multi-level point distribution model (ML-PDM) to reconstruct a patient-specific model of the proximal femur from sparse data available intra-operatively. Results of experiments performed on dry cadaveric bones using dozens of 3D points are presented, as well as experiments using a limited number of 2D X-ray images, which demonstrate the promising accuracy of the present approach.
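A point distribution model of the kind underlying the ML-PDM represents any shape as the mean shape plus a weighted sum of principal modes of variation learned from training shapes; sparse measured points are then explained by solving for the mode weights. A minimal single-level sketch in Python; the toy training shapes and the single-mode reconstruction are invented for illustration and are not the paper's ML-PDM implementation:

```python
# Minimal point distribution model (PDM) sketch: learn a mean shape and
# principal modes of variation from toy training shapes, then reconstruct
# one shape from its projection onto the first mode (x ≈ x̄ + P·b).
# All training data are invented for illustration.
import numpy as np

# Toy training set: 4 "shapes", each 5 landmark points in 2D, flattened.
rng = np.random.default_rng(0)
base = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.5, 1.5])
shapes = np.stack([base + 0.1 * rng.standard_normal(10) for _ in range(4)])

mean_shape = shapes.mean(axis=0)
# Principal modes via SVD of the centred data matrix.
_, svals, vt = np.linalg.svd(shapes - mean_shape, full_matrices=False)
modes = vt  # rows are modes of variation

# Reconstruct the first training shape using a single mode.
b = modes[:1] @ (shapes[0] - mean_shape)      # mode weight(s)
recon = mean_shape + modes[:1].T @ b
err_mean = np.linalg.norm(shapes[0] - mean_shape)   # mean shape alone
err_pdm = np.linalg.norm(shapes[0] - recon.ravel()) # mean + 1 mode
print(f"residual vs mean: {err_mean:.3f}, with 1 mode: {err_pdm:.3f}")
```

The residual can only shrink as modes are added, which is what makes such a model usable with sparse intra-operative data: a handful of points constrains a few mode weights, and the model supplies the rest of the surface.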
Abstract:
The point of departure is the qualitative research design model developed by Joseph Maxwell, whose conception of research design is that of an underlying structure based on the interconnection of the components of a study and the implications these have for one another, used here to analyse, where possible, the neorealist current, considered the predominant school in the study of international relations. The purpose of this work, grounded in the interpretive paradigm (whose foundational assumption is the necessary understanding of the meaning of social action in the context of the life-world and from the perspective of the participants), is to gather the testimonial contribution of those who have led or actively participated in the formation of the country's foreign policy since the return of democracy, namely the Ministers of Foreign Affairs.
Abstract:
Over the past few years, the common practice within air traffic management has been for commercial aircraft to fly along a set of predefined routes to reach their destination. Aircraft operators are now requesting more flexibility to fly according to their preferences, in order to achieve their business objectives. For this reason, much research effort is being invested in developing techniques to evaluate optimal aircraft trajectories and traffic synchronisation. A further problem is the inefficient use of airspace that results from relying on barometric altitude, above all in the landing and takeoff phases and in Continuous Descent Approach (CDA) trajectories, where the appropriate reference setting (QNH or QFE) must currently be introduced. The interest of this research was born from the need to solve this problem and to permit better airspace management. Its main goals are to evaluate the impact, weaknesses, and strengths of using geometric altitude instead of barometric altitude. Moreover, this dissertation proposes the design of a simplified trajectory simulator able to predict aircraft trajectories. The model is based on a three-degrees-of-freedom aircraft point-mass model that can adapt aircraft performance data from the Base of Aircraft Data (BADA) and meteorological information. One aim of this trajectory simulator is to support improved strategic and pre-tactical trajectory planning in the future Air Traffic Management system. To this end, the error of the tool (the aircraft trajectory simulator) is measured by comparing its performance variables with actual flown trajectories obtained from Flight Data Recorder information. The trajectory simulator is validated by analysing the performance of different aircraft types on different routes. A fuel consumption estimation error was identified, and a correction is proposed for each aircraft model type.
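The three-degrees-of-freedom point-mass dynamics behind such a simulator can be sketched as a small Euler integration: thrust minus drag drives airspeed, the lift balance drives the flight-path angle, and airspeed projects into altitude and along-track distance. A minimal longitudinal sketch in Python; all aircraft parameters and the constant-thrust climb scenario are illustrative assumptions, not BADA data or the dissertation's simulator:

```python
# Minimal 3-DOF point-mass aircraft model (longitudinal motion only),
# integrated with explicit Euler. Parameters are illustrative, not BADA.
import math

G = 9.81          # gravity [m/s^2]
MASS = 60000.0    # aircraft mass [kg] (assumed, roughly A319-like)
S = 122.6         # wing reference area [m^2] (assumed)
RHO = 1.0         # air density [kg/m^3] (assumed constant)
CD0, K = 0.025, 0.045   # parabolic drag polar coefficients (assumed)

def step(v, gamma, h, x, thrust, lift, dt):
    """One Euler step of the point-mass equations:
       dv/dt     = (T - D)/m - g*sin(gamma)
       dgamma/dt = (L - m*g*cos(gamma)) / (m*v)
       dh/dt     = v*sin(gamma),   dx/dt = v*cos(gamma)"""
    cl = lift / (0.5 * RHO * v * v * S)
    drag = 0.5 * RHO * v * v * S * (CD0 + K * cl * cl)
    v += dt * ((thrust - drag) / MASS - G * math.sin(gamma))
    gamma += dt * (lift - MASS * G * math.cos(gamma)) / (MASS * v)
    h += dt * v * math.sin(gamma)
    x += dt * v * math.cos(gamma)
    return v, gamma, h, x

# Short climb segment at constant thrust, lift trimmed to weight.
v, gamma, h, x = 140.0, math.radians(3.0), 3000.0, 0.0
for _ in range(600):              # 60 s with dt = 0.1 s
    v, gamma, h, x = step(v, gamma, h, x,
                          thrust=120e3, lift=MASS * G, dt=0.1)
print(f"after 60 s: v={v:.1f} m/s, h={h:.0f} m, x={x/1000:.1f} km")
```

A full simulator adds an atmosphere model, a thrust/fuel-flow model, and lateral guidance on top of these same three state equations, which is why BADA performance tables slot in naturally.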
In the future Air Traffic Management (ATM) system, the trajectory becomes the fundamental element of a new set of operating procedures collectively referred to as Trajectory-Based Operations (TBO). Governmental institutions, academia, and industry have therefore shown renewed interest in applying trajectory optimisation techniques to commercial aviation. The trajectory optimisation problem can be solved using optimal control methods. In this research we present and discuss the existing methods for solving optimal control problems, focusing on direct collocation, which has received recent attention from the scientific community. In particular, two families of collocation methods are analysed: Hermite-Legendre-Gauss-Lobatto collocation and pseudospectral collocation. They are first compared on a benchmark case study, the minimum-fuel trajectory problem with fixed arrival time. To assess scalability to more realistic problems, the different methods are also tested on a real Airbus A319 Cairo-Madrid flight. Results show that pseudospectral collocation, which proved numerically more accurate and computationally much faster, is suitable for the type of problems arising in trajectory optimisation with application to ATM. Fast and accurate optimal trajectories can contribute to meeting the challenges of the future ATM system. As atmospheric uncertainty is one of the most important issues in trajectory planning, the final objective of this dissertation is to establish an order of magnitude for how much fuel consumption differs under different atmospheric conditions. It is important to note that in the strategic planning phase the optimal trajectories are determined from meteorological predictions, which differ from the conditions at the moment of flight.
The optimal trajectories showed savings of at least 500 kg of fuel in the majority of atmospheric conditions (different pressure and temperature at mean sea level, and different temperature lapse rates) with respect to the conventional procedure simulated under the same conditions. These results show that implementing optimal profiles is beneficial under the current Air Traffic Management (ATM) system.
Abstract:
The hydrophobic interaction, the tendency of nonpolar molecules to aggregate in solution, is a major driving force in biology. In a direct approach to the physical basis of the hydrophobic effect, nanosecond molecular dynamics simulations were performed on increasing numbers of hydrocarbon solute molecules in water-filled boxes of different sizes. The intermittent formation of solute clusters gives a free energy that is proportional to the loss in exposed molecular surface area, with a constant of proportionality of 45 ± 6 cal/mol⋅Å². The molecular surface area is the envelope of the solute cluster that is impenetrable by solvent; it is somewhat smaller than the more traditional solvent-accessible surface area, which is the area traced out by the centre of a solvent molecule rolled over the surface of the cluster. When we apply a factor relating molecular surface area to solvent-accessible surface area, we obtain 24 cal/mol⋅Å². Ours is the first direct calculation, to our knowledge, of the hydrophobic interaction from molecular dynamics simulations; the excellent qualitative and quantitative agreement with experiment proves that simple van der Waals interactions and atomic point-charge electrostatics account for the most important driving force in biology.
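The surface-area relation above turns into a one-line estimate: the hydrophobic stabilisation is the buried surface area times the fitted coefficient. A small Python illustration; the 300 Å² of buried area is an arbitrary example value, not a number from the study:

```python
# Hydrophobic free energy from buried surface area, using the fitted
# proportionality constants quoted in the abstract. The buried area is
# an arbitrary example value, not a number from the study.
GAMMA_MSA = 45.0   # cal/(mol*Å^2), molecular-surface coefficient
GAMMA_SASA = 24.0  # cal/(mol*Å^2), solvent-accessible-surface equivalent

buried_msa = 300.0  # Å^2 of molecular surface buried on aggregation
dG = -GAMMA_MSA * buried_msa / 1000.0  # favourable free energy [kcal/mol]
print(f"estimated hydrophobic stabilisation: {dG:.1f} kcal/mol")

# Consistency check: rescaling the area by the implied SASA/MSA ratio
# while switching coefficients must give the same free energy.
scale = GAMMA_MSA / GAMMA_SASA
dG_sasa = -GAMMA_SASA * (buried_msa * scale) / 1000.0
assert abs(dG - dG_sasa) < 1e-9
```

The two coefficients are interchangeable in this way precisely because they describe the same energy referred to two different area definitions.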
Abstract:
Wireless sensor networks (WSNs) have shown wide applicability to many fields, including monitoring of environmental, civil, and industrial settings. WSNs, however, are resource-constrained by many competing factors that span their hardware, software, and networking. One of the central resource constraints is the charge consumption of WSN nodes. With finite energy supplies, low charge consumption is needed to ensure long lifetimes and the success of WSNs. This thesis details the design of a power system to support long-term operation of WSNs. The power system was developed in parallel with a custom WSN from the Queen's MEMS Lab (QML-WSN), with the goal of supporting a lifetime of one year or more without sacrificing functionality. The final power system design uses a TPS62740 DC-DC converter with AA alkaline batteries to supply the nodes efficiently while providing battery-monitoring functionality and an expansion slot for future development. Testing tools for measuring current draw and charge consumption were created, along with analysis and processing software. Through their use, the charge consumption of the power system was drastically lowered, and issues in the QML-WSN were identified and resolved, including the proper shutdown of accelerometers and an incorrect microcontroller unit (MCU) power-pin connection. Controlled current profiling revealed unexpected node behaviour and detailed current-voltage relationships. These relationships were used with a lifetime projection model to estimate a lifetime between 521 and 551 days, depending on the mode of operation. The power system and QML-WSN were tested in a long-term trial lasting more than 272 days in an industrial testbed, monitoring an air compressor pump. Environmental factors were found to influence node behaviour, leading to increased charge consumption, while a node in an office setting was still operating at the conclusion of the trial.
This agrees with the lifetime projection and gives a strong indication that a lifetime of one year or more is achievable. Additionally, a lightweight charge consumption model was developed that allows the charge consumption of nodes in a distributed WSN to be monitored. This model was tested in a laboratory setting, demonstrating over 95% accuracy for high-packet-reception-rate WSNs across varying data rates, battery supply capacities, and runtimes up to full battery depletion.
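At its simplest, a lifetime projection of this kind reduces to usable battery charge divided by average current draw. A minimal Python sketch; the capacity, derating fraction, and current figures are illustrative assumptions, not measurements or the model from the thesis:

```python
# Simplest-case WSN node lifetime projection: usable battery charge
# divided by average current draw. All figures are illustrative.
def lifetime_days(capacity_mAh, avg_current_mA, usable_fraction=0.9):
    """Projected lifetime in days, derating the nominal capacity to
    account for cutoff voltage and self-discharge (90% assumed)."""
    usable_mAh = capacity_mAh * usable_fraction
    return usable_mAh / avg_current_mA / 24.0

# Example: an AA alkaline pack of ~2500 mAh (assumed) powering a node
# with an average draw of 0.18 mA (assumed).
days = lifetime_days(capacity_mAh=2500.0, avg_current_mA=0.18)
print(f"projected lifetime: {days:.0f} days")
```

The sensitivity is plain from the formula: halving the average current doubles the projection, which is why the duty-cycling and shutdown fixes described above dominate the lifetime outcome.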
Abstract:
Bone tissue homeostasis relies upon the ability of cells to detect and interpret extracellular signals that direct changes in tissue architecture. This study utilized a four-point bending model to create both fluid shear and strain forces (loading) during the time-dependent progression of MC3T3-E1 preosteoblasts along the osteogenic lineage. Loading was shown to increase cell number, alkaline phosphatase (ALP) activity, collagen synthesis, and the mRNA expression levels of Runx2, osteocalcin (OC), osteopontin, and cyclo-oxygenase-2. However, mineralization in these cultures was inhibited, despite an increase in calcium accumulation, suggesting that loading may inhibit mineralization in order to increase matrix deposition. Loading also increased fibroblast growth factor receptor-3 (FGFR3) expression coincident with an inhibition of FGFR1, FGFR4, FGF1, and extracellular signal-related kinase (ERK)1/2 phosphorylation. To examine whether these loading-induced changes in cell phenotype and FGFR expression could be attributed to the inhibition of ERK1/2 phosphorylation, cells were grown for 25 days in the presence of the MEK1/2 inhibitor U0126. Significant increases in the expression of FGFR3, ALP, and OC were observed, as well as the inhibition of FGFR1, FGFR4, and FGF1. However, U0126 also increased matrix mineralization, demonstrating that inhibition of ERK1/2 phosphorylation cannot fully account for the changes observed in response to loading. In conclusion, this study demonstrates that preosteoblasts are mechanoresponsive, and that long-term loading, whilst increasing proliferation and differentiation of preosteoblasts, inhibits matrix mineralization. In addition, the increase in FGFR3 expression suggests that it may have a role in osteoblast differentiation.
Abstract:
This paper presents the creation of 3D statistical shape models of the knee bones and their use to embed information into a segmentation system for MRIs of the knee. We propose utilising the strong spatial relationship between the cartilages and the bones in the knee by embedding this information into the created models. This information can then be used to automate the initialisation of segmentation algorithms for the cartilages. The approach used to automatically generate the 3D statistical shape models of the bones is based on the point distribution model optimisation framework of Davies. Our implementation of this scheme uses a parameterized surface extraction algorithm, which is used as the basis for the optimisation scheme that automatically creates the 3D statistical shape models. The current approach is illustrated by generating 3D statistical shape models of the patella, tibia, and femur from a segmented database of the knee. The use of these models to embed spatial relationship information to aid in the automation of segmentation algorithms for the cartilages is then illustrated.
Abstract:
This paper presents an automated segmentation approach for MR images of the knee bones. The bones are the first stage of a segmentation system for the knee, primarily aimed at the automated segmentation of the cartilages. The segmentation is performed using 3D active shape models (ASM), which are initialized using an affine registration to an atlas. The 3D ASMs of the bones are created automatically using a point distribution model optimization scheme. The accuracy and robustness of the segmentation approach were experimentally validated using an MR database of fat-suppressed spoiled gradient recall images.
Abstract:
At present, there is intense movement of financial capital, whether on account of mergers and acquisitions of companies or through the natural expansion of capitalism itself, leading organisations to seek financing alternatives with lower costs, given the interest rates charged by financial institutions. At the same time, monetary authorities periodically seek to reduce the interest rates that guide the economy, in order to attract new productive investment and preserve existing investment. Somewhat paradoxically, the reduction in interest rates promulgated by the authorities is not matched proportionally by the rates charged in the market. This leads individuals, whether professional investment managers or not, to seek investment alternatives that provide monetary gains above those based on the rates set by the monetary authorities. Reconciling organisations' search for funds with investors' search for higher monetary gains, the capital market becomes a relevant alternative. To obtain the best results in this environment, it is necessary to use models and other instruments that provide the best relation between risk and return, given that every investor exhibits at least some degree of risk aversion. Many instruments are available for this purpose; however, many of them are not accessible to the individual retail investor. In this respect, the model developed by Edwin Elton and Martin Gruber emerges as an alternative for any investor, both for its constructive characteristics and for its operational simplicity.
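The Elton-Gruber approach referred to above is usually presented as a ranking procedure under a single-index model: stocks are ordered by excess return over beta, a cumulative cutoff rate C* is computed, and only stocks whose ratio exceeds the cutoff enter the optimal portfolio. A minimal Python sketch of that textbook procedure; the four stocks, their statistics, and the market parameters are invented illustration data, not from the dissertation:

```python
# Elton-Gruber simple ranking procedure under a single-index model:
# rank stocks by excess return over beta (Treynor ratio), find the
# cutoff rate C*, and weight the stocks that make the cut.
# All input data are invented for illustration.
RF = 0.05          # risk-free rate (assumed)
VAR_M = 0.0016     # market variance sigma_m^2 (assumed)

# (name, expected return, beta, residual variance sigma_e^2) - invented
stocks = [
    ("A", 0.15, 1.0, 0.0050),
    ("B", 0.12, 0.8, 0.0040),
    ("C", 0.14, 1.2, 0.0100),
    ("D", 0.08, 1.1, 0.0060),
]

# 1. Rank by Treynor ratio, descending.
ranked = sorted(stocks, key=lambda s: (s[1] - RF) / s[2], reverse=True)

# 2. Cumulative cutoff rates C_i; C* is the C_i of the last stock whose
#    own Treynor ratio still exceeds it.
num = den = 0.0
c_star, included = 0.0, []
for name, r, beta, var_e in ranked:
    num += (r - RF) * beta / var_e
    den += beta * beta / var_e
    c_i = VAR_M * num / (1.0 + VAR_M * den)
    if (r - RF) / beta > c_i:
        c_star, included = c_i, included + [(name, r, beta, var_e)]

# 3. Optimal weights for the included stocks.
z = [(b / v) * ((r - RF) / b - c_star) for _, r, b, v in included]
weights = {name: zi / sum(z) for (name, *_), zi in zip(included, z)}
print("cutoff C* =", round(c_star, 4), "weights:", weights)
```

The appeal noted in the abstract shows in the inputs: expected return, beta, and residual variance per stock are all a retail investor can realistically estimate, and the whole optimisation is a single pass over the ranked list.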
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the point that, although it is well known that estimating the cost, duration, and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client or customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while the plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry and Kafura's data-flow fan-in/fan-out, and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
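Of the 'classic' metrics listed above, McCabe's cyclomatic complexity is the easiest to illustrate: for a single-entry, single-exit routine it equals the number of binary decision points plus one. A toy Python sketch that counts decisions in Python source; the thesis's redefinition of the counts for Prolog is different, and this simplified counter ignores constructs such as try/except:

```python
# Toy McCabe cyclomatic complexity for Python source: V(G) = D + 1,
# where D is the number of binary decision points. Simplified: counts
# if/for/while/conditional expressions and extra and/or operands only.
import ast

def cyclomatic_complexity(source):
    """Parse the source and count decision points, plus one."""
    count = 1
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.IfExp)):
            count += 1
        elif isinstance(node, ast.BoolOp):   # each extra and/or operand
            count += len(node.values) - 1
    return count

SAMPLE = """
def classify(n):
    if n < 0:
        return "negative"
    for d in range(2, n):
        if n % d == 0:
            return "composite"
    return "prime-ish"
"""
print(cyclomatic_complexity(SAMPLE))   # 3 decisions, so V(G) = 4
```

The thesis's point survives the toy: once the counting rules are fixed for a language, the metric is mechanical to collect, which is what made extending the counts to Prolog a well-posed exercise.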
Abstract:
Master's degree in Auditing