964 results for "Simplified and advanced calculation methods"


Relevance:

100.00%

Publisher:

Abstract:

We review our work on generalisations of the Becker-Döring model of cluster formation as applied to nucleation theory, polymer growth kinetics, and the formation of supramolecular structures in colloidal chemistry. One valuable tool in analysing mathematical models of these systems has been the coarse-graining approximation, which enables macroscopic models for observable quantities to be derived from microscopic ones. This permits assumptions about the detailed molecular mechanisms to be tested, and their influence on the large-scale kinetics of surfactant self-assembly to be elucidated. We also summarise our more recent results on Becker-Döring systems, notably demonstrating that cross-inhibition and autocatalysis can destabilise a uniform solution and lead to a competitive environment in which some species flourish at the expense of others, phenomena relevant in models of the origins of life.
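For reference, a standard textbook form of the Becker-Döring equations, the baseline that the generalisations reviewed above build on, is given below; the notation is the conventional one and is an assumption here, not necessarily that of the papers summarised.

```latex
% Standard Becker-Doring kinetics: clusters change size only by gaining or
% losing single monomers; c_r is the concentration of r-clusters, a_r and
% b_r are aggregation and fragmentation rate coefficients, and J_r is the
% net flux from size r to size r+1.
\begin{aligned}
  \frac{\mathrm{d}c_r}{\mathrm{d}t} &= J_{r-1} - J_r, \qquad r \ge 2,\\
  J_r &= a_r\, c_1 c_r - b_{r+1}\, c_{r+1},\\
  \frac{\mathrm{d}c_1}{\mathrm{d}t} &= -J_1 - \sum_{r \ge 1} J_r.
\end{aligned}
```

Coarse-graining in this setting lumps contiguous ranges of cluster sizes into a reduced set of variables, so that macroscopic observables such as total cluster mass or mean cluster size obey a much smaller closed system.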

Relevance:

100.00%

Publisher:

Abstract:

[EN] Parasitic diseases have a great impact on human and animal health. The gold standard for the diagnosis of the majority of parasitic infections is still conventional microscopy, which presents important limitations in terms of sensitivity and specificity and commonly requires highly trained technicians. More accurate molecular-based diagnostic tools are needed for the implementation of early detection, effective treatments and mass screenings with high-throughput capacity. In this respect, sensitive and affordable devices could have a great impact on the sustainable control programmes that exist against parasitic diseases, especially in low-income settings. Proteomics and nanotechnology approaches are valuable tools for sensing pathogens and host alteration signatures within microfluidic detection platforms, and these new devices might provide novel solutions to fight parasitic diseases. Newly described specific parasite-derived products with immune-modulatory properties have been postulated as the best candidates for the early and accurate detection of parasitic infections as well as for the blockage of parasite development. This review provides the most recent methodological and technological advances with great potential for biosensing parasites in their hosts, showing the newest opportunities offered by modern "-omics" and platforms for parasite detection and control.

Relevance:

100.00%

Publisher:

Abstract:

Lectures on COMP6234

Relevance:

100.00%

Publisher:

Abstract:

Scientific research is increasingly data-intensive, relying more and more upon advanced computational resources to answer the questions most pressing to society at large. This report presents findings from a brief descriptive survey sent to a sample of 342 leading researchers at the University of Washington (UW), Seattle, Washington, in 2010 and 2011 as the first stage of the larger National Science Foundation project "Interacting with Cyberinfrastructure in the Face of Changing Science." The survey assesses these researchers' use of advanced computational resources, data, and software in their research. We present high-level findings that describe UW researchers' demographics, interdisciplinarity, research groups, data use, software and computational practices (including software development and use, data storage and transfer activities, and collaboration tools), and computing resources. These findings offer insight into the state of computational resources in use during this period and a look at the data intensiveness of UW researchers' work.

Relevance:

100.00%

Publisher:

Abstract:

The idea of spacecraft formations, flying in tight configurations with maximum baselines of a few hundred meters in low-Earth orbits, has generated widespread interest over the last several years. Nevertheless, controlling the movement of spacecraft in formation poses difficulties, such as high in-orbit computing demands and collision avoidance requirements, which escalate as the number of units in the formation increases, as complicated nonlinear effects enter the dynamics, and as uncertainty arises from incomplete knowledge of the system parameters. These requirements have led to the need for reliable linear and nonlinear controllers formulated in terms of relative and absolute dynamics. The objective of this thesis is, therefore, to introduce new control methods that allow spacecraft in formation, with circular or elliptical reference orbits, to efficiently execute safe autonomous manoeuvres. These controllers are distinguished from the bulk of the literature in that they merge, into a single control strategy, guidance laws never before applied to spacecraft formation flying and collision avoidance capabilities. For this purpose, three control schemes are presented: linear optimal regulation, linear optimal estimation and adaptive nonlinear control. In general terms, the proposed control approaches command the dynamics of one or several followers with respect to a leader to asymptotically track a time-varying nominal trajectory (TVNT), while the threat of collision between the followers is reduced by repelling accelerations obtained from a collision avoidance scheme (CAS) during periods of closest proximity.
Linear optimal regulation is achieved through a Riccati-based tracking controller. Within this control strategy, the controller provides guidance and tracking toward a desired TVNT, optimizing fuel consumption through the Riccati procedure with a finite cost function defined in terms of the desired TVNT, while repelling accelerations generated by the CAS ensure evasive actions between the elements of the formation. The relative dynamics model, suitable for circular and eccentric low-Earth reference orbits, is based on the Tschauner-Hempel equations and includes a control input and a nonlinear term corresponding to the CAS repelling accelerations.
Linear optimal estimation is built on the forward-in-time separation principle and encompasses two stages: regulation and estimation. The first stage requires the design of a full-state feedback controller using the state vector reconstructed by means of the estimator. The second stage requires the design of an additional dynamical system, the estimator, which obtains the states that cannot be measured and thereby approximately reconstructs the full state vector. The separation principle then states that an observer built for a known input can also be used to estimate the state of the system and to generate the control input. This allows the observer and the feedback to be designed independently, exploiting the advantages of linear quadratic regulator theory, in order to estimate the states of a dynamical system with model and sensor uncertainty. The relative dynamics is described with the linear system used in the previous controller, with a control input and nonlinearities entering via the repelling accelerations from the CAS during collision avoidance events. Moreover, sensor uncertainty is added to the control process by considering carrier-phase differential GPS (CDGPS) velocity measurement error.
Finally, an adaptive control law is presented that is capable of delivering superior closed-loop performance compared to certainty-equivalence (CE) adaptive controllers. A novel noncertainty-equivalence controller based on the Immersion and Invariance paradigm for close-manoeuvring spacecraft formation flying in both circular and elliptical low-Earth reference orbits is introduced. The proposed control scheme achieves stabilization by immersing the plant dynamics into a target dynamical system (or manifold) that captures the desired dynamical behaviour. The key feature of this methodology is the addition of a new term to the classical certainty-equivalence control approach that, in conjunction with the parameter update law, is designed to achieve adaptive stabilization; this term has the ultimate task of shaping the manifold into which the adaptive system is immersed. Stability of the controller is proven via a Lyapunov-based analysis and Barbalat's lemma.
In order to evaluate the design of the controllers, test cases based on the physical and orbital features of the Prototype Research Instruments and Space Mission Technology Advancement (PRISMA) mission are implemented, extending the number of elements in the formation to scenarios with reconfigurations and on-orbit position switching in elliptical low-Earth reference orbits. An extensive analysis and comparison of the performance of the controllers in terms of total Δv and fuel consumption, with and without the effects of the CAS, is presented. These results show that the three proposed controllers allow the followers to asymptotically track the desired nominal trajectory and, additionally, that the simulations including the CAS show an effective decrease in collision risk during the manoeuvre.
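As a concrete but much-simplified illustration of the Riccati-based approach described above, the sketch below regulates the linearised relative dynamics of a follower about a leader in a circular low-Earth orbit (the Clohessy-Wiltshire/Hill special case of the Tschauner-Hempel equations) with an infinite-horizon LQR gain. The thesis itself uses a finite-horizon, trajectory-tracking formulation and adds the CAS repelling accelerations, which are omitted here; all numerical values are illustrative assumptions.

```python
# Minimal sketch: LQR station-keeping for the Clohessy-Wiltshire (Hill)
# relative dynamics, i.e. the circular-orbit special case of the
# Tschauner-Hempel equations. Illustrative only: the finite-horizon
# tracking terms and the collision-avoidance (CAS) accelerations
# discussed in the abstract are not included.
import numpy as np
from scipy.linalg import solve_continuous_are

n = 0.0011  # mean motion of the reference orbit [rad/s], illustrative LEO value

# State x = [dx, dy, dz, dvx, dvy, dvz] (radial, along-track, cross-track).
A = np.array([
    [0, 0, 0, 1, 0, 0],
    [0, 0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0, 1],
    [3 * n**2, 0, 0, 0, 2 * n, 0],
    [0, 0, 0, -2 * n, 0, 0],
    [0, 0, -n**2, 0, 0, 0],
])
B = np.vstack([np.zeros((3, 3)), np.eye(3)])  # thrust acceleration inputs

Q = np.diag([1, 1, 1, 10, 10, 10])   # state weights (illustrative)
R = 1e4 * np.eye(3)                  # control weights penalise propellant use

# Infinite-horizon Riccati solution and feedback gain u = -K (x - x_ref).
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

x = np.array([100.0, -50.0, 20.0, 0.0, 0.0, 0.0])  # initial relative state [m, m/s]
x_ref = np.zeros(6)                                 # hold at the reference point
dt = 1.0
for _ in range(6000):                 # simple Euler propagation over ~100 minutes
    u = -K @ (x - x_ref)
    x = x + dt * (A @ x + B @ u)

print("final relative position [m]:", x[:3])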

Relevance:

100.00%

Publisher:

Abstract:

In this study, we carried out a comparative analysis between two classical methodologies to prospect residue contacts in proteins: the traditional cutoff-dependent (CD) approach and cutoff-free Delaunay tessellation (DT). In addition, two alternative coarse-grained forms to represent residues were tested: using the alpha carbon (CA) and the side-chain geometric center (GC). A database was built, comprising three top classes: all alpha, all beta, and alpha/beta. We found that a cutoff value of about 7.0 Å emerges as an important distance parameter. Up to 7.0 Å, CD and DT properties are unified, which implies that at this distance all contacts are complete and legitimate (not occluded). We have also shown that DT has an intrinsic missing-edges problem when mapping the first layer of neighbors. In proteins, it may produce systematic errors affecting mainly the contact network in beta chains with CA. The almost-Delaunay (AD) approach has been proposed to solve this DT problem. We found that even AD may not be an advantageous solution. As a consequence, in the strict range up to 7.0 Å, the CD approach proved to be a simpler, more complete, and more reliable technique than DT or AD. Finally, we have shown that coarse-grained residue representations may introduce bias in the analysis of neighbors at cutoffs up to 6.8 Å, with CA favoring alpha proteins and GC favoring beta proteins. This provides an additional argument pointing to the value of 7.0 Å as an important lower-bound cutoff to be used in contact analysis of proteins.
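A minimal sketch of the two contact definitions compared above, applied to one coordinate per residue (a CA atom or side-chain geometric centre), is given below. The 7.0 Å cutoff is the value highlighted in the study; the occlusion analysis and the almost-Delaunay (AD) refinement discussed in the text are not implemented, and the coordinates are synthetic.

```python
# Minimal sketch: residue-residue contacts from coarse-grained coordinates,
# comparing the cutoff-dependent (CD) definition with Delaunay tessellation
# (DT). Occlusion filtering and the almost-Delaunay (AD) variant are omitted.
import numpy as np
from itertools import combinations
from scipy.spatial import Delaunay

def cd_contacts(coords, cutoff=7.0):
    """All residue pairs closer than `cutoff` angstroms (CD approach)."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    upper = np.triu(np.ones_like(d, dtype=bool), k=1)   # each pair once, i < j
    i, j = np.where((d < cutoff) & upper)
    return set(zip(i.tolist(), j.tolist()))

def dt_contacts(coords):
    """Residue pairs sharing an edge in the Delaunay tessellation (DT approach)."""
    tess = Delaunay(coords)
    pairs = set()
    for simplex in tess.simplices:                        # each simplex is a tetrahedron
        for a, b in combinations(sorted(simplex.tolist()), 2):
            pairs.add((a, b))
    return pairs

# Toy coordinates standing in for one residue centre each (angstroms).
rng = np.random.default_rng(0)
coords = rng.uniform(0.0, 30.0, size=(50, 3))

cd = cd_contacts(coords, cutoff=7.0)
dt = dt_contacts(coords)
print("CD contacts:", len(cd), " DT contacts:", len(dt),
      " DT edges longer than 7 A:", len(dt - cd))
```

The last figure illustrates the point made above: DT edges are not distance-limited, so some tessellation neighbours lie well beyond a physically meaningful contact distance unless a cutoff is imposed on top of the tessellation.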

Relevance:

100.00%

Publisher:

Abstract:

Cultural heritage is constituted by complex and heterogeneous materials, such as paintings but also ancient remains. All of these ancient materials are exposed to the external environment, and this interaction produces different changes due to chemical, physical and biological phenomena. The organic fraction, especially the proteinaceous one, plays a crucial role in all these materials: in archaeology, proteins reveal human habits; in artworks, they disclose techniques and guide correct restoration. For these reasons, the development of methods that preserve the sample as much as possible and provide a deeper knowledge of the deterioration processes is fundamental. The research activities presented in this PhD thesis have focused on the development of new immunochemical and spectroscopic approaches to detect and identify organic substances in artistic and archaeological samples. Organic components can be present in different cultural heritage materials as constituent elements (e.g., binders in paintings, collagen in bones), and knowledge of them is fundamental for a complete understanding of past life, degradation processes and appropriate restoration approaches. The combination of an immunological approach with chemiluminescence detection and Laser Ablation-Inductively Coupled Plasma-Mass Spectrometry allowed a sensitive and selective localization of collagen and elements in ancient bones and teeth. Near-infrared spectroscopy and hyperspectral imaging were applied in combination with chemometric data analysis as non-destructive methods for prescreening bones for the localization of collagen. Moreover, an investigation of amino acids in enamel is proposed, in order to clarify the survival of tooth biomolecules over time, through the optimization and application of High-Performance Liquid Chromatography to modern and ancient enamel powder. New portable biosensors were developed for ovalbumin identification in paintings, combining biocompatible Gellan gel with electro-immunochemical sensors, so that painting binders can be extracted and identified with contact only between the gel and the painting and between the gel and the electrodes.

Relevance:

100.00%

Publisher:

Abstract:

This Doctoral Thesis aims at studying, developing, and characterizing cutting-edge equipment for EMC measurements and at proposing innovative and advanced power line filter design techniques. This document summarizes a three-year work, is strictly industry oriented, and relies on EMC standards and regulations. It contains the main results, findings, and effort, with the purpose of bringing innovative contributions to the scientific community. Conducted emission interferences are usually suppressed with power line filters. These filters are composed of common mode chokes, X capacitors and Y capacitors in order to mitigate both the differential-mode and common-mode noise, which together compose the overall conducted emissions. However, even today, available power line filter design techniques show several disadvantages. First of all, filters are designed to be implemented in ideal 50 Ω systems, a condition which is far from reality. Then, the attenuation introduced by the filter for common- or differential-mode noise is analyzed independently, without considering the possible mode conversion that can be produced by impedance mismatches or by asymmetries in either the power line filter itself or the equipment under test. Finally, the instrumentation used to perform conducted emission measurements is, in most cases, not adequate. All these factors lead to an inaccurate design, contributing to an increase in the size of the filter and making it more expensive and less performant than it should be.
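To illustrate the 50 Ω issue raised above, the sketch below evaluates the insertion loss of a single-stage series-L/shunt-C low-pass filter for the standard 50 Ω/50 Ω assumption and for a mismatched source/load pair. Component values and impedances are purely illustrative, the example treats one propagation mode only, and the common-/differential-mode conversion discussed in the text is not modelled.

```python
# Minimal sketch: insertion loss of a series-L / shunt-C low-pass filter
# between a source impedance Zs and a load impedance Zl, showing how the
# attenuation predicted under the standard 50/50 ohm assumption differs
# from a mismatched case. Single mode only; no CM/DM conversion.
import numpy as np

def insertion_loss_db(freq_hz, L, C, Zs, Zl):
    w = 2 * np.pi * freq_hz
    ZL = 1j * w * L                      # series inductor
    ZC = 1 / (1j * w * C)                # shunt capacitor across the load
    Zp = Zl * ZC / (Zl + ZC)             # capacitor in parallel with the load
    v_with = Zp / (Zs + ZL + Zp)         # load voltage with the filter (per volt of source)
    v_without = Zl / (Zs + Zl)           # load voltage without the filter
    return 20 * np.log10(np.abs(v_without / v_with))

L, C = 1e-3, 100e-9                      # 1 mH, 100 nF (illustrative values)
freqs = [10e3, 150e3, 1e6, 10e6]         # spot frequencies [Hz]

print("f [Hz]     IL 50/50 ohm    IL 1 ohm source / 1 kohm load")
for f in freqs:
    print(f"{f:10.0f} {insertion_loss_db(f, L, C, 50, 50):12.1f} dB"
          f" {insertion_loss_db(f, L, C, 1, 1000):14.1f} dB")
```

Running the sketch shows that the same filter can attenuate very differently (and even resonate) once the terminating impedances depart from the idealised 50 Ω, which is the motivation for the design techniques proposed in the thesis.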

Relevance:

100.00%

Publisher:

Abstract:

The simple single-ion activity coefficient equation originating from the Debye-Hückel theory was used to determine the thermodynamic and stoichiometric dissociation constants of weak acids from galvanic cell data. Electromotive force data from galvanic cells without liquid junctions, obtained from the literature, were studied in conjunction with potentiometric titration data for aqueous solutions at 298.15 K. The dissociation constants of the weak acids could be determined by the presented techniques, and almost all the experimental data studied could be interpreted within the range of experimental error. Potentiometric titration was used here, and calculation methods were developed to obtain the thermodynamic and stoichiometric dissociation constants of some weak acids in aqueous solutions at 298.15 K. The ionic strength of the titrated solutions was adjusted using an inert electrolyte, namely sodium or potassium chloride; the salt content alone determines the ionic strength. The ionic strength of the solutions studied varied from 0.059 mol kg-1 to 0.37 mol kg-1, and in some cases up to 1.0 mol kg-1. The following substances were investigated by potentiometric titration: acetic acid, propionic acid, L-aspartic acid, L-glutamic acid and bis(2,2-dimethyl-3-oxopropanol) amine.
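As a rough illustration of the calculation route described above, the sketch below converts a stoichiometric dissociation constant, measured at a given ionic strength, into a thermodynamic one using a Debye-Hückel-type single-ion activity coefficient at 298.15 K. The functional form, the parameter values and the example numbers are assumptions for illustration and may differ from those used in the paper.

```python
# Minimal sketch: thermodynamic vs stoichiometric dissociation constant of a
# monoprotic weak acid HA <-> H+ + A-, using an extended Debye-Hueckel-type
# single-ion activity coefficient at 298.15 K. The exact equation and the
# parameter values used in the paper may differ; these are illustrative.
import math

A_DH = 0.509   # Debye-Hueckel constant at 298.15 K [(kg/mol)^0.5], approximate
B_TERM = 1.5   # illustrative denominator parameter [(kg/mol)^0.5]

def log10_gamma(z, I):
    """Single-ion activity coefficient, extended Debye-Hueckel form (assumed)."""
    sqrt_I = math.sqrt(I)
    return -A_DH * z**2 * sqrt_I / (1.0 + B_TERM * sqrt_I)

def thermodynamic_Ka(Ka_stoich, I):
    """Ka(thermodynamic) = Ka(stoichiometric) * gamma(H+) * gamma(A-),
    taking the activity coefficient of the neutral acid HA as unity."""
    gamma_H = 10 ** log10_gamma(+1, I)
    gamma_A = 10 ** log10_gamma(-1, I)
    return Ka_stoich * gamma_H * gamma_A

# Example: a stoichiometric constant measured in a 0.1 mol/kg salt solution
# (purely illustrative numbers, not taken from the paper).
Ka_m = 2.4e-5
I = 0.1
print("Ka (thermodynamic) ~", thermodynamic_Ka(Ka_m, I))
```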

Relevance:

100.00%

Publisher:

Abstract:

Background: Breast cancer (BC) causes more deaths than any other cancer among women in Catalonia. Early detection has contributed to the observed decline in BC mortality. However, there is debate on the optimal screening strategy. We performed an economic evaluation of 20 screening strategies taking into account the cost over time of screening and subsequent medical costs, including diagnostic confirmation, initial treatment, follow-up and advanced care. Methods: We used a probabilistic model to estimate the effect and costs over time of each scenario. The effect was measured as years of life (YL), quality-adjusted life years (QALY), and lives extended (LE). Costs of screening and treatment were obtained from the Early Detection Program and hospital databases of the IMAS-Hospital del Mar in Barcelona. The incremental cost-effectiveness ratio (ICER) was used to compare the relative costs and outcomes of different scenarios. Results: Strategies that start at ages 40 or 45 and end at 69 predominate when the effect is measured as YL or QALYs. Biennial strategies 50-69, 45-69 or annual 45-69, 40-69 and 40-74 were selected as cost-effective for both effect measures (YL or QALYs). The ICER increases considerably when moving from biennial to annual scenarios. Moving from no screening to biennial 50-69 years represented an ICER of 4,469€ per QALY. Conclusions: A reduced number of screening strategies have been selected for consideration by researchers, decision makers and policy planners. Mathematical models are useful to assess the impact and costs of BC screening in a specific geographical area.
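The comparison metric used above is the incremental cost-effectiveness ratio (ICER); a minimal sketch of its calculation between two screening scenarios is shown below. The cost and QALY figures are entirely hypothetical placeholders, and the 4,469 € per QALY value reported in the abstract comes from the authors' probabilistic model, not from this arithmetic.

```python
# Minimal sketch: incremental cost-effectiveness ratio (ICER) between two
# screening scenarios. Numbers are hypothetical placeholders.
def icer(cost_new, effect_new, cost_ref, effect_ref):
    """Extra cost per extra unit of effect (e.g. euros per QALY gained)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Hypothetical per-woman lifetime figures for two strategies.
no_screening   = {"cost":  9_000.0, "qaly": 21.00}
biennial_50_69 = {"cost": 10_500.0, "qaly": 21.30}

print("ICER (biennial 50-69 vs no screening):",
      round(icer(biennial_50_69["cost"], biennial_50_69["qaly"],
                 no_screening["cost"], no_screening["qaly"])), "euro/QALY")
```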

Relevance:

100.00%

Publisher:

Abstract:

The aim of this Master's thesis was to develop a carbon dioxide balance management method for Rautaruukki's operations under emissions trading conditions. The balance management method covers the calculation of emissions as well as issues related to the management of emission allowances. The boundary conditions for EU-wide emissions trading are defined by the Emissions Trading Directive and its accompanying monitoring guidelines for the monitoring and reporting of emissions. The thesis examines the carbon dioxide emission history and calculation methods of those Rautaruukki sites that are expected to fall within the scope of the EU emissions trading scheme. Of these sites, the Raahe and Koverhar steel works are examined in particular, since they account for the most significant share of the carbon dioxide emissions of the group's Finnish sites. Other sites considered are the Hämeenlinna and Dalsbruk rolling mills in Finland, the Smedjebacken steel works and the Boxholm rolling mill in Sweden, the Mo i Rana steel works and the Profiler rolling mill in Norway, and the Nedstaal rolling mill in the Netherlands. Cost-effective and controlled emissions trading at the company level requires an emission balance management method that makes it possible to determine the emissions generated in the manner required by the Commission's monitoring guidelines, to estimate future emission volumes, and to manage emissions trading. Issues included in the management of emissions trading are the free allowances received, the number of allowances to be bought or sold, the timing of trading, the different options for acquiring allowances, the price formation of allowances, and risk management.
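For readers unfamiliar with the calculation side, the monitoring-guideline approach for combustion emissions is commonly summarised as activity data multiplied by a fuel-specific emission factor and an oxidation factor; the sketch below shows that arithmetic with invented fuel figures. The fuel list, factors and quantities are placeholders, not Rautaruukki site data or the exact factors prescribed by the guidelines.

```python
# Minimal sketch of a monitoring-guideline style CO2 calculation for
# combustion: emissions = activity data x emission factor x oxidation factor.
# Fuel names and numbers are invented placeholders, not Rautaruukki data.
fuels = [
    # (name, activity data [TJ], emission factor [t CO2 / TJ], oxidation factor)
    ("coke oven gas",       1_200.0,  44.0, 0.995),
    ("blast furnace gas",   3_500.0, 260.0, 0.995),
    ("heavy fuel oil",        400.0,  77.0, 0.990),
]

total_t_co2 = 0.0
for name, activity_tj, emission_factor, oxidation in fuels:
    emissions = activity_tj * emission_factor * oxidation
    total_t_co2 += emissions
    print(f"{name:18s} {emissions:12.0f} t CO2")

print(f"{'total':18s} {total_t_co2:12.0f} t CO2")
```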

Relevance:

100.00%

Publisher:

Abstract:

The aim of this Master's thesis is to investigate possibilities for improving the efficiency of the steam turbine at the Loviisa nuclear power plant. An essential part of the work is the calculation of the velocity triangles and efficiencies of the turbine blade stages. The development history of steam turbines and the equations for calculating turbine loss coefficients are presented from several sources spanning several decades. The most recent design principles for wet-steam turbines in nuclear power plants were surveyed from numerous sources; according to all of them, the expansion occurring in the wet-steam region is challenging to model. The thesis presents different viewpoints raised in the literature for improving steam turbine performance, both structural and computational, and introduces many methods for calculating turbine flow and performance. For example, the Baumann rule is a simple way to treat turbine performance in the wet-steam region. One of the key findings was that the first stage of the high-pressure turbine shows potential for improvement that could increase the power output of the Loviisa nuclear power plant: the blades of the first stage have been assumed to be of the de Laval type, but in practice this is not the case, and the operation of the high-pressure turbine's current nozzle could be made more effective. In addition, large exhaust losses occur after the last blade stage of the Loviisa low-pressure turbine; part of the energy of this high-velocity flow could, however, still be recovered with a diffuser in the exhaust duct.
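The Baumann rule mentioned above is a simple empirical wetness correction; a sketch of its usual form is given below. The Baumann factor and the stage data are illustrative assumptions, and the thesis may use a slightly different formulation.

```python
# Minimal sketch of the Baumann rule for wet-steam turbine stages: the
# dry-stage isentropic efficiency is reduced roughly in proportion to the
# mean moisture fraction across the stage. Numbers are illustrative.
def baumann_wet_efficiency(eta_dry, wetness_in, wetness_out, baumann_factor=1.0):
    """eta_wet = eta_dry * (1 - alpha * mean wetness), with alpha ~ 1 classically."""
    mean_wetness = 0.5 * (wetness_in + wetness_out)
    return eta_dry * (1.0 - baumann_factor * mean_wetness)

# Example: a stage with 8 % inlet and 12 % outlet moisture.
eta_dry = 0.88
print("wet-stage efficiency ~", round(baumann_wet_efficiency(eta_dry, 0.08, 0.12), 3))
```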

Relevance:

100.00%

Publisher:

Abstract:

Although it has been suggested that the retinal vasculature is a diffusion-limited aggregation (DLA) fractal, no study has been dedicated to standardizing its fractal analysis. The aims of this project were to standardize a method to estimate the fractal dimensions of the retinal vasculature and to characterize their normal values; to determine whether this estimation depends on skeletonization and on the segmentation and calculation methods; to assess the suitability of the DLA model; and to determine the usefulness of log-log graphs in characterizing vascular fractality. To achieve these aims, the information, mass-radius and box-counting dimensions of the vasculatures of 20 eyes were compared when the vessels were manually or computationally segmented; the fractal dimensions of the vasculatures of 60 eyes of healthy volunteers were compared with those of 40 DLA models; and the log-log graphs obtained were compared with those of known fractals and of non-fractals. The main results were: the fractal dimensions of vascular trees were dependent on the segmentation and dimension calculation methods, but there was no difference between manual segmentation and the scale-space, multithreshold and wavelet computational methods; the means of the information and box dimensions for arteriolar trees were 1.29, against 1.34 and 1.35 for the venular trees; the dimensions of the DLA models were higher than those of the vessels; and the log-log graphs were straight, but with varying local slopes, both for vascular trees and for fractals and non-fractals. These results lead to the following conclusions: the estimation of the fractal dimensions of the retinal vasculature depends on skeletonization and on the segmentation and calculation methods; log-log graphs are not suitable as a fractality test; the means of the information and box-counting dimensions for the normal eyes were 1.47 and 1.43, respectively; and the DLA model with optic disc seeding is not sufficient to model retinal vascularization.
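One of the estimators compared above is the box-counting dimension; the sketch below estimates it for a binary (segmented or skeletonized) vessel image by regressing log N(s) against log(1/s) over a range of box sizes. The input image here is synthetic and the box sizes are illustrative; the information and mass-radius estimators used in the study are not implemented.

```python
# Minimal sketch: box-counting fractal dimension of a binary image
# (e.g. a segmented or skeletonized retinal vessel tree). The slope of
# log N(s) vs log(1/s) estimates the dimension. Synthetic input here;
# the information and mass-radius estimators used in the study are omitted.
import numpy as np

def box_counting_dimension(binary_img, box_sizes):
    counts = []
    for s in box_sizes:
        n_boxes = 0
        for i in range(0, binary_img.shape[0], s):
            for j in range(0, binary_img.shape[1], s):
                if binary_img[i:i + s, j:j + s].any():   # box contains vessel pixels
                    n_boxes += 1
        counts.append(n_boxes)
    # Fit log N(s) = D * log(1/s) + const; the slope D is the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes, dtype=float)),
                          np.log(counts), 1)
    return slope

# Synthetic stand-in for a vessel skeleton: a diagonal line (dimension ~ 1).
img = np.zeros((256, 256), dtype=bool)
idx = np.arange(256)
img[idx, idx] = True

print("estimated box-counting dimension:",
      round(box_counting_dimension(img, box_sizes=[2, 4, 8, 16, 32]), 2))
```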