874 results for "development methods"


Relevance: 40.00%

Abstract:

Purpose: To develop a simple, fast and sensitive spectrophotometric method for the determination of tofisopam in tablet dosage form. Methods: Tofisopam, acting as an n-electron donor, was reacted with two π-acceptors, namely chloranilic acid (ChA) and 7,7,8,8-tetracyanoquinodimethane (TCNQ), to form charge-transfer complexes. The complexes were measured spectrophotometrically at 520 and 824 nm for ChA and TCNQ, respectively, and the reaction conditions were optimized. The developed method was compared with the Japanese Pharmacopeia method. Results: The calibration curves were linear in the ranges 25–125 and 30–150 μg/mL for ChA and TCNQ, respectively. The lower limits of detection were 8.0 and 10.0 μg/mL for ChA and TCNQ, respectively, while the slopes and intercepts of the calibration curves were 0.0025 and 0.011 for ChA, and 0.0115 and -0.237 for TCNQ. Conclusion: The developed methods for tofisopam have good accuracy and precision and are comparable to a standard pharmacopeial method. They can be applied to routine analysis and quality control.
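Assuming a linear calibration of the form A = slope·C + intercept with the slopes and intercepts reported above, a measured absorbance can be converted back to a tofisopam concentration. The short Python sketch below (function and variable names are illustrative, not from the paper; NumPy is not required) shows that back-calculation and a range check against the validated linear ranges.

```python
# Hypothetical back-calculation of tofisopam concentration from a measured
# absorbance, assuming the linear calibration A = slope * C + intercept
# with the slopes/intercepts reported above (concentrations in ug/mL).

CALIBRATIONS = {
    "ChA":  {"slope": 0.0025, "intercept": 0.011,  "range": (25.0, 125.0)},
    "TCNQ": {"slope": 0.0115, "intercept": -0.237, "range": (30.0, 150.0)},
}

def concentration_from_absorbance(absorbance: float, acceptor: str) -> float:
    """Invert the calibration line to estimate concentration (ug/mL)."""
    cal = CALIBRATIONS[acceptor]
    conc = (absorbance - cal["intercept"]) / cal["slope"]
    low, high = cal["range"]
    if not (low <= conc <= high):
        raise ValueError(f"{conc:.1f} ug/mL is outside the validated range {low}-{high}")
    return conc

# Example: an absorbance of 0.20 read at 520 nm for the ChA complex
print(concentration_from_absorbance(0.20, "ChA"))  # ~75.6 ug/mL
```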

Relevance: 40.00%

Abstract:

Recent marine long-offset transient electromagnetic (LOTEM) measurements yielded the offshore delineation of a fresh groundwater body beneath the seafloor in the region of Bat Yam, Israel. The LOTEM application was effective in detecting this freshwater body underneath the Mediterranean Sea and allowed an estimation of its seaward extent. However, the measured data set was insufficient to understand the hydrogeological configuration and the mechanism controlling the occurrence of this fresh groundwater. In particular, the lateral geometry of the freshwater boundary, which is important for hydrogeological modelling, could not be resolved; without such an understanding, rational management of this unexploited groundwater reservoir is not possible. Two new high-resolution marine time-domain electromagnetic methods are developed theoretically to derive the hydrogeological structure of the western aquifer boundary. The first is the Circular Electric Dipole (CED), the land-based analogue of the Vertical Electric Dipole (VED), which is commonly applied to detect resistive structures in the subsurface. Although the CED shows exceptional detectability characteristics in the step-off signal towards the sub-seafloor freshwater body, an actual application was not carried out within the scope of this study: the method suffers from insufficient signal strength to adequately delineate the resistive aquifer under realistic noise conditions, and modelling studies demonstrated that severe signal distortions are caused by the slightest geometrical inaccuracies. As a result, a successful application of the CED in Israel proved rather doubtful. A second method, the Differential Electric Dipole (DED), is developed as an alternative. Compared to the conventional marine time-domain electromagnetic system, which commonly applies a horizontal electric dipole transmitter, the DED is composed of two horizontal electric dipoles in an in-line configuration that share a common central electrode. Theoretically, the DED has detectability and resolution characteristics similar to the conventional LOTEM system, but its superior lateral resolution towards multi-dimensional resistivity structures makes an application desirable, and it is less susceptible to geometrical errors, making an application in Israel feasible. Within the scope of this thesis, the novel marine DED method is substantiated using several one-dimensional (1D) and multi-dimensional (2D/3D) modelling studies, with the main emphasis on the application in Israel. Preliminary resistivity models are derived from the previous marine LOTEM measurements and tested for a DED application. The DED method is effective in locating the two-dimensional resistivity structure at the western aquifer boundary; moreover, a prediction regarding the hydrogeological boundary conditions is feasible, provided a brackish water zone exists at the head of the interface. A seafloor-based DED transmitter/receiver system was designed and built at the Institute of Geophysics and Meteorology at the University of Cologne, and the first DED measurements were carried out in Israel in April 2016. The acquired data set is the first of its kind. The measured data were processed and subsequently interpreted using 1D inversion. The intended aim of interpreting both step-on and step-off signals failed due to the insufficient data quality of the latter.

Yet the 1D inversion models of the DED step-on signals clearly detect the freshwater body for receivers located close to the Israeli coast. Additionally, a lateral resistivity contrast observable in the 1D inversion models allows the seaward extent of this freshwater body to be constrained. A large-scale 2D modelling study followed the 1D interpretation: in total, 425,600 forward calculations were conducted to find a sub-seafloor resistivity distribution that adequately explains the measured data. The results indicate that the western aquifer boundary is located 3,600–3,700 m from the coast. Moreover, a brackish water zone of 3–5 Ω·m with a lateral extent of less than 300 m is likely located at the head of the freshwater aquifer. Based on these results, it is predicted that the sub-seafloor freshwater body is indeed open to the sea and may be vulnerable to seawater intrusion.
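As an illustration of the kind of exhaustive search such a 2D modelling study implies, the Python sketch below scans candidate aquifer-boundary positions and brackish-zone resistivities, evaluates each candidate with a forward-modelling routine, and keeps the model with the lowest error-weighted RMS misfit. The forward routine here is a toy analytic stand-in (the real 2D DED forward code is not reproduced), and all grid values, names and parameters are illustrative assumptions, not taken from the thesis.

```python
import itertools
import numpy as np

TIMES = np.logspace(-3, 0, 30)  # transient gate times in seconds (illustrative)

def ded_forward_response(boundary_m: float, brackish_ohmm: float) -> np.ndarray:
    """Toy stand-in for a 2D DED forward-modelling call (hypothetical):
    a decaying transient whose late-time level shifts with the candidate
    aquifer-boundary position and brackish-zone resistivity."""
    late_time_level = 1e-9 * (boundary_m / 3650.0) * (brackish_ohmm / 4.0)
    return late_time_level + 1e-8 * np.exp(-TIMES / 0.05)

def rms_misfit(synthetic, measured, error):
    """Error-weighted RMS misfit between synthetic and measured transients."""
    return float(np.sqrt(np.mean(((synthetic - measured) / error) ** 2)))

def grid_search(measured, error, boundaries, resistivities):
    """Test every (boundary, resistivity) pair and keep the best-fitting model."""
    best_misfit, best_model = np.inf, None
    for boundary_m, rho in itertools.product(boundaries, resistivities):
        misfit = rms_misfit(ded_forward_response(boundary_m, rho), measured, error)
        if misfit < best_misfit:
            best_misfit, best_model = misfit, (boundary_m, rho)
    return best_misfit, best_model

# Pretend the "measured" data come from a model at 3650 m / 4 Ohm*m.
measured = ded_forward_response(3650.0, 4.0)
error = 0.02 * np.abs(measured)
print(grid_search(measured, error,
                  boundaries=np.arange(3000.0, 4401.0, 50.0),
                  resistivities=np.arange(1.0, 10.5, 0.5)))
```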

Relevance: 40.00%

Abstract:

Soil organic matter (SOM) is important to fertility, since it performs several functions such as nutrient cycling, water and nutrient retention, and soil aggregation, in addition to serving as an energy source for biological activity. This study proposes modifications to the Embrapa, Walkley-Black, and Mebius methods that allow SOM to be determined by spectrophotometry, increasing their practicality. The sample mass was reduced from 500 mg to 200 mg, yielding an average reagent saving of 60% and a 91% decrease in the volume of residue generated for the three methods, without compromising accuracy or precision. Conditions for the Mebius method were optimized, and the digestion time of maximum SOM recovery was established by factorial design and response-surface analysis. The methods were validated by estimating figures of merit. Among the methods investigated, the optimized Mebius method was best suited for determining SOM, showing near 100% recovery.
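As an illustration of the response-surface step, the Python sketch below fits a quadratic model of recovery versus digestion time and locates its maximum. The time and recovery values are hypothetical placeholders, not data from the study, and NumPy is assumed as the fitting tool.

```python
import numpy as np

# Hypothetical (digestion time, % SOM recovery) pairs; the published study
# derived its optimum from a factorial design, not from these numbers.
times = np.array([10, 20, 30, 40, 50, 60], dtype=float)     # minutes
recovery = np.array([72, 85, 94, 99, 97, 90], dtype=float)  # percent

# Fit a second-order (quadratic) response curve in the single factor "time".
b2, b1, b0 = np.polyfit(times, recovery, deg=2)

# The maximum of r(t) = b2*t^2 + b1*t + b0 lies where dr/dt = 0.
t_opt = -b1 / (2 * b2)
r_opt = np.polyval([b2, b1, b0], t_opt)
print(f"Estimated optimum digestion time: {t_opt:.1f} min ({r_opt:.1f} % recovery)")
```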

Relevance: 40.00%

Abstract:

Heavy liquid metal cooled reactors are among the concepts fostered by the Generation IV International Forum (GIF) as potentially able to comply with stringent safety, economic, sustainability, proliferation-resistance and physical-protection requirements. The increasing interest in these innovative systems has highlighted the lack of tools specifically dedicated to their core design stage. The present PhD thesis summarizes a three-year effort to partially close this gap by rationally defining the role of codes in core design, creating a development methodology for core design-oriented codes (DOCs), and applying it to the design areas where it is most needed: fuel assembly thermal-hydraulics and fuel pin thermo-mechanics. Regarding the former, following the established methodology, the sub-channel code ANTEO+ was conceived. Initially restricted to the forced convection regime and subsequently extended to mixed convection, ANTEO+ has been shown, via a thorough validation campaign, to be a reliable tool for design applications. Concerning the fuel pin thermo-mechanics, the intent to include safety-related considerations at the outset of the pin dimensioning process gave rise to the safety-informed DOC TEMIDE. The proposed DOC development methodology was also applied to TEMIDE; given the complex interdependence among the numerous phenomena involved in an irradiated fuel pin, a sensitivity analysis over the anticipated application domain was performed to optimize the code's final structure. The development methodology was also tested in the verification and validation phases; the latter, owing to the scarcity of experiments truly representative of TEMIDE's application domain, was only a preliminary attempt to test TEMIDE's ability to fulfil the DOC requirements upon which it was built. In general, the capability of the proposed DOC development methodology to deliver tools that help the core designer make a preliminary definition of the system configuration has been demonstrated.

Relevance: 40.00%

Abstract:

In order to comply with more stringent environmental protection limits, besides increasing the share of electric and hybrid vehicles, in the mid-term the auto industry must also improve the efficiency of the internal combustion engine and the well-to-wheel efficiency of the fuel it employs. Achieving this target requires a deeper knowledge of the phenomena that influence mixture formation and of the chemical reactions involving new synthetic fuel components, knowledge that is complex and time-intensive to obtain purely by experimentation. Numerical simulations therefore play an important role in this development process, but their use is effective only if they are accurate enough to capture these variations. The most relevant models for simulating reacting mixture formation and the subsequent chemical reactions are investigated in the present work with a critical approach, in order to provide instruments for choosing the most suitable approaches in an industrial context, which is constrained by time and budget. To overcome these limitations, new methodologies were developed to combine detailed and simplified modelling techniques for phenomena involving chemical reactions and mixture formation under non-traditional conditions (e.g. water injection, biofuels). Through extensive use of machine learning and deep learning algorithms, several applications were revised or implemented with the aim of reducing the computing time of some traditional tasks by orders of magnitude. Finally, a complete workflow leveraging these new models was defined and used to evaluate the effects of different surrogate formulations of the same experimental fuel on a proof-of-concept GDI engine model.
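One common way to cut such computing times (not necessarily the specific approach taken in the thesis) is to train a data-driven emulator on a limited set of expensive simulation runs and then query the emulator instead of the full model. The sketch below illustrates the pattern with scikit-learn; the "expensive" routine is a cheap analytic stand-in so the example runs end to end, and all parameter names are purely illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def expensive_simulation(params: np.ndarray) -> float:
    """Stand-in for a slow CFD/chemistry calculation (hypothetical):
    a cheap analytic function is used here so the sketch is runnable."""
    rpm_norm, equivalence_ratio, water_fraction = params
    return float(np.sin(3 * rpm_norm) + equivalence_ratio ** 2 - 0.5 * water_fraction)

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 3))           # design of experiments
y_train = np.array([expensive_simulation(x) for x in X_train])

# Train a fast data-driven emulator on the expensive runs.
emulator = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)

# New operating points can now be screened almost instantly instead of
# re-running the slow solver for each one.
X_new = rng.uniform(0.0, 1.0, size=(5, 3))
print(emulator.predict(X_new))
```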

Relevance: 40.00%

Abstract:

Cultural heritage comprises complex and heterogeneous materials, from paintings to ancient remains. All ancient materials are exposed to the external environment, and this interaction produces changes driven by chemical, physical and biological phenomena. The organic fraction, especially the proteinaceous one, has a crucial role in all these materials: in archaeology, proteins reveal human habits; in artworks, they disclose techniques and help guide correct restoration. For these reasons, it is fundamental to develop methods that preserve the sample as much as possible while giving deeper knowledge of the deterioration processes. The research activities presented in this PhD thesis focus on the development of new immunochemical and spectroscopic approaches to detect and identify organic substances in artistic and archaeological samples. Organic components can be present in different cultural heritage materials as constituent elements (e.g., binders in paintings, collagen in bones), and knowledge of them is fundamental for a complete understanding of past life, degradation processes and appropriate restoration approaches. The combination of an immunological approach with chemiluminescence detection and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) allowed sensitive and selective localization of collagen and elements in ancient bones and teeth. Near-infrared spectroscopy and hyperspectral imaging were applied in combination with chemometric data analysis as non-destructive methods for prescreening bones and localizing collagen. Moreover, an investigation of amino acids in enamel is proposed in order to clarify how tooth biomolecules survive over time, through the optimization and application of high-performance liquid chromatography to modern and ancient enamel powder. New portable biosensors were developed for the identification of ovalbumin in paintings by combining a biocompatible gellan gel with electro-immunochemical sensors, so that painting binders can be extracted and identified through contact between the gel and the painting and between the gel and the electrodes alone.
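As an illustration of the chemometric step, the Python sketch below fits a partial least squares (PLS) regression relating NIR spectra to collagen content and estimates a cross-validated error. The spectra and reference values are random placeholders, and neither the model settings nor the variable names come from the thesis; scikit-learn and NumPy are assumed.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical data: 40 bone NIR spectra (500 wavelengths each) with reference
# collagen contents (% w/w) measured by a destructive method.
rng = np.random.default_rng(1)
spectra = rng.normal(size=(40, 500))
collagen = rng.uniform(0.0, 20.0, size=40)

# PLS regression is a common chemometric model for relating spectra
# to a quantitative property such as collagen content.
pls = PLSRegression(n_components=5)
predicted = cross_val_predict(pls, spectra, collagen, cv=5).ravel()

rmsecv = np.sqrt(np.mean((predicted - collagen) ** 2))
print(f"RMSECV on the hypothetical set: {rmsecv:.2f} % collagen")
```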

Relevance: 40.00%

Abstract:

Noise is a constant presence in measurements; its origin is related to the microscopic properties of matter. Since the seminal work of Brown in 1828, the study of stochastic processes has gained increasing interest with the development of new mathematical and analytical tools. In recent decades, the central role that noise plays in chemical and physiological processes has come to be recognized. The dual role of noise as nuisance and resource pushes towards the development of new decomposition techniques that divide a signal into its deterministic and stochastic components. In this thesis I show how methods based on Singular Spectrum Analysis (SSA) have the right properties to fulfil this requirement. During my work I applied SSA to different signals of interest in chemistry: I developed a novel iterative procedure for the denoising of powder X-ray diffractograms, and I denoised two-dimensional images from electrochemiluminescence (ECL) imaging experiments on micro-beads, obtaining new insight into the ECL mechanism. I also used Principal Component Analysis to investigate the relationship between brain electrophysiological signals and voice emission.
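For reference, the Python sketch below implements the basic SSA decomposition on which such denoising builds: embedding the series in a trajectory (Hankel) matrix, singular value decomposition, truncation to the leading components, and diagonal averaging. It is a minimal generic sketch, not the iterative procedure developed in the thesis, and the window length and component count are illustrative choices.

```python
import numpy as np

def ssa_denoise(x: np.ndarray, window: int, n_components: int) -> np.ndarray:
    """Basic SSA denoising: embed, keep the leading singular components,
    and recover a series by diagonal averaging (Hankelization)."""
    n = len(x)
    k = n - window + 1
    # Trajectory matrix: column j is the lagged vector x[j : j + window]
    traj = np.column_stack([x[j:j + window] for j in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    # Rank-r approximation retaining the "deterministic" part of the signal
    approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components]
    # Diagonal averaging maps the matrix back to a one-dimensional series
    rec = np.zeros(n)
    counts = np.zeros(n)
    for i in range(window):
        for j in range(k):
            rec[i + j] += approx[i, j]
            counts[i + j] += 1
    return rec / counts

# Example: denoising a noisy sine wave
t = np.linspace(0, 4 * np.pi, 400)
noisy = np.sin(t) + 0.3 * np.random.default_rng(0).normal(size=t.size)
clean = ssa_denoise(noisy, window=80, n_components=2)
```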

Relevance: 40.00%

Abstract:

Following the approval of the 2030 Agenda for Sustainable Development in 2015, sustainability became a hotly debated topic. In order to build a better and more sustainable future by 2030, this agenda addresses several global issues, including inequality, climate change, peace, and justice, in the form of 17 Sustainable Development Goals (SDGs) that should be understood and pursued by nations, corporations, institutions, and individuals. In this thesis, we researched how to exploit and integrate Human-Computer Interaction (HCI) and Data Visualization to promote knowledge and awareness about SDG 8, which aims to promote sustained, inclusive, and sustainable economic growth, full and productive employment, and decent work for all. In particular, we focused on three targets: the green economy; sustainable tourism; and employment, decent work for all, and social protection. The primary goal of this research is to determine whether HCI approaches can be used to create and validate interactive data visualizations that serve as helpful decision-making aids for specific groups and raise their awareness of public-interest issues. To accomplish this goal, we analyzed four case studies. In the first two, aimed at promoting knowledge and awareness of green economy issues, we investigated Human-Building Interaction inside a Smart Campus and the dematerialization process inside a University. In the third, we focused on smart tourism, investigating the relationship between locals and tourists in order to create meaningful connections and promote more sustainable tourism. In the fourth, we explored the industrial context to highlight sustainability policies inside well-known companies. This research is built on the hypothesis that interactive data visualization tools can make communities aware of the sustainability aspects related to SDG 8 and its targets. Two research questions are addressed: "How can awareness about SDG 8 and its targets be promoted through interactive data visualizations?" and "To what extent are these interactive data visualizations effective?"

Relevance: 40.00%

Abstract:

Machine learning makes computers capable of performing tasks that typically require human intelligence. A domain where it is having a considerable impact is the life sciences, where it makes it possible to devise new biological analysis protocols, develop patient treatments more efficiently and quickly, and reduce healthcare costs. This thesis presents new machine learning methods and pipelines for the life sciences, focusing on unsupervised learning. At the methodological level, two methods are presented. The first, "Ab Initio Local Principal Path", is a revised and improved version of a pre-existing manifold-learning algorithm. The second is an improvement of the Import Vector Domain Description (one-class learning) through the Kullback-Leibler divergence; it hybridizes kernel methods with deep learning, obtaining a scalable solution, an improved probabilistic model, and state-of-the-art performance. Both methods are tested through several experiments, with a central focus on their relevance to the life sciences, and the results show that they improve on the performance of their previous versions. At the applicative level, two pipelines are presented. The first, for the analysis of RNA-Seq datasets (both transcriptomic and single-cell data), is aimed at identifying genes that may be involved in biological processes (e.g., the transition of tissue from normal to cancerous); within this project, an R package is released on CRAN to make the pipeline accessible to the bioinformatics community through high-level APIs. The second pipeline belongs to the drug discovery domain and is useful for identifying druggable pockets, namely regions of a protein with a high probability of binding a small molecule (a drug). Both pipelines achieve remarkable results. Lastly, a detour application is developed to identify the strengths and limitations of the "Principal Path" algorithm by analyzing vector spaces induced by Convolutional Neural Networks; this application is conducted in the music and visual arts domains.
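The Kullback-Leibler divergence mentioned above measures how one probability distribution diverges from another. As a point of reference only (the thesis applies it inside a kernel/deep-learning hybrid, not in this simple discrete form), the sketch below computes it for two small discrete distributions; the numbers are illustrative.

```python
import numpy as np

def kl_divergence(p: np.ndarray, q: np.ndarray) -> float:
    """Discrete Kullback-Leibler divergence D(P || Q) = sum_i p_i * log(p_i / q_i).
    Assumes both inputs are probability vectors with q > 0 wherever p > 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: how far a model distribution Q is from a target distribution P.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])
print(kl_divergence(p, q))  # ~0.085 nats
```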

Relevance: 40.00%

Abstract:

In medicine, innovation depends on a better knowledge of the mechanisms of the human body, a complex system of multi-scale constituents, and unraveling the complexity underlying diseases proves challenging. A deep understanding of these inner workings requires dealing with much heterogeneous information. Exploring the molecular status and the organization of genes, proteins, and metabolites provides insight into what drives a disease, from aggressiveness to curability. Molecular constituents, however, are only the building blocks of the human body and cannot currently tell the whole story of diseases; this is why attention is now growing towards the simultaneous exploitation of multi-scale information, and holistic methods are drawing interest as a way to address the problem of integrating heterogeneous data. The heterogeneity may derive from the diversity across data types and from the diversity within diseases. Here, four studies conducted data integration using custom-designed workflows that implement novel methods and views to tackle the heterogeneous characterization of diseases. The first study was devoted to determining shared gene regulatory signatures in onco-hematology, and it showed partial co-regulation across blood-related diseases. The second study focused on Acute Myeloid Leukemia and refined the unsupervised integration of genomic alterations, which turned out to better resemble clinical practice. In the third study, network integration for atherosclerosis demonstrated, as a proof of concept, the impact of network intelligibility when modelling heterogeneous data, which was shown to accelerate the identification of new potential pharmaceutical targets. Lastly, the fourth study introduced a new method to integrate multiple data types into a single latent heterogeneous representation, which facilitated the selection of the data types most important for predicting the tumour stage of invasive ductal carcinoma. The results of these four studies lay the groundwork for easing the detection of new biomarkers that are ultimately beneficial to medical practice and to the ever-growing field of Personalized Medicine.

Relevance: 30.00%

Abstract:

A new flow procedure based on multicommutation with chemiluminometric detection was developed to quantify gentamicin sulphate in pharmaceutical formulations. The approach is based on gentamicin's ability to inhibit the chemiluminometric reaction between luminol and hypochlorite in alkaline medium, causing a decrease in the analytical signal. The inhibition of the analytical signal is proportional to the concentration of gentamicin sulphate within a linear range of 1 to 4 μg mL⁻¹, with a coefficient of variation <3%. A sample throughput of 55 samples h⁻¹ was obtained. The developed method is sensitive, simple, reproducible, and inexpensive, with low reagent consumption; when applied to the analysis of pharmaceutical formulations (eye drops and injections), it gave results with RSDs between 1.10 and 4.40%.
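Quantification with such an inhibition-based method amounts to fitting a calibration line to standards and inverting it for samples. The sketch below shows that step in Python with placeholder standard readings; the inhibition percentages are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical calibration standards within the reported linear range
# (1-4 ug/mL); the inhibition values are placeholders, not data from the paper.
conc_std = np.array([1.0, 2.0, 3.0, 4.0])            # ug/mL gentamicin sulphate
inhibition_std = np.array([18.0, 35.0, 53.0, 70.0])  # % decrease in CL signal

# Least-squares line: inhibition = slope * concentration + intercept
slope, intercept = np.polyfit(conc_std, inhibition_std, deg=1)

def gentamicin_concentration(inhibition_pct: float) -> float:
    """Invert the calibration line to estimate a sample concentration (ug/mL)."""
    return (inhibition_pct - intercept) / slope

print(f"{gentamicin_concentration(45.0):.2f} ug/mL")  # example sample reading
```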

Relevance: 30.00%

Abstract:

Background: The tomato (Solanum lycopersicum L.) is both an economically important food crop and an ideal dicot model for investigating physiological phenomena that cannot be studied in Arabidopsis thaliana. Due to the great diversity of tomato cultivars used by the research community, it is often difficult to reliably compare phenotypes. The lack of tomato developmental mutants in a single genetic background prevents the stacking of mutations to facilitate analysis of double and multiple mutants, which is often required for elucidating developmental pathways. Results: We took advantage of the small size and rapid life cycle of the tomato cultivar Micro-Tom (MT) to create near-isogenic lines (NILs) by introgressing into this genetic background a suite of hormonal and photomorphogenetic mutations (altered sensitivity or endogenous levels of auxin, ethylene, abscisic acid, gibberellin, brassinosteroid, and light response). To demonstrate the usefulness of this collection, we compared developmental traits between the produced NILs, and all expected mutant phenotypes were expressed. We also created NILs harboring the wild-type alleles for dwarf, self-pruning and uniform fruit, the mutations characteristic of MT. This broadens the applications both of the mutant collection presented here and of MT as a genetic model system. Conclusions: The community resource presented here is a useful toolkit for plant research, particularly for future studies in plant development, which will require simultaneous observation of the effects of various hormones, signaling pathways and crosstalk.

Relevance: 30.00%

Abstract:

Premise of the study: Microsatellite primers were developed for castor bean (Ricinus communis L.) to investigate genetic diversity and population structure and to support germplasm management. Methods and Results: Eleven microsatellite loci were isolated using an enrichment cloning protocol and used to characterize castor bean germplasm from the collection at the Instituto Agronomico de Campinas (IAC). In a survey of 76 castor bean accessions, the investigated loci displayed polymorphism ranging from two to five alleles per locus. Conclusions: The information derived from the microsatellite markers led to significant gains in conserved allelic richness and provides support for the implementation of several molecular breeding strategies for castor bean.

Relevance: 30.00%

Abstract:

Premise of the study: Microsatellite primers were developed for Aulonemia aristulata, an endangered species of economic interest, to further describe its genetic variability and population structure; cross-amplification was also tested in 18 other bamboo species. Methods and Results: Using an enriched genomic library, 13 microsatellite loci were isolated and characterized in A. aristulata, seven of which were polymorphic. Twelve markers cross-amplified in at least ten of the tested bamboo species. Conclusions: These markers will be useful for studies on the genetic diversity and structure of A. aristulata, which are important for future conservation, management and breeding programs of this species.