935 results for Shannon Sampling Theorem
Abstract:
This thesis covers sampling and analytical procedures for isocyanates (R-NCO) and amines (R-NH₂), two kinds of chemicals frequently used in association with the polymeric material polyurethane (PUR). Exposure to isocyanates may result in respiratory disorders and dermal sensitisation, and they are one of the main causes of occupational asthma. Several of the aromatic diamines associated with PUR production are classified as suspected carcinogens. Hence, the presence of these chemicals in different exposure situations must be monitored. In the context of determining isocyanates in air, the methodologies included derivatisation with the reagent di-n-butylamine (DBA) upon collection and subsequent determination using liquid chromatography (LC) and mass spectrometric detection (MS). A user-friendly solvent-free sampler for collection of airborne isocyanates was developed as an alternative to a more cumbersome impinger-filter sampling technique. The combination of the DBA reagent with MS detection techniques revealed several new exposure situations for isocyanates, such as isocyanic acid during thermal degradation of PUR and urea-based resins. Further, a method for characterising isocyanates in technical products used in the production of PUR was developed. This enabled determination of isocyanates in air for which pure analytical standards are missing. Tandem MS (MS/MS) determination of isocyanates in air below 10⁻⁶ of the threshold limit values was achieved. As for the determination of amines, the analytical methods included derivatisation into pentafluoropropionic amide or ethyl carbamate ester derivatives and subsequent MS analysis. Several amines in biological fluids, as markers of exposure for either the amines themselves or the corresponding isocyanates, were determined by LC-MS/MS at the amol level. In aqueous extraction solutions of flexible PUR foam products, toluene diamine and related compounds were found. In conclusion, this thesis demonstrates the usefulness of well-characterised analytical procedures and techniques for determination of hazardous compounds. Without reliable and robust methodologies there is a risk that exposure levels will be underestimated or, even worse, that relevant compounds will be completely missed.
Abstract:
The purpose of this paper is to present some fixed point theorems for Meir-Keeler contractions in a complete metric space endowed with a partial order. MSC: 47H10.
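For reference only (the standard, unordered definition, not quoted from the paper): a self-map T of a metric space (X, d) is a Meir-Keeler contraction if
\[ \forall\,\varepsilon > 0 \;\; \exists\,\delta > 0 : \quad \varepsilon \le d(x,y) < \varepsilon + \delta \;\Longrightarrow\; d(Tx,Ty) < \varepsilon , \]
a condition weaker than Banach's contraction condition that still guarantees a unique fixed point on a complete metric space (Meir and Keeler, 1969); the paper adapts this setting to spaces endowed with a partial order.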
Abstract:
The purpose of this paper is to provide sufficient conditions for the existence of a unique best proximity point for Geraghty contractions. Our paper provides an extension of a result due to Geraghty (Proc. Am. Math. Soc. 40:604-608, 1973).
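For reference only (standard definitions, not quoted from the paper): a mapping T is a Geraghty contraction if
\[ d(Tx, Ty) \le \beta\big(d(x,y)\big)\, d(x,y), \]
where \(\beta : [0,\infty) \to [0,1)\) satisfies \(\beta(t_n) \to 1 \Rightarrow t_n \to 0\); for a non-self mapping \(T : A \to B\), a best proximity point is a point \(x^* \in A\) with \(d(x^*, Tx^*) = d(A,B) := \inf\{ d(a,b) : a \in A,\ b \in B \}\).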
Abstract:
Matita (which means "pencil" in Italian) is a new interactive theorem prover under development at the University of Bologna. When compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, namely the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is used as the basis of another mainstream proof assistant, Coq, with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script to store the textual commands sent to the system. In the tradition of LCF, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script management technique that underlies the popularity of the Proof General generic interface for interactive theorem provers: while editing a script, the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands. Matita has been developed from scratch over the past 8 years by several members of the Helm research group, the author of this thesis being one of them. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to the topic of user interaction with theorem provers, and to which the author of this thesis was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below. Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae. Since mathematical notation is ambiguous, parsing formulae typeset as mathematicians like to write them down on paper is a challenging task; a challenge neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine which permits typing formulae in the familiar mathematical notation. Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics together. With tacticals, scripts can be made shorter, more readable, and more resilient to changes. Unfortunately they are de facto incompatible with state-of-the-art user interfaces based on script management. Such interfaces do not permit positioning the execution point inside complex tacticals, thus introducing a trade-off between the usefulness of structuring scripts and a tedious big-step execution behavior during script replaying. In Matita we break this trade-off with tinycals: an alternative to a subset of LCF tacticals which can be evaluated in a more fine-grained manner. Extensible yet meaningful notation. Proof assistant users often face the need to create new mathematical notation in order to ease the use of new concepts. The framework used in Matita for dealing with extensible notation both accounts for high-quality bidimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms.
Using our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae, acting on their rendered forms, is possible too. Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners. Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) which can independently try to complete open sub-goals of a proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can interactively or automatically apply them to the current proof. Another innovative aspect of Matita, only marginally touched by this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.
Abstract:
Interactive theorem provers (ITPs for short) are tools whose final aim is to certify proofs written by human beings. To reach that objective they have to fill the gap between the high-level language used by humans for communicating and reasoning about mathematics and the lower-level language that a machine is able to "understand" and process. The user perceives this gap in terms of missing features or inefficiencies. The developer tries to accommodate the user's requests without increasing the already high complexity of these applications. We believe that satisfactory solutions can only come from a strong synergy between users and developers. We devoted most of our PhD to designing and developing the Matita interactive theorem prover. The software was born in the Computer Science Department of the University of Bologna as the result of composing together all the technologies developed by the HELM team (to which we belong) for the MoWGLI project. The MoWGLI project aimed at giving access through the web to the libraries of formalised mathematics of various interactive theorem provers, taking Coq as the main test case. The motivations for giving life to a new ITP are: • to study the architecture of these tools, with the aim of understanding the sources of their complexity; • to exploit such knowledge to experiment with new solutions that, for backward compatibility reasons, would be hard (if not impossible) to test on a widely used system like Coq. Matita is based on the Curry-Howard isomorphism, adopting the Calculus of Inductive Constructions (CIC) as its logical foundation. Proof objects are thus, to some extent, compatible with the ones produced with the Coq ITP, which is itself able to import and process the ones generated using Matita. Although the systems have a lot in common, they share no code at all, and even most of the algorithmic solutions are different. The thesis is composed of two parts in which we respectively describe our experience as a user and as a developer of interactive provers. In particular, the first part is based on two different formalisation experiences: • our internship with the Mathematical Components team (INRIA), which is formalising the finite group theory required to attack the Feit-Thompson Theorem. To tackle this result, which gives an effective classification of finite groups of odd order, the team adopts the SSReflect Coq extension, developed by Georges Gonthier for the proof of the four colour theorem; • our collaboration with the D.A.M.A. project, whose goal is the formalisation of abstract measure theory in Matita, leading to a constructive proof of Lebesgue's Dominated Convergence Theorem. The most notable issues we faced, analysed in this part of the thesis, are the following: the difficulties arising when using "black box" automation in large formalisations; the impossibility for a user (especially a newcomer) to master the context of a library of already formalised results; the uncomfortable big-step execution of proof commands historically adopted in ITPs; and the difficult encoding of mathematical structures with a notion of inheritance in a type theory without subtyping, such as CIC.
In the second part of the manuscript many of these issues are analysed through the lens of an ITP developer, describing the solutions we adopted in the implementation of Matita to solve these problems: integrated searching facilities to assist the user in handling large libraries of formalised results; a small-step execution semantics for proof commands; a flexible implementation of coercive subtyping allowing multiple inheritance with shared substructures; and automatic tactics, integrated with the searching facilities, that generate proof commands (and not only proof objects, which are usually kept hidden from the user), one of which is specifically designed to be user-driven.
Abstract:
Proper functioning of ion channels is a prerequisite for a normal cell, and disorders involving ion channels, or channelopathies, underlie many human diseases. Long QT syndrome (LQTS), for example, may arise from malfunctioning of the hERG channel, caused either by the binding of drugs or by mutations in the HERG gene. In the first part of this thesis I present a framework to investigate the mechanism of ion conduction through the hERG channel. The free energy profile governing the elementary steps of ion translocation in the pore was computed by means of umbrella sampling simulations. Compared to previous studies, we detected a different dynamic behavior: according to our data, hERG is more likely to mediate a conduction mechanism which has been referred to as "single-vacancy-like" by Roux and coworkers (2001), rather than a "knock-on" mechanism. The same protocol was applied to a model of hERG carrying the Gly628Ser mutation, which has been found to cause congenital LQTS. The results provided interesting insights into the reasons for the malfunctioning of the mutant channel. Since they have critical functions in the viral life cycle, viral ion channels, such as the M2 proton channel, are considered attractive targets for antiviral therapy. A deep knowledge of the mechanisms that the virus employs to survive in the host cell is of primary importance in the identification of new antiviral strategies. In the second part of this thesis I shed light on the role that M2 plays in the control of the electrical potential inside the virus, charge equilibration being a condition required to allow proton influx. The ion conduction through M2 was simulated using the metadynamics technique. Based on our results, we suggest that an anion-mediated cation-proton exchange, as well as a direct anion-proton exchange, could both contribute to explaining the activity of the M2 channel.
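For background only (generic umbrella-sampling relations; the force constants and window positions are assumptions, not values from the thesis): each window i restrains the permeating ion along the reaction coordinate ξ with a harmonic bias, and the unbiased free energy profile is recovered from the combined, reweighted window histograms (e.g. with WHAM),
\[ w_i(\xi) = \tfrac{1}{2}\, k_i\, (\xi - \xi_i)^2, \qquad F(\xi) = -k_B T \ln P(\xi), \]
where \(P(\xi)\) is the unbiased probability distribution of the ion position along the pore axis.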
Abstract:
An extensive sample (2%) of private vehicles in Italy is equipped with a GPS device that periodically measures their position and dynamical state for insurance purposes. Having access to this type of data allows the development of theoretical and practical applications of great interest: the real-time reconstruction of the traffic state in a certain region, the development of accurate models of vehicle dynamics, and the study of the cognitive dynamics of drivers. In order for these applications to be possible, we first need the ability to reconstruct the paths taken by vehicles on the road network from the raw GPS data. In fact, these data are affected by positioning errors and are often widely spaced (~2 km apart). For these reasons, the task of path identification is not straightforward. This thesis describes the approach we followed to reliably identify vehicle paths from this kind of low-sampling-rate data. The problem of matching data with roads is solved with a Bayesian maximum-likelihood approach, while the identification of the path taken between two consecutive GPS measurements is performed with a specifically developed optimal routing algorithm based on the A* algorithm. The procedure was applied to an off-line urban data sample and proved to be robust and accurate. Future developments will extend the procedure to real-time execution and nation-wide coverage.
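As a purely illustrative sketch (the graph representation, cost function and straight-line heuristic below are assumptions for illustration, not the thesis' actual implementation), a generic A* search over a road graph, of the kind used to route between consecutive map-matched GPS points, could look like this:

import heapq
import math

def euclidean(p, q):
    # Straight-line distance, used as an admissible heuristic on planar coordinates.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def a_star(graph, coords, start, goal):
    """graph: node -> list of (neighbour, edge_length); coords: node -> (x, y)."""
    open_set = [(euclidean(coords[start], coords[goal]), 0.0, start, [start])]
    best_cost = {start: 0.0}
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path, g
        for neighbour, length in graph.get(node, []):
            new_g = g + length
            if new_g < best_cost.get(neighbour, float("inf")):
                best_cost[neighbour] = new_g
                f = new_g + euclidean(coords[neighbour], coords[goal])
                heapq.heappush(open_set, (f, new_g, neighbour, path + [neighbour]))
    return None, float("inf")

# Toy example: route from junction "a" to "c" on a three-node graph.
graph = {"a": [("b", 1.0)], "b": [("c", 1.0)], "c": []}
coords = {"a": (0, 0), "b": (1, 0), "c": (2, 0)}
print(a_star(graph, coords, "a", "c"))   # (['a', 'b', 'c'], 2.0)

Running such a search between every pair of consecutive map-matched fixes and concatenating the results yields a candidate trajectory for the whole trip.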
Abstract:
Summary of PhD thesis, Jan Pollmann: This thesis focuses on global-scale measurements of light, reactive non-methane hydrocarbons (NMHCs), in the volatility range from ethane to toluene, with a special focus on ethane, propane, isobutane, butane, isopentane and pentane. Even though they occur only at the ppt level (pmol mol⁻¹) in the remote troposphere, these species can yield insight into key atmospheric processes. An analytical method was developed and subsequently evaluated to analyze NMHCs from the NOAA ESRL cooperative air sampling network. Potential analytical interferences from other atmospheric trace gases (water vapor and ozone) were carefully examined. The analytical parameters accuracy and precision were analyzed in detail. It was shown that more than 90% of the data points meet the Global Atmosphere Watch (GAW) data quality objective. Trace gas measurements from 28 measurement stations were used to derive the global atmospheric distribution profile for four NMHCs (ethane, propane, isobutane, butane). A close comparison of the derived ethane data with previously published reports showed that the northern hemispheric ethane background mixing ratio has declined by approximately 30% since 1990. No such change was observed for southern hemispheric ethane. The NMHC data and trace gas data supplied by NOAA ESRL were used to estimate local, diurnally averaged hydroxyl radical (OH) mixing ratios by variability analysis. The variability-derived OH was found to be in good agreement with directly measured and modeled OH mixing ratios outside the tropics. Tropical OH was on average two times higher than predicted by the model. Variability analysis was also used to assess the effect of chlorine radicals on atmospheric oxidation chemistry. It was found that Cl is probably not of significant relevance on a global scale.
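As background, one standard form of such a variability-lifetime analysis (a generic relation in the spirit of Jobson et al., 1998; the thesis' exact fitting procedure may differ) links the standard deviation of the log mixing ratios of a compound X to its local lifetime against OH:
\[ \sigma_{\ln X} = A\, \tau_X^{-b}, \qquad \tau_X \approx \frac{1}{k_{\mathrm{OH}+X}\,[\mathrm{OH}]}, \]
so that fitting A and b over several compounds with known rate constants yields an estimate of the mean OH concentration sampled by the air masses.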
Abstract:
In this study a new, fully non-linear approach to Local Earthquake Tomography is presented. Local Earthquake Tomography (LET) is a non-linear inversion problem that allows the joint determination of earthquake parameters and velocity structure from the arrival times of waves generated by local sources. Since the early developments of seismic tomography, several inversion methods have been developed to solve this problem in a linearized way. In the framework of Monte Carlo sampling, we developed a new code based on the Reversible Jump Markov chain Monte Carlo sampling method (RJ-MCMC). It is a trans-dimensional approach in which the number of unknowns, and thus the model parameterization, is treated as one of the unknowns. I show that our new code overcomes major limitations of linearized tomography, opening a new perspective in seismic imaging. Synthetic tests demonstrate that our algorithm is able to produce a robust and reliable tomography without the need to make subjective a priori assumptions about starting models and parameterization. Moreover, it provides a more accurate estimate of the uncertainties in the model parameters. Therefore, it is very suitable for investigating the velocity structure in regions that lack accurate a priori information. Synthetic tests also reveal that the absence of any regularization constraints allows more information to be extracted from the observed data, and that the velocity structure can be detected also in regions where the density of rays is low and standard linearized codes fail. I also present high-resolution Vp and Vp/Vs models of two extensively investigated regions: the Parkfield segment of the San Andreas Fault (California, USA) and the area around the Alto Tiberina fault (Umbria-Marche, Italy). In both cases, the models obtained with our code show a substantial improvement in data fit compared with models obtained from the same data sets with linearized inversion codes.
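For context, the generic reversible-jump acceptance rule underlying trans-dimensional samplers of this kind (Green, 1995), given here only as background and not as the thesis' own notation: a move from model m to a proposed model m' (possibly of different dimension) is accepted with probability
\[ \alpha(m \to m') = \min\left\{ 1,\; \frac{p(d \mid m')\, p(m')\, q(m \mid m')}{p(d \mid m)\, p(m)\, q(m' \mid m)}\; \lvert J \rvert \right\}, \]
where \(p(d \mid \cdot)\) is the likelihood of the arrival-time data, \(p(\cdot)\) the prior, \(q\) the proposal densities (e.g. birth, death or perturbation of velocity cells and hypocentre parameters), and \(\lvert J \rvert\) the Jacobian of the dimension-changing transformation.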
Abstract:
For the detection of hidden objects by low-frequency electromagnetic imaging, the Linear Sampling Method works remarkably well despite the fact that its rigorous mathematical justification is still incomplete. In this work, we give an explanation for this good performance by showing that in the low-frequency limit the measurement operator fulfills the assumptions for the fully justified variant of the Linear Sampling Method, the so-called Factorization Method. We also show how the method has to be modified in the physically relevant case of electromagnetic imaging with divergence-free currents. We present numerical results to illustrate our findings, and to show that similar performance can be expected for the case of conducting objects and layered backgrounds.
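As general background (the standard form of the Factorization Method test, not a formula quoted from this work): from the measurement operator F one forms the auxiliary operator F_# and tests each sampling point z via a Picard criterion,
\[ F_\# := \lvert \operatorname{Re} F \rvert + \operatorname{Im} F, \qquad z \in D \;\Longleftrightarrow\; \sum_j \frac{\lvert \langle \varphi_z, \psi_j \rangle \rvert^2}{\lambda_j} < \infty, \]
where \((\lambda_j, \psi_j)\) is an eigensystem of \(F_\#\) and \(\varphi_z\) is the test function associated with z; the series is evaluated numerically to decide whether z lies inside the scatterer D.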
Abstract:
We consider a simple (but fully three-dimensional) mathematical model for the electromagnetic exploration of buried, perfectly electrically conducting objects within the soil underground. Moving an electric device parallel to the ground at constant height in order to generate a magnetic field, we measure the induced magnetic field within the device, and factor the underlying mathematics into a product of three operations which correspond to the primary excitation, some kind of reflection on the surface of the buried object(s), and the corresponding secondary excitation, respectively. Using this factorization we are able to give a justification of the so-called sampling method from inverse scattering theory for this particular set-up.
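Schematically (placeholder notation introduced here only for orientation, not the paper's own), such a factorization of the measurement operator M takes the form
\[ M = L_2\, T\, L_1, \]
with \(L_1\) modelling the primary excitation of the buried object by the device, \(T\) the reflection at the object's surface, and \(L_2\) the secondary excitation producing the induced field measured back in the device; it is this structure that justifies the range and spectral tests used by sampling-type reconstruction methods.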
Abstract:
We investigate the numerical solution of the inverse scattering problem of reconstructing the shape, position, and number of finitely many perfectly conducting objects from near-field measurements of time-harmonic electromagnetic waves with metal detectors. We assume that the objects are located entirely in the lower half-space of an unbounded two-layered background medium. We further assume that the upper half-space is filled with air and the lower half-space with soil. We first consider the physical foundations of electromagnetic waves, from which we derive a simplified mathematical model in which the electromagnetic field is measured directly. We then extend this model to the measurement of the electromagnetic field of transmitting coils by means of receiving coils. For the simplified model, using the theory of the associated direct scattering problem, we develop a non-iterative method based on the idea of the so-called Factorization Method. We then transfer this method to the extended model. We propose an implementation of the reconstruction method and demonstrate its applicability in a series of numerical experiments. Furthermore, we investigate several modifications of the method to improve the reconstructions and to reduce the computation time.
Abstract:
Plant communities on weathered rock and outcrops are characterized by high species richness (Dengler 2006) and often persist on small and fragmented surfaces. Yet very few studies have examined the relationships between heterogeneity and plant diversity at small scales, in particular in nutrient-poor and low-productivity environments (Shmida and Wilson 1985, Lundholm 2003). In order to assess these relationships both in space and time, two different approaches were employed in the present study, in two gypsum outcrops of the Northern Apennines. Diachronic and synchronic samplings were performed from April 2012 to March 2013. A 50×50 cm plot was used as the basic sampling unit in both samplings. The diachronic survey aims to investigate the seasonal patterning of plant diversity through image analysis techniques integrated with field data, also considering the seasonal climatic trend and the substrate quality and its variation in time. The purpose of the second, synchronic sampling was to describe the plant diversity pattern as a function of environmental heterogeneity, in terms of substrate typologies, soil depth and topographic features. Results showed that the response of the diversity pattern depends on resource availability, on environmental heterogeneity, and on the manner in which the different taxonomic groups access these resources during the year. Species richness and Shannon diversity were positively affected by increasing substrate heterogeneity. Furthermore, a good turnover in seasonal species occurrence was detected. This vegetation may be described by the coexistence of three groups of species, which create a gradient from early colonization stages, characterized by steeper slopes and a predominance of bare rock, to situations with more developed soil.
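As a minimal sketch of the diversity measures mentioned above (generic formulas; the abundance values in the example are invented for illustration and are not data from the study), species richness and the Shannon index H' = -Σ p_i ln p_i can be computed per plot as follows:

import math

def richness_and_shannon(abundances):
    # abundances: cover or individual counts recorded per species in one 50x50 cm plot.
    counts = [a for a in abundances if a > 0]
    total = sum(counts)
    richness = len(counts)
    shannon = -sum((a / total) * math.log(a / total) for a in counts)
    return richness, shannon

# Example: a plot with four species and these relative covers.
print(richness_and_shannon([12, 5, 3, 1]))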
Abstract:
In this work a coarse-grained (CG) simulation model for peptides in aqueous solution is developed. In a CG procedure, the number of degrees of freedom of the system is reduced so that larger systems can be studied on longer time scales. The interaction potentials of the CG model are constructed so that the peptide conformations of a higher-resolution (atomistic) model are reproduced. This work investigates the influence of different bonded interaction potentials in the CG simulation, in particular with respect to how well the conformational equilibrium of the atomistic simulation can be reproduced. In a CG procedure one loses, by construction, microscopic structural details of the peptide, for example correlations between degrees of freedom along the peptide chain. In this dissertation it is shown that these "lost" properties can be restored by a back-mapping procedure in which the atomistic degrees of freedom are reinserted into the CG structures. This succeeds as long as the conformations of the CG model agree fundamentally well with the atomistic level. The aforementioned correlations play a large role in the formation of secondary structures and are therefore of decisive importance for a realistic ensemble of peptide conformations. It is shown that good agreement between CG and atomistic chain conformations requires special bonded interactions, such as 1-5 bond and 1,3,5-angle potentials. The intramolecular parameters (i.e. bonds, angles, torsions) that were parameterized for short oligopeptides are transferable to longer peptide sequences. However, these bonded interactions can only be used in combination with the non-bonded interaction potentials employed during the parameterization; they cannot, for example, simply be combined with a different water model. Since the energy landscape in CG simulations is smoother than in the atomistic model, the dynamics is accelerated. This acceleration differs between dynamical processes, for example between different kinds of motion (rotation and translation). This is an important aspect in the study of the kinetics of structure formation processes, for example peptide aggregation.
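As a rough illustration of how structure-based CG bonded potentials of this kind are commonly derived (a generic direct Boltzmann inversion over an atomistic reference distribution; this is an assumption for illustration, not the parameterization workflow actually used in the thesis):

import numpy as np

KB = 0.0083145  # Boltzmann constant in kJ/(mol K)

def boltzmann_invert(samples, temperature=300.0, bins=100):
    """Return bin centres and the potential U(q) = -kT ln P(q) from sampled values q."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0                      # avoid log(0) in empty bins
    potential = -KB * temperature * np.log(hist[mask])
    potential -= potential.min()         # shift the minimum to zero
    return centres[mask], potential

# Example: invert a synthetic 1-5 distance distribution drawn from a Gaussian.
rng = np.random.default_rng(0)
q, u = boltzmann_invert(rng.normal(0.6, 0.05, 50000))

For angle distributions one would additionally divide the histogram by the appropriate Jacobian (e.g. sin θ) before taking the logarithm, and iterative refinement schemes can then adjust the inverted potentials until the CG simulation reproduces the atomistic reference distributions.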