871 results for Direct theorem
Abstract:
P. E. Parvanov - The uniform weighted approximation errors of the Goodman–Sharma operators are characterized for functions.
Abstract:
AMS classification: 41A36, 41A10, 41A25, 41A17.
Abstract:
We study the singular Bott-Chern classes introduced by Bismut, Gillet and Soulé. Singular Bott-Chern classes are the main ingredient to define direct images for closed immersions in arithmetic K-theory. In this paper we give an axiomatic definition of a theory of singular Bott-Chern classes, study their properties, and classify all possible theories of this kind. We identify the theory defined by Bismut, Gillet and Soulé as the only one that satisfies the additional condition of being homogeneous. We include a proof of the arithmetic Grothendieck-Riemann-Roch theorem for closed immersions that generalizes a result of Bismut, Gillet and Soulé and was already proved by Zha. This result can be combined with the arithmetic Grothendieck-Riemann-Roch theorem for submersions to extend this theorem to arbitrary projective morphisms. As a byproduct of this study we obtain two results of independent interest. First, we prove a Poincaré lemma for the complex of currents with fixed wave front set, and second we prove that certain direct images of Bott-Chern classes are closed.
Abstract:
We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform finally yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
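As a hedged illustration of the pipeline described above (1D FFT of the projections, central-slice placement on a polar Fourier grid, resampling to a Cartesian grid, inverse 2D FFT), here is a minimal NumPy/SciPy sketch. It is not the ITK implementation; the function name, the `griddata` cubic resampling, and the padding strategy are illustrative choices.

```python
# Minimal sketch of direct Fourier (central-slice) reconstruction for
# parallel-beam projections. Illustrative only -- not the ITK implementation.
import numpy as np
from scipy.interpolate import griddata

def direct_fourier_reconstruction(sinogram, angles_deg, pad_factor=2):
    """sinogram: (n_angles, n_detectors) array of parallel-beam projections."""
    n_angles, n_det = sinogram.shape
    n_pad = pad_factor * n_det                      # zero padding / oversampling
    lo = (n_pad - n_det) // 2
    sino_p = np.pad(sinogram, ((0, 0), (lo, n_pad - n_det - lo)))

    # 1D FFT of each padded projection -> one radial line of the 2D spectrum
    proj_ft = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(sino_p, axes=1), axis=1), axes=1)

    # Polar coordinates of the sampled Fourier values (central-slice theorem)
    freqs = np.fft.fftshift(np.fft.fftfreq(n_pad))
    thetas = np.deg2rad(angles_deg)
    kx = np.outer(np.cos(thetas), freqs).ravel()
    ky = np.outer(np.sin(thetas), freqs).ravel()

    # Resample the polar samples onto a Cartesian Fourier grid; real and
    # imaginary parts are interpolated separately (cf. the complex wrapper above)
    gx, gy = np.meshgrid(freqs, freqs)
    re = griddata((kx, ky), proj_ft.real.ravel(), (gx, gy),
                  method='cubic', fill_value=0.0)
    im = griddata((kx, ky), proj_ft.imag.ravel(), (gx, gy),
                  method='cubic', fill_value=0.0)

    # Inverse 2D FFT yields the reconstructed image (cropped to original size)
    img = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(re + 1j * im))).real
    return img[lo:lo + n_det, lo:lo + n_det]
```

In this sketch, increasing `pad_factor` and using higher-order resampling play the role of the zero padding/oversampling and cubic interpolation discussed in the abstract.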
Abstract:
The problem of symmetric stability is examined within the context of the direct Liapunov method. The sufficient conditions for stability derived by Fjørtoft are shown to imply finite-amplitude, normed stability. This finite-amplitude stability theorem is then used to obtain rigorous upper bounds on the saturation amplitude of disturbances to symmetrically unstable flows. By employing a virial functional, the necessary conditions for instability implied by the stability theorem are shown to be in fact sufficient for instability. The results of Ooyama are improved upon insofar as a tight two-sided (upper and lower) estimate is obtained of the growth rate of (modal or nonmodal) symmetric instabilities. The case of moist adiabatic systems is also considered.
Abstract:
Matita (Italian for "pencil") is a new interactive theorem prover under development at the University of Bologna. Compared with state-of-the-art proof assistants, Matita presents both traditional and innovative aspects. The underlying calculus of the system, the Calculus of (Co)Inductive Constructions (CIC for short), is well known and is also the basis of another mainstream proof assistant, Coq, with which Matita is to some extent compatible. In the same spirit as several other systems, proof authoring is conducted by the user as a goal-directed proof search, using a script to store textual commands for the system. In the tradition of LCF, the proof language of Matita is procedural and relies on tactics and tacticals to proceed toward proof completion. The interaction paradigm offered to the user is based on the script management technique that underlies the popularity of the Proof General generic interface for interactive theorem provers: while editing a script the user can move the execution point forward to deliver commands to the system, or backward to retract (or "undo") past commands. Matita has been developed from scratch over the past 8 years by several members of the Helm research group, of whom the author of this thesis is one. Matita is now a full-fledged proof assistant with a library of about 1,000 concepts. Several innovative solutions spun off from this development effort. This thesis is about the design and implementation of some of those solutions, in particular those relevant to user interaction with theorem provers and to which the author was a major contributor. Joint work with other members of the research group is pointed out where needed. The main topics discussed in this thesis are briefly summarized below.

Disambiguation. Most activities connected with interactive proving require the user to input mathematical formulae. Since mathematical notation is ambiguous, parsing formulae typeset the way mathematicians write them on paper is a challenging task, one neglected by several theorem provers, which usually prefer to fix an unambiguous input syntax. Exploiting features of the underlying calculus, Matita offers an efficient disambiguation engine that lets the user type formulae in familiar mathematical notation.

Step-by-step tacticals. Tacticals are higher-order constructs used in proof scripts to combine tactics. With tacticals, scripts can be made shorter, more readable, and more resilient to change. Unfortunately, they are de facto incompatible with state-of-the-art user interfaces based on script management: such interfaces do not allow the execution point to be positioned inside complex tacticals, introducing a trade-off between the usefulness of structured scripts and a tedious big-step execution behavior during script replaying. In Matita we break this trade-off with tinycals, an alternative to a subset of LCF tacticals that can be evaluated in a more fine-grained manner.

Extensible yet meaningful notation. Proof assistant users often need to create new mathematical notation to ease the use of new concepts. The framework used in Matita for extensible notation both accounts for high-quality two-dimensional rendering of formulae (with the expressivity of MathML Presentation) and provides meaningful notation, where presentational fragments are kept synchronized with the semantic representation of terms. With our approach, interoperability with other systems can be achieved at the content level, and direct manipulation of formulae acting on their rendered form is also possible.

Publish/subscribe hints. Automation plays an important role in interactive proving, as users like to delegate tedious proving sub-tasks to decision procedures or external reasoners. Exploiting the Web-friendliness of Matita, we experimented with a broker and a network of web services (called tutors) that can independently try to complete open sub-goals of the proof currently being authored in Matita. The user receives hints from the tutors on how to complete sub-goals and can apply them to the current proof, either interactively or automatically.

Another innovative aspect of Matita, only marginally touched upon in this thesis, is the embedded content-based search engine Whelp, which is exploited to various ends, from automatic theorem proving to avoiding duplicate work for the user. We also discuss the (potential) reusability in other systems of the widgets presented in this thesis, and how we envisage the evolution of user interfaces for interactive theorem provers in the Web 2.0 era.
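Since tacticals and their step-by-step evaluation are central to the discussion above, a small conceptual sketch may help: in LCF-style provers a tactic maps a goal to the list of subgoals it leaves open, and tacticals are higher-order functions that combine tactics. The Python below is purely illustrative (toy goals and tactics invented for this sketch), not Matita's implementation.

```python
# Conceptual sketch: tactics as functions from a goal to the subgoals they
# leave open, and tacticals as higher-order combinators over tactics.
from typing import Callable, List

Goal = str                                 # a goal, here just a plain string
Tactic = Callable[[Goal], List[Goal]]      # a tactic returns the open subgoals

def then_(t1: Tactic, t2: Tactic) -> Tactic:
    """Sequential composition: run t1, then run t2 on every subgoal it produces."""
    def tac(goal: Goal) -> List[Goal]:
        return [g2 for g1 in t1(goal) for g2 in t2(g1)]
    return tac

def orelse(t1: Tactic, t2: Tactic) -> Tactic:
    """Try t1; if it fails, fall back to t2."""
    def tac(goal: Goal) -> List[Goal]:
        try:
            return t1(goal)
        except Exception:
            return t2(goal)
    return tac

# Toy tactics for illustration only.
def split_conj(goal: Goal) -> List[Goal]:
    if " and " not in goal:
        raise ValueError("not a conjunction")
    left, right = goal.split(" and ", 1)
    return [left, right]

def assumption(goal: Goal) -> List[Goal]:
    return []  # pretend the goal is closed by a hypothesis

script = then_(orelse(split_conj, assumption), assumption)
print(script("A and B"))  # [] -- both subgoals closed
```

Under classic script management a compound tactical such as `then_(...)` executes as one indivisible step; the tinycals mentioned above address exactly this by allowing the execution point to stop inside such compound structures.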
Abstract:
The Direct Boundary Element Method (DBEM) is presented to solve the elastodynamic field equations in 2D, and a complete implementation is given. The DBEM is a useful approach for obtaining reliable numerical estimates of site effects on seismic ground motion due to irregular geological configurations, both of layering and of topography. The method is based on the discretization of the classical Somigliana elastodynamic representation equation, which stems from the reciprocity theorem. This equation is given in terms of the Green's function, which is the full-space harmonic steady-state fundamental solution. The formulation permits the treatment of viscoelastic media, so site models with intrinsic attenuation can be examined. By means of this approach, the 2D scattering of seismic waves due to the incidence of P and SV waves on irregular topographical profiles is calculated. Sites such as canyons, mountains, and valleys in irregular multilayered media are computed to test the technique. The obtained transfer functions show excellent agreement with previously published results.
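For context, the representation equation mentioned above, collocated on the boundary, takes the standard direct-BEM form sketched below; the notation is generic (frequency domain, with the traction-kernel integral understood in the Cauchy principal value sense) rather than the paper's own.

```latex
% Somigliana identity collocated on the boundary (generic notation):
% c_{ij} is the free-term coefficient (\delta_{ij}/2 on a smooth boundary),
% U^*_{ij} and T^*_{ij} are the full-space displacement and traction
% fundamental solutions (Green's functions), and u_j, t_j are the boundary
% displacements and tractions.
\[
  c_{ij}(\mathbf{x})\, u_j(\mathbf{x})
  + \int_{\Gamma} T^{*}_{ij}(\mathbf{x},\mathbf{y};\omega)\, u_j(\mathbf{y})\, \mathrm{d}\Gamma(\mathbf{y})
  = \int_{\Gamma} U^{*}_{ij}(\mathbf{x},\mathbf{y};\omega)\, t_j(\mathbf{y})\, \mathrm{d}\Gamma(\mathbf{y})
\]
```

Discretizing the boundary Γ into elements and collocating at the nodes turns this identity into the linear system that a direct BEM solves for the unknown boundary displacements and tractions.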
Abstract:
* This paper was supported in part by the Bulgarian Ministry of Education, Science and Technologies under contract MM-506/95.
Direct visualization of the action of Triton X-100 on giant vesicles of erythrocyte membrane lipids.
Abstract:
The raft hypothesis proposes that microdomains enriched in sphingolipids, cholesterol, and specific proteins are transiently formed to accomplish important cellular tasks. Equivocally, detergent-resistant membranes were initially assumed to be identical to membrane rafts, because of similarities between their compositions. In fact, the impact of detergents on membrane organization is still controversial. Here, we use phase contrast and fluorescence microscopy to observe giant unilamellar vesicles (GUVs) made of erythrocyte membrane lipids (erythro-GUVs) when exposed to the detergent Triton X-100 (TX-100). We clearly show that TX-100 has a restructuring action on biomembranes. Contact with TX-100 readily induces domain formation on the previously homogeneous membrane of erythro-GUVs at physiological and room temperatures. The shape and dynamics of the formed domains point to liquid-ordered/liquid-disordered (Lo/Ld) phase separation, typically found in raft-like ternary lipid mixtures. The Ld domains are then separated from the original vesicle and completely solubilized by TX-100. The remaining insoluble vesicle, in the Lo phase, represents around 2/3 of the original vesicle surface at room temperature and decreases to almost 1/2 at physiological temperature. This chain of events could be entirely reproduced with biomimetic GUVs of a simple ternary lipid mixture, 2:1:2 POPC/SM/chol (phosphatidylcholine/sphingomyelin/cholesterol), showing that this behavior arises from fundamental physicochemical properties of simple lipid mixtures. This work provides direct visualization of TX-100-induced domain formation followed by selective (Ld phase) solubilization in a model system with a complex biological lipid composition.
Abstract:
Excessive occlusal surface wear can result in occlusal disharmony, functional and esthetic impairment. As a therapeutic approach, conventional single crowns have been proposed, but this kind of treatment is complex, highly invasive and expensive. This case report describes the clinical outcomes of an alternative minimally invasive treatment based on direct adhesive-pin retained restorations. A 64-year-old woman with severely worn dentition, eating problems related to missing teeth and generalized tooth hypersensitivity was referred for treatment. Proper treatment planning based on the diagnostic wax-up simulation was used to guide the reconstruction of maxillary anterior teeth with direct composite resin over self-threading dentin pins. As the mandibular remaining teeth were extremely worn, a tooth-supported overdenture was installed. A stabilization splint was also used to protect the restorations. This treatment was a less expensive alternative to full-mouth rehabilitation with positive esthetic and functional outcomes after 1.5 years of follow-up.
Abstract:
Using a desorption/ionization technique, easy ambient sonic-spray ionization coupled to mass spectrometry (EASI-MS), documents related to the 2nd generation of Brazilian Real currency (R$) were screened in the positive ion mode for authenticity based on chemical profiles obtained directly from the banknote surface. Characteristic profiles were observed for authentic banknotes, seized suspect counterfeits, and homemade counterfeit banknotes from inkjet and laserjet printers. The chemicals on the authentic banknotes' surface were detected via a few minor sets of ions, namely from the plasticizers bis(2-ethylhexyl)phthalate (DEHP) and dibutyl phthalate (DBP), most likely related to the official offset printing process, and other common quaternary ammonium cations, presenting a chemical profile similar to that of 1st-generation R$. The seized suspect counterfeit banknotes, however, displayed abundant diagnostic ions in the m/z 400-800 range due to the presence of oligomers. High-accuracy FT-ICR MS analysis enabled molecular formula assignment for each ion. The ions were spaced 44 m/z apart, which enabled their characterization as Surfynol® 4XX (S4XX, XX=40, 65, and 85), wherein increasing XX values indicate increasing amounts of ethoxylation on a backbone of 2,4,7,9-tetramethyl-5-decyne-4,7-diol (Surfynol® 104). Sodiated triethylene glycol monobutyl ether (TBG) of m/z 229 (C10H22O4Na) was also identified in the seized counterfeit banknotes via EASI(+) FT-ICR MS. Surfynol® and TBG are constituents of inks used for inkjet printing.
Abstract:
X-ray fluorescence (XRF) is a fast, low-cost, nondestructive, and truly multielement analytical technique. The objectives of this study are to quantify the amounts of Na(+) and K(+) in samples of table salt (refined, marine, and light) and to compare three different quantification methodologies using XRF. A fundamental parameter method revealed difficulties in accurately quantifying lighter elements (Z < 22). A univariate methodology based on peak area calibration is an attractive alternative, even though additional steps of data manipulation might consume some time. Quantifications were performed with good correlations for both Na (r = 0.974) and K (r = 0.992). A partial least-squares (PLS) regression method with five latent variables was very fast. Na(+) quantifications provided calibration errors lower than 16% and a correlation of 0.995. Of great concern was the observation of high Na(+) levels in low-sodium salts. The presented application may be performed in a fast and multielement fashion, in accordance with Green Chemistry specifications.
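As an illustration of the multivariate option mentioned above, the following sketch fits a PLS model with five latent variables to XRF spectra using scikit-learn. The arrays are random placeholders, not the study's data; they only show how a calibration correlation and error would be computed.

```python
# Minimal sketch of PLS calibration for XRF spectra. `spectra` and `na_ref`
# are placeholder arrays standing in for measured spectra and Na+ reference
# concentrations; they are NOT the study's data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
spectra = rng.random((30, 512))      # placeholder XRF spectra (samples x channels)
na_ref = rng.random(30) * 40         # placeholder Na+ reference values

pls = PLSRegression(n_components=5)  # five latent variables, as in the abstract
predicted = cross_val_predict(pls, spectra, na_ref, cv=5).ravel()

r = np.corrcoef(na_ref, predicted)[0, 1]
rmse = np.sqrt(np.mean((na_ref - predicted) ** 2))
print(f"correlation r = {r:.3f}, RMSECV = {rmse:.2f}")
```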
Abstract:
OBJECTIVES: The purpose of this study was to assess the color change of three types of composite resins exposed to coffee and cola drink, and the effect of repolishing on the color stability of these composites after staining. MATERIALS AND METHODS: Fifteen specimens (15 mm diameter and 2 mm thick) were fabricated from microhybrid (Esthet-X; Dentsply and Filtek Z-250; 3M ESPE) and high-density hybrid (Surefil; Dentsply) composites, and were finished and polished with aluminum oxide discs (Sof-Lex; 3M ESPE). Color of the specimens was measured according to the CIE L*a*b* system in a reflection spectrophotometer (PCB 6807; BYK Gardner). After baseline color measurements, 5 specimens of each resin were immersed in different staining solutions for 15 days: G1 - distilled water (control), G2 - coffee, G3 - cola soft drink. Afterwards, a new color measurement was performed and the specimens were repolished and submitted to a new color reading. Color stability was determined by the difference (ΔE) between the coordinates L*, a*, and b* obtained from the specimens before and after immersion into the solutions and after repolishing. RESULTS: There was no statistically significant difference (ANOVA, Tukey's test; p>0.05) among the ΔE values for the different types of composites after staining or repolishing. For all composite resins, coffee promoted more color change (ΔE>3.3) than distilled water and the cola soft drink. After repolishing, the ΔE values of the specimens immersed in coffee decreased to clinically acceptable values (ΔE<3.3), but remained significantly higher than those of the other groups. CONCLUSIONS: No significant difference was found among composite resins or between color values before and after repolishing of specimens immersed in distilled water and cola. Immersing specimens in coffee caused greater color change in all types of composite resins tested in this study, and repolishing contributed to decreasing staining to clinically acceptable ΔE values.
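For reference, the color difference computed from the CIE L*a*b* coordinates above is presumably the standard CIE76 metric, consistent with the ΔE = 3.3 acceptability threshold used in the study:

```latex
% Assumed CIE76 color difference between readings taken before and after treatment
\[
  \Delta E^{*}_{ab} = \sqrt{(\Delta L^{*})^{2} + (\Delta a^{*})^{2} + (\Delta b^{*})^{2}}
\]
```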
Abstract:
The purpose of this study was to evaluate the flexural strength of a direct composite, intended for indirect application, that received heat treatment with or without investment embedding. One indirect composite was used for comparison. For determination of the heat treatment temperature, thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC) were performed, considering the initial weight loss temperature and the glass transition temperature (Tg). Then, after photoactivation (600 mW/cm² - 40 s), the specimens (10 x 2 x 2 mm) were heat-treated under the following conditions: 170°C for 5, 10, or 15 min, embedded or not embedded in investment. Flexural strength was assessed as a means to evaluate the influence of the different heat treatment periods and of investment embedding on mechanical properties. The data were analyzed by ANOVA and Tukey's test (α = 0.05). TGA showed an initial weight loss temperature of 180°C and DSC showed a Tg value of 157°C. Heat treatment was conducted in an oven (Flli Manfredi, Italy), after 37°C storage for 48 h. Flexural strength was evaluated after 120 h of storage at 37°C. The results showed that the different periods and investment embedding yielded statistically similar values. Nevertheless, the heat-treated direct composite resin presented higher values (178.7 MPa) than the indirect composite resin (146.0 MPa) and the same direct composite submitted to photoactivation only (151.7 MPa). Within the limitations of this study, it could be concluded that the heat treatment increased the flexural strength of the direct composite studied, leading to higher mechanical strength compared with the indirect composite.
Abstract:
The objective of this study was to evaluate the flexural strength (σf) and hardness (H) of direct and indirect composites, testing the hypotheses that direct resin composites produce higher σf and H values than indirect composites and that these properties are positively related. Ten bar-shaped specimens (25 mm x 2 mm x 2 mm) were fabricated for each of the direct [D250 - Filtek Z250 (3M-Espe) and D350 - Filtek Z350 (3M-Espe)] and indirect [ISin - Sinfony (3M-Espe) and IVM - VitaVM LC (Vita Zahnfabrik)] materials, according to the manufacturers' instructions and ISO 4049 specifications. The σf was tested in three-point bending using a universal testing machine (EMIC DL 2000) at a crosshead speed of 0.5 mm/min (ISO 4049). Knoop hardness (H) was measured on the specimen fragments resulting from the σf test and calculated as H = 14.2P/l², where P is the applied load (0.1 kg; dwell time = 15 s) and l is the longest diagonal of the diamond-shaped indent (ASTM E384). The data were statistically analyzed using ANOVA and Tukey's tests (α = 0.05). The mean σf and standard deviation values (MPa) and statistical grouping were: D250 - 135.4 ± 17.6a; D350 - 123.7 ± 11.1b; ISin - 98.4 ± 6.4c; IVM - 73.1 ± 4.9d. The mean H and standard deviation values (kg/mm²) and statistical grouping were: D250 - 98.12 ± 1.8a; D350 - 86.5 ± 1.9b; ISin - 28.3 ± 0.9c; IVM - 30.8 ± 1.0c. The direct composite systems examined produced higher mean σf and H values than the indirect composites, and the mean values of these properties were positively correlated (r = 0.91), confirming the study hypotheses.
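A brief sketch of how the two reported quantities are typically computed from raw measurements: flexural strength from the standard three-point bending relation σ = 3FL/(2bh²), and Knoop hardness from the formula quoted above (H = 14.2P/l²). The numeric inputs below (failure load, support span, indent diagonal) are placeholders, not the study's data.

```python
# Sketch of the two calculations reported above; inputs are placeholders.

def flexural_strength_mpa(load_n: float, span_mm: float,
                          width_mm: float, height_mm: float) -> float:
    """Standard three-point bending relation: sigma = 3 F L / (2 b h^2),
    with F in N and dimensions in mm, giving MPa."""
    return 3 * load_n * span_mm / (2 * width_mm * height_mm ** 2)

def knoop_hardness(load_kgf: float, diagonal_mm: float) -> float:
    """Knoop hardness as quoted in the abstract: H = 14.2 P / l^2,
    with P in kgf and the long indent diagonal l in mm, giving kg/mm^2."""
    return 14.2 * load_kgf / diagonal_mm ** 2

# Placeholder example: a 2 x 2 mm bar on a typical 20 mm span failing at 32 N,
# and a 0.1 kgf Knoop indent with a 0.126 mm long diagonal.
print(flexural_strength_mpa(32.0, 20.0, 2.0, 2.0))  # 120.0 MPa
print(knoop_hardness(0.1, 0.126))                   # ~89.4 kg/mm^2
```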