966 results for "Observational techniques and algorithms"


Relevance: 100.00%

Abstract:

Purpose: The purpose of this in vitro study was to compare the dimensional accuracy of a stone index and of 3 impression techniques (tapered impression copings, squared impression copings, and squared impression copings splinted with acrylic resin) combined with 3 pouring techniques (conventional, pouring using latex tubes fitted onto analogs, and pouring after joining the analogs with acrylic resin) for implant-supported prostheses. Materials and Methods: A mandibular brass cast with 4 stainless steel implant-abutment analogs, a framework, and 2 aluminum custom trays were fabricated. Polyether impression material was used for all impressions. Ten groups were formed (a control group and 9 test groups formed by combining each pouring technique with each impression technique). Five casts were made per group, for a total of 50 casts and 200 gap values (1 gap value for each implant-abutment analog). Results: The mean gap value with the index technique was 27.07 μm. With the conventional pouring technique, the mean gap values were 116.97 μm for the tapered group, 57.84 μm for the squared group, and 73.17 μm for the squared splinted group. With pouring using latex tubes, the mean gap values were 65.69 μm for the tapered group, 38.03 μm for the squared group, and 82.47 μm for the squared splinted group. With pouring after joining the analogs with acrylic resin, the mean gap values were 141.12 μm for the tapered group, 74.19 μm for the squared group, and 104.67 μm for the squared splinted group. No significant difference was detected among the index technique, the squared/latex-tube technique, and the master cast (P > .05). Conclusions: The most accurate impression technique used squared copings. The most accurate pouring technique for impressions made with tapered or squared copings used latex tubes. The pouring technique did not influence the accuracy of the stone casts when splinted squared impression copings were used. Either the index technique or squared copings combined with the latex-tube pouring technique is a preferred method for making implant-supported fixed restorations with dimensional accuracy.

Relevance: 100.00%

Abstract:

This article introduces the software program EthoSeq, which is designed to extract probabilistic behavioral sequences (tree-generated sequences, or TGSs) from observational data and to prepare a TGS-species matrix for phylogenetic analysis. The program uses graph theory algorithms to automatically detect behavioral patterns within the observational sessions. It includes filtering tools to adjust the search procedure to user-specified statistical needs. Preliminary analyses of data sets, such as grooming sequences in birds and foraging tactics in spiders, uncover a large number of TGSs which together yield single phylogenetic trees. An example of the use of the program is our analysis of felid grooming sequences, in which we obtained 1,386 felid grooming TGSs for seven species, resulting in a single phylogeny. These results show that behavior is definitely useful in phylogenetic analysis. EthoSeq simplifies and automates such analyses, uncovers many of the hidden patterns in long behavioral sequences, and prepares the data for further analysis with standard phylogenetic programs. We hope it will encourage many empirical studies on the evolution of behavior.
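
To illustrate the kind of processing EthoSeq automates, the following is a minimal sketch (not the actual EthoSeq code): it assumes each behavioral session is recorded as an ordered list of acts, builds a directed transition graph per species, and assembles a simple species-by-sequence presence/absence matrix of the kind a TGS-species matrix represents. All names and data are hypothetical.

```python
from collections import defaultdict
from itertools import tee

# Hypothetical toy data: each species maps to one observed behavioral session,
# recorded as an ordered list of acts (not EthoSeq's real input format).
sessions = {
    "Felis_catus":  ["lick_paw", "wipe_face", "lick_flank", "lick_paw", "wipe_face"],
    "Panthera_leo": ["lick_paw", "lick_flank", "scratch", "lick_paw", "lick_flank"],
}

def pairwise(seq):
    a, b = tee(seq)
    next(b, None)
    return zip(a, b)

# Directed transition graph: edge (u, v) counts how often act v follows act u.
def transition_graph(acts):
    graph = defaultdict(int)
    for u, v in pairwise(acts):
        graph[(u, v)] += 1
    return graph

# Candidate two-act "sequences" (a stand-in for tree-generated sequences).
def candidate_sequences(graph, min_count=1):
    return {f"{u}->{v}" for (u, v), n in graph.items() if n >= min_count}

# Species-by-sequence presence/absence matrix, analogous to a TGS-species matrix.
all_seqs = sorted(set().union(*(candidate_sequences(transition_graph(s))
                                for s in sessions.values())))
matrix = {
    sp: [int(seq in candidate_sequences(transition_graph(acts))) for seq in all_seqs]
    for sp, acts in sessions.items()
}

for sp, row in matrix.items():
    print(sp, row)
```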

Relevance: 100.00%

Abstract:

This work aims to implement an intelligent computational tool to identify non-technical losses and to select their most relevant features, using information from a database of industrial consumer profiles of a power company. The solution to this problem is not trivial, nor is it of merely regional interest: minimizing non-technical losses helps guarantee investments in product quality and in the maintenance of power systems, a concern introduced by the competitive environment that followed the privatization period on the national scene. This work applies the WEKA software to the proposed objective, comparing various classification techniques and optimizing them through intelligent algorithms; in this way, applications on Smart Grids can be automated. © 2012 IEEE.
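
As an illustration of the workflow described above, here is a minimal sketch in Python with scikit-learn rather than WEKA (which the paper actually used): it compares several classifiers with a simple feature-selection step, assuming a feature table X of consumer profiles and a binary label y marking suspected non-technical losses. The data here are synthetic.

```python
# Minimal sketch (not the paper's WEKA workflow): compare classifiers and select
# features for non-technical loss detection.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the utility's industrial-consumer database.
X, y = make_classification(n_samples=500, n_features=20, n_informative=6, random_state=0)

classifiers = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "knn": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    # Keep only the k most relevant features, then train the classifier.
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=8), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```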

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Objective. The general aim of this article is to describe the state of the art of biocompatibility testing for dental materials, and to present new strategies for improving operative dentistry techniques and the biocompatibility of dental materials as they relate to their interaction with the dentin-pulp complex. Methods. The literature was reviewed, focusing on articles related to biocompatibility testing, the dentin-pulp complex, and new strategies and materials for operative dentistry. For this purpose, the PubMed database was searched, covering 118 articles published in English from 1939 to 2014. Data concerning the types of biological tests and the standardization of in vitro and in vivo protocols employed to evaluate the cytotoxicity and biocompatibility of dental materials were also obtained from the US Food and Drug Administration (FDA), the International Standards Organization (ISO), and the American National Standards Institute (ANSI). Results. While there is an ongoing search for feasible molecular strategies to direct the repair or regeneration of the structures that form the oral tissues, professionals must master the clinical therapies available at present. In turn, these techniques must be applied based on knowledge of the morphological and physiological characteristics of the tissues involved, as well as the physical, mechanical, and biologic properties of the biomaterials recommended for each specific situation. Thus, particularly within modern esthetic restorative dentistry, the use of minimally invasive operative techniques, combined with dental materials that have excellent properties proven scientifically by means of clinical and laboratory studies, must be routine for dentists. This professional and responsible attitude will certainly result in a greater possibility of achieving clinical success, benefiting patients and dentists themselves. Significance. This article provides a general and critical view of the relations that permeate the interaction between dental materials and the dentin-pulp complex, and establishes real possibilities and strategies that favor the biocompatibility of present and new products used in dentistry, which will certainly benefit clinicians and their patients. (C) 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

Relevance: 100.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 100.00%

Abstract:

Evolutionary algorithms have been widely used for Artificial Neural Network (ANN) training, the idea being to update the neurons' weights using the social dynamics of living organisms in order to decrease the classification error. In this paper, we introduce Social-Spider Optimization (SSO) to improve the training phase of ANNs with multilayer perceptrons, and we validate the proposed approach in the context of Parkinson's Disease recognition. The experimental section compares the proposed approach against five other well-known meta-heuristic techniques and shows that SSO can be a suitable approach for the ANN-MLP training step.
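
The following is a minimal sketch of the general idea (metaheuristic training of an MLP's weights); it uses a plain population-based search around the best candidate rather than the actual Social-Spider operators, and a toy two-class dataset, so it should be read as an illustration of the training loop, not the paper's method.

```python
# Minimal sketch of metaheuristic ANN-MLP training (not the paper's SSO operators):
# a population of candidate weight vectors is evolved to minimize classification error.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class problem: 2 inputs -> 4 hidden units -> 1 output.
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)           # XOR-like labels
n_in, n_hid = 2, 4
n_weights = n_in * n_hid + n_hid + n_hid + 1         # W1, b1, W2, b2

def unpack(w):
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid]; i += n_hid
    b2 = w[i]
    return W1, b1, W2, b2

def error(w):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
    return np.mean((p > 0.5) != y)                    # classification error

# Population-based search: keep the best candidate, perturb it to make offspring.
pop = rng.normal(size=(30, n_weights))
for generation in range(200):
    errs = np.array([error(w) for w in pop])
    best = pop[errs.argmin()]
    pop = best + 0.3 * rng.normal(size=pop.shape)     # explore around the best
    pop[0] = best                                     # elitism: keep the best as-is

print("final classification error:", error(best))
```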

Relevance: 100.00%

Abstract:

γ Cas is the prototypical classical Be star and has recently become best known for its variable hard X-ray emission. To elucidate the reasons for this emission, we mounted a multiwavelength campaign in 2010 centered around four XMM-Newton observations. The observational techniques included long-baseline optical interferometry (LBOI) with two instruments at CHARA, photometry carried out by an automated photometric telescope, and Hα observations. Because γ Cas is also known to be in a binary, we measured radial velocities from the Hα line and redetermined its period as 203.55 ± 0.20 days and its eccentricity as near zero. The LBOI observations suggest that the star's decretion disk was axisymmetric in 2010, had a system inclination angle near 45 degrees, and had a larger radius than previously reported. In addition, the Be star began an "outburst" at the beginning of our campaign, made visible by a brightening and reddening of the disk during our campaign and beyond. Our analyses of the new high-resolution spectra disclosed many attributes also found in spectra obtained in 2001 (Chandra) and 2004 (XMM-Newton). As well as a dominant hot (≈14 keV) thermal component, the familiar attributes included: (i) a fluorescent Fe K feature even stronger than observed at previous times; (ii) strong N VII and Ne XI lines indicative of overabundances; and (iii) a subsolar Fe abundance from K-shell lines but a solar abundance from L-shell ions. We also found that two absorption columns are required to fit the continuum. While the first one maintained its historical average of 1 × 10^21 cm^-2, the second was very large and doubled to 7.4 × 10^23 cm^-2 during our X-ray observations. Although we found no clear relation between this column density and orbital phase, it correlates well with the disk brightening and reddening both in the 2010 and earlier observations. Thus, the inference from this study is that much (perhaps all?) of the X-ray emission from this source originates behind matter ejected by γ Cas into our line of sight.
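
As an illustration of one step in such a campaign, the sketch below recovers an orbital period from radial-velocity measurements with a Lomb-Scargle periodogram (astropy). The data are synthetic and the code is not the authors' pipeline; the 203.55 d period quoted above comes from their analysis.

```python
# Minimal sketch: recover an orbital period from radial-velocity measurements,
# assuming arrays of observation times (days) and H-alpha radial velocities (km/s).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(1)

# Synthetic stand-in for the RV table: a circular (e ~ 0) orbit near 203.55 d.
true_period = 203.55
t = np.sort(rng.uniform(0, 1500, 80))                      # observation epochs [d]
rv = 4.3 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 0.5, t.size)

frequency, power = LombScargle(t, rv).autopower(
    minimum_frequency=1 / 500.0, maximum_frequency=1 / 50.0)
best_period = 1 / frequency[np.argmax(power)]
print(f"best-fit period: {best_period:.2f} d")             # ~203-204 d expected
```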

Relevance: 100.00%

Abstract:

Adequate polymerization plays an important role in the longevity of composite resin restorations. Objectives: The aim of this study was to evaluate the effect of light-curing units, curing-mode techniques, and storage media on the sorption, solubility, and biaxial flexural strength (BFS) of a composite resin. Material and Methods: Two hundred and forty specimens were made of one composite resin (Esthet-X) in a stainless steel mold (2 mm thick × 8 mm in diameter) and divided into 24 groups (n=10) established according to the 4 study factors: light-curing units: quartz tungsten halogen (QTH) lamp and light-emitting diodes (LED); energy densities: 16 J/cm² and 20 J/cm²; curing modes: conventional (CM) and pulse-delay (PD); and permeants: deionized water and 75% ethanol for 28 days. Sorption and solubility tests were performed according to ISO 4049:2000 specifications. All specimens were then tested for BFS according to the ASTM F394-78 specification. Data were analyzed by three-way ANOVA followed by Tukey, Kruskal-Wallis, and Mann-Whitney tests (alpha=0.05). Results: In general, no significant differences were found in the sorption, solubility, or BFS means for the light-curing units and curing modes (p>0.05). Only the LED unit using 16 J/cm² and PD with 10 s produced higher sorption and solubility values than QTH. Conversely, using CM (16 J/cm²), LED produced lower BFS values than QTH (p<0.05). The 75% ethanol permeant produced higher sorption and solubility values and lower BFS values than water (p<0.05). Conclusion: The ethanol storage medium produced more damage to the composite resin than water. In general, the LED and QTH curing units using 16 and 20 J/cm² with the CM and PD curing modes had no influence on the sorption, solubility, or BFS of the tested resin.
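
For reference, the sketch below shows the sorption/solubility bookkeeping commonly used with ISO 4049-style tests (the conditioning and drying cycles prescribed by the standard are not reproduced). The mass values are hypothetical and the disc dimensions are taken from the specimen size above.

```python
# Minimal sketch of the sorption/solubility arithmetic commonly used with
# ISO 4049-style tests: m1 = conditioned (dry) mass, m2 = mass after immersion,
# m3 = reconditioned mass, all in micrograms; the disc volume is computed from
# its diameter and thickness.
import math

def sorption_solubility(m1_ug, m2_ug, m3_ug, diameter_mm=8.0, thickness_mm=2.0):
    volume_mm3 = math.pi * (diameter_mm / 2.0) ** 2 * thickness_mm
    sorption = (m2_ug - m3_ug) / volume_mm3     # water taken up per unit volume
    solubility = (m1_ug - m3_ug) / volume_mm3   # mass lost to the medium per unit volume
    return sorption, solubility                 # both in ug/mm^3

# Hypothetical masses for one specimen (illustrative numbers only).
wsp, wsl = sorption_solubility(m1_ug=250_000.0, m2_ug=252_200.0, m3_ug=249_800.0)
print(f"sorption = {wsp:.1f} ug/mm^3, solubility = {wsl:.1f} ug/mm^3")
```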

Relevance: 100.00%

Abstract:

In this Thesis, we investigate the cosmological co-evolution of supermassive black holes (BHs), Active Galactic Nuclei (AGN), and their hosting dark matter (DM) halos and galaxies, within the standard CDM scenario. We analyze analytic, semi-analytic, and hybrid techniques and use the most recent observational data available to constrain the assumptions underlying our models. First, we focus on very simple analytic models where the assembly of BHs is directly related to the merger history of DM haloes. For this purpose, we implement the two original analytic models of Wyithe & Loeb 2002 and Wyithe & Loeb 2003, compare their predictions to the AGN luminosity function and clustering data, and discuss possible modifications to the models that improve the match to the observations. We then study more sophisticated semi-analytic models in which, however, the baryonic physics is still neglected. Finally, we improve the hybrid simulation of De Lucia & Blaizot 2007, adding new semi-analytical prescriptions to describe the BH mass accretion rate during each merger event and its conversion into radiation, and compare the derived BH scaling relations, fundamental plane and mass function, and the AGN luminosity function with observations. All our results support the following scenario:
• The cosmological co-evolution of BHs, AGN and galaxies can be well described within the CDM model.
• At redshifts z ≳ 1, the evolution history of the DM haloes fully determines the overall properties of the BH and AGN populations. The AGN emission is triggered mainly by DM halo major mergers and, on average, AGN shine at their Eddington luminosity.
• At redshifts z ≲ 1, BH growth decouples from halo growth. Galaxy major mergers cannot constitute the only trigger of accretion episodes in this phase.
• When a static hot halo has formed around a galaxy, a fraction of the hot gas continuously accretes onto the central BH, causing low-energy "radio" activity at the galactic centre, which prevents significant gas cooling, thus limiting the mass of the central galaxies and quenching star formation at late times.
• The cold gas fraction accreted by BHs at high redshifts seems to be larger than at low redshifts.
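
As a small worked example of the accretion physics referred to above (Eddington-limited growth), the following sketch computes the Eddington luminosity and the Salpeter e-folding growth of a BH, assuming a radiative efficiency of 0.1; it is illustrative only and not code from the thesis.

```python
# Minimal sketch: Eddington luminosity and Eddington-limited black-hole growth,
# assuming a radiative efficiency eps = 0.1.
import numpy as np

L_EDD_PER_MSUN = 1.26e38      # erg/s per solar mass
T_EDD_YR = 0.45e9             # yr, sigma_T c / (4 pi G m_p)

def eddington_luminosity(m_bh_msun):
    """L_Edd ~ 1.26e38 (M_BH / M_sun) erg/s."""
    return L_EDD_PER_MSUN * m_bh_msun

def eddington_growth(m0_msun, t_yr, eps=0.1):
    """Mass after accreting at the Eddington limit for t_yr years.

    dM/dt = (1 - eps)/eps * M / t_Edd  =>  exponential growth with the
    Salpeter e-folding time t_Sal = eps/(1 - eps) * t_Edd (~50 Myr for eps = 0.1).
    """
    t_salpeter = eps / (1.0 - eps) * T_EDD_YR
    return m0_msun * np.exp(t_yr / t_salpeter)

# A 1e5 M_sun seed growing at the Eddington rate for 500 Myr:
m_final = eddington_growth(1e5, 5e8)
print(f"final mass ~ {m_final:.2e} M_sun, L_Edd ~ {eddington_luminosity(m_final):.2e} erg/s")
```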

Relevance: 100.00%

Abstract:

In this thesis, the use of wide-field imaging techniques and VLBI observations with a limited number of antennas is explored. I present techniques to efficiently and accurately image extremely large UV datasets. Very large VLBI datasets must be reduced into multiple, smaller datasets if today's imaging algorithms are to be used to image them. I present a procedure for accurately shifting the phase centre of a visibility dataset. This procedure has been thoroughly tested and found to be almost two orders of magnitude more accurate than existing techniques. Errors have been found at the level of one part in 1.1 million; these are unlikely to be measurable except in the very largest UV datasets. Results of a four-station VLBI observation of a field containing multiple sources are presented. A 13-gigapixel image was constructed to search for sources across the entire primary beam of the array by generating over 700 smaller UV datasets. The source 1320+299A was detected and its astrometric position with respect to the calibrator J1329+3154 is presented. Various techniques for phase calibration and imaging across this field are explored, including using the detected source as an in-beam calibrator and peeling distant confusing sources from VLBI visibility datasets. A range of issues pertaining to wide-field VLBI has been explored, including: parameterising the wide-field performance of VLBI arrays; estimating the sensitivity across the primary beam for both homogeneous and heterogeneous arrays; applying techniques such as mosaicing and primary beam correction to VLBI observations; quantifying the effects of time-average and bandwidth smearing; and calibration and imaging of wide-field VLBI datasets. The performance of a computer cluster at the Istituto di Radioastronomia in Bologna has been characterised with regard to its ability to correlate using the DiFX software correlator. Using existing software, it was possible to characterise the network speed, particularly for MPI applications. The capabilities of the DiFX software correlator, running on this cluster, were measured for a range of observation parameters and were shown to be commensurate with the generic performance parameters measured. The feasibility of an Italian VLBI array has been explored, with discussion of the infrastructure required, the performance of such an array, possible collaborations, and the science that could be achieved. Results from a 22 GHz calibrator survey are also presented: 21 out of 33 sources were detected on a single baseline between two Italian antennas (Medicina to Noto). The results and discussions presented in this thesis suggest that wide-field VLBI is a technique whose time has finally come. Prospects for exciting new science are discussed in the final chapter.
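
The phase-centre shifting procedure mentioned above is, in essence, a per-visibility phase rotation. The sketch below shows the standard form of that rotation for an offset (l, m) in direction cosines; it is a simplified illustration under an assumed sign convention, not the thesis implementation.

```python
# Minimal sketch: shift the phase centre of a set of visibilities to a new
# direction offset (l, m) in direction cosines, using
# V'(u,v,w) = V(u,v,w) * exp(-2*pi*i * (u*l + v*m + w*(sqrt(1 - l^2 - m^2) - 1))).
import numpy as np

def shift_phase_centre(vis, u, v, w, l, m):
    """vis, u, v, w: arrays per visibility (u, v, w in wavelengths); l, m: offset."""
    n_minus_1 = np.sqrt(1.0 - l**2 - m**2) - 1.0
    phase = -2.0j * np.pi * (u * l + v * m + w * n_minus_1)
    return vis * np.exp(phase)

# Tiny example: a point source at offset (l0, m0) becomes a constant (real)
# visibility once the phase centre is shifted onto it.
rng = np.random.default_rng(0)
u, v, w = (rng.uniform(-1e6, 1e6, 100) for _ in range(3))    # wavelengths
l0, m0 = 1e-4, -5e-5                                          # ~20", ~-10" offsets
vis = np.exp(2.0j * np.pi * (u * l0 + v * m0 + w * (np.sqrt(1 - l0**2 - m0**2) - 1)))
shifted = shift_phase_centre(vis, u, v, w, l0, m0)
print(np.allclose(shifted, 1.0))   # True: source now at the phase centre
```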

Relevance: 100.00%

Abstract:

In this thesis we present some combinatorial optimization problems and suggest models and algorithms for their effective solution. For each problem, we give its description, followed by a short literature review, provide methods to solve it, and, finally, present computational results and comparisons with previous works to show the effectiveness of the proposed approaches. The considered problems are: the Generalized Traveling Salesman Problem (GTSP), the Bin Packing Problem with Conflicts (BPPC), and the Fair Layout Problem (FLOP).

Relevance: 100.00%

Abstract:

Ground-based Earth troposphere calibration systems play an important role in planetary exploration, especially in radio science experiments aimed at estimating planetary gravity fields. In these experiments, the main observable is the spacecraft (S/C) range rate, measured from the Doppler shift of an electromagnetic wave transmitted from the ground, received by the spacecraft, and coherently retransmitted back to the ground. Once solar corona and interplanetary plasma noise has been removed from the Doppler data, the Earth troposphere remains one of the main error sources in the tracking observables. Current Earth media calibration systems at NASA's Deep Space Network (DSN) stations are based upon a combination of weather data and multidirectional, dual-frequency GPS measurements acquired at each station complex. In order to support Cassini's cruise radio science experiments, a new generation of media calibration systems was developed, driven by the need to achieve an end-to-end Allan deviation of the radio link on the order of 3×10^-15 at 1000 s integration time. ESA's future BepiColombo mission to Mercury carries scientific instrumentation for radio science experiments (a Ka-band transponder and a three-axis accelerometer) which, in combination with the S/C telecommunication system (an X/X/Ka transponder), will provide the most advanced tracking system ever flown on an interplanetary probe. The current error budget for MORE (Mercury Orbiter Radioscience Experiment) allows the residual uncalibrated troposphere to contribute 8×10^-15 to the two-way Allan deviation at 1000 s integration time. The current standard ESA/ESTRACK calibration system is based on a combination of surface meteorological measurements and mathematical algorithms capable of reconstructing the Earth troposphere path delay, leaving an uncalibrated component of about 1-2% of the total delay. In order to satisfy the stringent MORE requirements, the short-timescale variations of the Earth troposphere water vapor content must be calibrated at the ESA deep space antennas (DSA) with more precise and stable instruments (microwave radiometers). In parallel with these high-performance instruments, ESA ground stations should be upgraded to media calibration systems at least capable of calibrating both troposphere path delay components (dry and wet) at the sub-centimetre level, in order to reduce S/C navigation uncertainties. The natural choice is to provide continuous troposphere calibration by processing GNSS data acquired at each complex by the dual-frequency receivers already installed for station location purposes. The work presented here outlines the troposphere calibration technique developed to support both deep space probe navigation and radio science experiments. After an introduction to deep space tracking techniques, observables, and error sources, Chapter 2 investigates the troposphere path delay in detail, reporting the estimation techniques and the state of the art of the ESA and NASA troposphere calibrations. Chapter 3 deals with an analysis of the status and performance of the NASA Advanced Media Calibration (AMC) system as applied to the Cassini data analysis. Chapter 4 describes the current release of the GNSS software (S/W) developed to estimate the troposphere calibration for ESA S/C navigation purposes. During the development phase of the S/W, a test campaign was undertaken in order to evaluate its performance. A description of the campaign and the main results are reported in Chapter 5. Chapter 6 presents a preliminary analysis of the microwave radiometers to be used to support radio science experiments. The analysis was carried out considering radiometric measurements from the ESA/ESTEC instruments installed in Cabauw (NL) and compared with the requirements of MORE. Finally, Chapter 7 summarizes the results obtained and defines some key technical aspects to be evaluated and taken into account in the development phase of future instrumentation.
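
As an illustration of the surface-meteorology calibration mentioned above, the sketch below evaluates Saastamoinen-type zenith hydrostatic and wet delays from pressure, temperature, and humidity; the constants are the commonly quoted ones and the example values are hypothetical, so this is not the thesis S/W.

```python
# Minimal sketch: zenith troposphere path delays from surface meteorology, using
# Saastamoinen-type formulas often quoted in the literature. Inputs: pressure P [hPa],
# temperature T [K], water-vapour partial pressure e [hPa], latitude [deg], height [km].
import math

def zenith_hydrostatic_delay_m(p_hpa, lat_deg, height_km):
    """Dry (hydrostatic) zenith delay in metres."""
    lat = math.radians(lat_deg)
    return 0.0022768 * p_hpa / (1.0 - 0.00266 * math.cos(2 * lat) - 0.00028 * height_km)

def zenith_wet_delay_m(t_k, e_hpa):
    """Rough wet zenith delay in metres (this is the poorly modelled component)."""
    return 0.002277 * (1255.0 / t_k + 0.05) * e_hpa

# Example: a mid-latitude station near sea level (hypothetical weather values).
zhd = zenith_hydrostatic_delay_m(p_hpa=1013.25, lat_deg=40.0, height_km=0.05)
zwd = zenith_wet_delay_m(t_k=290.0, e_hpa=12.0)
print(f"ZHD ~ {zhd:.3f} m, ZWD ~ {zwd:.3f} m, total ~ {zhd + zwd:.3f} m")
```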

Relevance: 100.00%

Abstract:

Combinatorial Optimization is becoming ever more crucial these days. From the natural sciences to economics, passing through the administration of urban centers and personnel management, methodologies and algorithms with a strong theoretical background and consolidated real-world effectiveness are more and more in demand, in order to quickly find good solutions to complex strategic problems. Resource optimization is nowadays a fundamental ground on which successful projects are built. From the theoretical point of view, Combinatorial Optimization rests on stable and strong foundations that allow researchers to face ever more challenging problems. However, from the application point of view, it seems that the pace of theoretical development cannot keep up with that enjoyed by modern hardware technologies, especially with reference to the processor industry. In this work we propose new parallel algorithms, designed to exploit the new parallel architectures available on the market. We found that, by exposing the inherent parallelism of some resolution techniques (such as Dynamic Programming), the computational benefits are remarkable, lowering execution times by more than an order of magnitude and allowing instances of dimensions not tractable before to be addressed. We approached four notable Combinatorial Optimization problems: the Packing Problem, the Vehicle Routing Problem, the Single Source Shortest Path Problem, and a Network Design problem. For each of these problems we propose a collection of effective parallel solution algorithms, either for solving the full problem (Guillotine Cuts and SSSPP) or for enhancing a fundamental part of the solution method (VRP and ND). We support our claims by presenting computational results for all problems, either on standard benchmarks from the literature or, when possible, on data from real-world applications, where speed-ups of one order of magnitude are usually attained, not uncommonly scaling up to factors of 40x.
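
As a toy illustration of exposing the parallelism inherent in Dynamic Programming, the sketch below parallelizes each stage of a 0/1-knapsack DP: within a stage, every capacity cell depends only on the previous stage, so the cells are independent. A thread pool is used only to show the structure; the thesis targets real multicore/GPU architectures, not this toy setup.

```python
# Minimal sketch (not the thesis code): within one stage of a 0/1-knapsack dynamic
# program, every capacity cell depends only on the previous stage, so the cells of a
# stage can be computed in parallel. ThreadPoolExecutor only illustrates the structure;
# real speed-ups need processes, SIMD, or GPUs.
from concurrent.futures import ThreadPoolExecutor

def knapsack_parallel_dp(items, capacity, workers=4):
    """items: list of (weight, value); returns the optimal value."""
    dp = [0] * (capacity + 1)                        # stage 0: no items considered
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for weight, value in items:                  # one DP stage per item
            def cell(c, dp=dp, w=weight, v=value):   # each cell reads only the old dp
                return dp[c] if c < w else max(dp[c], dp[c - w] + v)
            dp = list(pool.map(cell, range(capacity + 1)))
    return dp[capacity]

items = [(3, 25), (4, 30), (2, 15), (5, 45), (1, 10)]
print(knapsack_parallel_dp(items, capacity=8))       # expected: 70 (e.g. weights 5 + 3)
```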

Relevance: 100.00%

Abstract:

Logistics involves planning, managing, and organizing the flows of goods from the point of origin to the point of destination in order to meet certain requirements. Logistics and transportation aspects are very important and represent a relevant cost not only for producing and shipping companies, but also for public administrations and private citizens. The optimization of resources and the improvement of the organization of operations are crucial for all branches of logistics, from operations management to transportation. As we will have the chance to see in this work, optimization techniques, models, and algorithms are important methods for solving the ever new and more complex problems arising in different segments of logistics. Many operations management and transportation problems are related to the class of optimization problems called Vehicle Routing Problems (VRPs). In this work, we consider several real-world deterministic and stochastic problems that belong to the wide class of VRPs, and we solve them by means of exact and heuristic methods. We treat three classes of real-world routing and logistics problems. We deal with one of the most important tactical problems that arises in the management of bike sharing systems, namely the Bike sharing Rebalancing Problem (BRP). We propose models and algorithms for real-world earthwork optimization problems. We describe the 3DP (3D printing) process and highlight several optimization issues in 3DP. Among these, we define the problem related to tool path definition in the 3DP process, the 3D Routing Problem (3DRP), which is a generalization of the arc routing problem. We present an ILP model and several heuristic algorithms to solve the 3DRP.
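
To give a concrete flavour of the heuristic side, the following is a minimal nearest-neighbour construction heuristic for a capacitated VRP; it is a generic textbook-style sketch with hypothetical data, not one of the algorithms developed in the thesis.

```python
# Minimal sketch: a nearest-neighbour construction heuristic for a capacitated VRP.
# Assumes node 0 is the depot, a symmetric Euclidean distance matrix, one demand per
# customer, identical vehicle capacity, and that every demand fits the capacity.
import math

def nearest_neighbour_cvrp(coords, demands, capacity):
    n = len(coords)
    dist = [[math.dist(coords[i], coords[j]) for j in range(n)] for i in range(n)]
    unvisited = set(range(1, n))
    routes = []
    while unvisited:
        route, load, current = [0], 0, 0
        while True:
            feasible = [j for j in unvisited if load + demands[j] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda j: dist[current][j])   # greedy choice
            route.append(nxt)
            load += demands[nxt]
            unvisited.discard(nxt)
            current = nxt
        route.append(0)                                           # return to depot
        routes.append(route)
    return routes

coords = [(0, 0), (2, 3), (5, 1), (6, 4), (1, 7), (8, 2)]   # depot + 5 customers
demands = [0, 4, 3, 5, 6, 4]
print(nearest_neighbour_cvrp(coords, demands, capacity=10))
```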