967 results for two-round scheme
Abstract:
A conceptually new approach is introduced for decomposing the molecular energy calculated at the density functional level of theory into a sum of one- and two-atomic energy components, and is realized in the "fuzzy atoms" framework. (Fuzzy atoms mean that the three-dimensional physical space is divided into atomic regions having no sharp boundaries but exhibiting a continuous transition from one to another.) The new scheme uses the new concept of "bond order density" to calculate the diatomic exchange energy components, and yields values unexpectedly close to those calculated with exact (Hartree-Fock) exchange for the same Kohn-Sham orbitals.
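For orientation, the general shape of a fuzzy-atoms decomposition (following the common formulation with atomic weight functions; the paper's exact working expressions may differ) is:

```latex
% Fuzzy-atoms partition of space: non-negative weight functions
% w_A(\mathbf{r}) that sum to one at every point, so every energy
% integral can be split into atomic and diatomic contributions.
\begin{align}
  w_A(\mathbf{r}) \ge 0, \qquad \sum_A w_A(\mathbf{r}) = 1
      \quad \forall\,\mathbf{r}, \\
  E = \sum_A E_A \;+\; \sum_{A<B} E_{AB}.
\end{align}
```

One-atom terms carry a single weight factor $w_A(\mathbf{r})$ under the integral, while diatomic terms (such as the exchange components discussed above) carry the product $w_A(\mathbf{r}_1)\,w_B(\mathbf{r}_2)$.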
Abstract:
In a distributed key distribution scheme, a set of servers helps a set of users in a group to securely obtain a common key. Security means that an adversary who corrupts some servers and some users has no information about the key of a non-corrupted group. In this work, we formalize the security analysis of one such scheme, which was not considered in the original proposal. We prove the scheme is secure in the random oracle model, assuming that the Decisional Diffie-Hellman (DDH) problem is hard to solve. We also detail a possible modification of that scheme and of a related one which allows us to prove the security of the schemes without assuming that a specific hash function behaves as a random oracle. As usual, this improvement in the security of the schemes comes at the cost of an efficiency loss.
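As background, the DDH assumption states that tuples (g^a, g^b, g^ab) are computationally indistinguishable from (g^a, g^b, g^c) for independent random c. A toy sketch of the two distributions (the modulus and base below are illustrative toy values, far too small to be secure, and are not parameters from the paper):

```python
# Toy sketch of the DDH assumption behind the scheme's security proof.
# p and g are illustrative toy values (hypothetical), not from the paper.
import random

p = 1000003   # a small prime, for illustration only (insecure)
g = 5         # illustrative base element

def ddh_tuple(real):
    """Return (g^a, g^b, g^c) mod p, where c = a*b for a 'real' DH
    tuple and c is random otherwise. DDH hardness means no efficient
    algorithm can tell the two distributions apart in a suitable group."""
    a = random.randrange(1, p - 1)
    b = random.randrange(1, p - 1)
    c = (a * b) % (p - 1) if real else random.randrange(1, p - 1)
    return pow(g, a, p), pow(g, b, p), pow(g, c, p)
```

Note that in the full multiplicative group modulo a small prime DDH is actually easy (e.g. via quadratic-residue tests); practical schemes work in prime-order subgroups or elliptic-curve groups.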
Abstract:
This study aimed to evaluate the effects (g a.i. L-1) of abamectin (0.02), carbaryl (1.73), sulphur (4.8), fenitrothion (0.75), methidathion (0.4), and trichlorfon (1.5) on the survival of larvae and pupae, on the oviposition of adults and on the hatching of eggs from treated Chrysoperla externa third-instar larvae from two different populations (Bento Gonçalves and Vacaria, Rio Grande do Sul State, Brazil). Morphological changes caused by abamectin to eggs laid by C. externa from the Vacaria population were evaluated by means of ultrastructural analysis. The pesticides were applied on glass plates; distilled water was used as control. For the evaluation of larval mortality, a fully randomized experimental design in a 2 x 7 (two populations x seven treatments) factorial scheme was used, whereas for the effects of the compounds on oviposition capacity and egg viability, a 2 x 4 factorial scheme was used. Carbaryl, fenitrothion, and methidathion caused 100% mortality of larvae. Abamectin reduced the hatching of eggs from treated third-instar larvae of both populations; however, this pesticide showed the highest toxicity on insects from Vacaria. The ultrastructural analysis showed that abamectin caused malformations in the micropyle and in the external surface of the chorion of C. externa eggs. Based on the total effect (E), carbaryl, fenitrothion, and methidathion are harmful to C. externa; trichlorfon is harmless to third-instar larvae, while abamectin and sulphur are harmless and slightly harmful to third-instar larvae from Bento Gonçalves and Vacaria, respectively.
Abstract:
A major constraint to agricultural production in acid soils of tropical regions is low soil P availability, due to the high adsorption capacity, the low P level in the source material and the low efficiency of P uptake and use by most modern commercially grown varieties. This study was carried out to evaluate the biomass production and P use of forage grasses on two soils fertilized with two P sources of different solubility. Two experiments were carried out, one for each soil (Cambisol and Latosol), using pots filled with 4 dm³ of soil in a completely randomized design and a 4 x 2 factorial scheme. The treatments consisted of a combination of four forage plants (Brachiaria decumbens, Brachiaria brizantha, Pennisetum glaucum and Sorghum bicolor) with two P sources (Triple Superphosphate - TSP and Arad Reactive Phosphate - ARP), with four replications. The forage grasses were harvested at pre-flowering, when dry matter weight and P concentrations were measured. From the P concentration and dry matter production, the total P accumulation was calculated. With these data, the following indices were calculated: P uptake efficiency of roots, P use efficiency, use efficiency of available P, use efficiency of applied P and agronomic efficiency. The use of the more soluble source (TSP) generally resulted in higher total dry matter and total P accumulation in the forage grasses, in both soils. For the less reactive source (ARP), the means found for the forage grasses, for P use efficiency and use efficiency of available P, were always higher when grown in the Latosol, indicating conditions favorable to the solubilization of ARP. The total dry matter of Brachiaria brizantha was generally higher, with low P uptake, accumulation and translocation, indicating good P use efficiency for both P sources and soils. The forage plants differed in their P use potential, depending on the P source applied and the soil used.
Less than 10% of the applied P was immobilized in the forage dry matter. The highest values were observed for TSP, but this was not reflected in a higher use efficiency of P from this source.
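The efficiency indices named above are not defined in the abstract; under common textbook definitions (an assumption, not the authors' formulas), they can be sketched as:

```python
# Hypothetical helpers for P-efficiency indices; common textbook
# definitions are assumed here, which may differ from the authors'.
def p_use_efficiency(dry_matter_g, p_uptake_mg):
    """Dry matter produced per unit of P accumulated in the plant."""
    return dry_matter_g / p_uptake_mg

def agronomic_efficiency(yield_fert_g, yield_control_g, p_applied_mg):
    """Extra dry matter produced per unit of P applied."""
    return (yield_fert_g - yield_control_g) / p_applied_mg

def applied_p_recovery(uptake_fert_mg, uptake_control_mg, p_applied_mg):
    """Fraction of the applied P recovered in the plant tissue."""
    return (uptake_fert_mg - uptake_control_mg) / p_applied_mg

# Illustrative (invented) pot values: a recovery below 0.10 mirrors the
# "less than 10% of the applied P immobilized" finding.
recovery = applied_p_recovery(95.0, 15.0, 1000.0)  # 0.08
```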
Abstract:
Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premise, the type of material which is examined, the context of discovery and the questions that are addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' of application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S.
Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach of the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit, (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan)
Abstract:
OBJECTIVE: The purpose of the present study was to submit the same materials that were tested in the round robin wear test of 2002/2003 to the Alabama wear method. METHODS: Nine restorative materials, seven composites (belleGlass, Chromasit, Estenia, Heliomolar, SureFil, Targis, Tetric Ceram), one amalgam (Amalcap) and one ceramic (IPS Empress), were submitted to the Alabama wear method for localized and generalized wear. The test centre did not know which brands they were testing. Both volumetric and vertical loss were determined with an optical sensor. After completion of the wear test, the raw data were sent to IVOCLAR for further analysis. The statistical analysis of the data included logarithmic transformation, the calculation of relative ranks of each material within each test centre, measures of agreement between methods, the discrimination power and coefficient of variation of each method, as well as measures of the consistency and global performance of each material. RESULTS: Relative ranks of the materials varied tremendously between the test centres. When all materials were taken into account and the test methods compared with each other, only ACTA agreed reasonably well with two other methods, i.e. OHSU and ZURICH. On the other hand, MUNICH did not agree with the other methods at all. The ZURICH method showed the lowest discrimination power; ACTA, IVOCLAR and ALABAMA localized showed the highest. Material-wise, the best global performance was achieved by the leucite-reinforced ceramic material Empress, which was clearly ahead of belleGlass, SureFil and Estenia. In contrast, Heliomolar, Tetric Ceram and especially Chromasit demonstrated a poor global performance. The best consistency was achieved by SureFil, Tetric Ceram and Chromasit, whereas the consistency of Amalcap and Heliomolar was poor. When comparing the laboratory data with clinical data, a significant agreement was found for the IVOCLAR and ALABAMA generalized wear methods.
SIGNIFICANCE: As the different wear simulator settings measure different wear mechanisms, it seems reasonable to combine at least two different wear settings to assess the wear resistance of a new material.
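Two of the statistics mentioned above, the coefficient of variation and the within-centre relative ranks, can be sketched as follows; the wear values in the example are invented for illustration, not data from the study:

```python
# Sketch of two statistics used in the analysis; the example wear
# values are illustrative, not measurements from the study.
import statistics

def coefficient_of_variation(values):
    """Sample CV = std dev / mean; a lower CV means a more repeatable method."""
    return statistics.stdev(values) / statistics.mean(values)

def relative_ranks(wear_by_material):
    """Rank materials within one test centre (1 = least wear)."""
    ordered = sorted(wear_by_material, key=wear_by_material.get)
    return {material: i + 1 for i, material in enumerate(ordered)}

# e.g. three repeated wear measurements from one method:
cv = coefficient_of_variation([10.0, 20.0, 30.0])  # 0.5
```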
Abstract:
Outgoing radiation is introduced in the framework of classical predictive electrodynamics, using the Lorentz-Dirac equation as a subsidiary condition. In a perturbative scheme in the charges, the first radiative self-terms of the accelerations, momentum and angular momentum of a two-charge system without external field are calculated.
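For reference, the Lorentz-Dirac equation has, in one common convention (Gaussian units, metric signature $(+,-,-,-)$; the paper's notation may differ), the form:

```latex
% Lorentz-Dirac equation for a charge q: external force plus the
% radiation-reaction (self) force.
\[
  m\,a^{\mu}
  = F^{\mu}_{\mathrm{ext}}
  + \frac{2q^{2}}{3c^{3}}
    \left(
      \frac{da^{\mu}}{d\tau}
      + \frac{a^{\nu}a_{\nu}}{c^{2}}\,v^{\mu}
    \right),
\]
```

where $v^{\mu}$ and $a^{\mu}$ are the four-velocity and four-acceleration. The self-force is orthogonal to $v^{\mu}$: since $v\cdot v = c^{2}$ and $v\cdot a = 0$, differentiating gives $v\cdot\dot a = -a\cdot a$, so the bracketed term contracts with $v_{\mu}$ to zero.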
Abstract:
Following the success of the first round table in 2001, the Swiss Proteomic Society has organized two additional special events during its last two meetings: a proteomic application exercise in 2002 and a round table in 2003. The main objective of such events is to bring together, around a challenging topic in mass spectrometry, two groups of specialists: those who develop and commercialize mass spectrometry equipment and software, and expert MS users in peptidomics and proteomics studies. The first round table (Geneva, 2001), entitled "Challenges in Mass Spectrometry", was supported by brief oral presentations that stressed critical questions in the field of MS development or applications (Stöcklin and Binz, Proteomics 2002, 2, 825-827). The topics addressed were (i) direct analysis of complex biological samples; (ii) status and perspectives of MS investigations of noncovalent peptide-ligand interactions; (iii) whether it is more appropriate to have complementary instruments rather than one universal instrument; (iv) standardization and improvement of MS signals for protein identification; (v) what the new generation of equipment would be; and finally (vi) how to keep MS hardware and software up-to-date and accessible to all. For the SPS'02 meeting (Lausanne, 2002), a full-session alternative event, the "Proteomic Application Exercise", was proposed. Two different samples were prepared and sent to the participants: 100 microg of snake venom (a complex mixture of peptides and proteins) and 10-20 microg of an almost pure recombinant polypeptide derived from the shrimp Penaeus vannamei carrying a heterogeneous post-translational modification (PTM). Among the 15 participants that received the samples blind, eight returned results, and most of them were asked to present their results at the congress, emphasizing the strategy, the manpower and the instrumentation used (Binz et al., Proteomics 2003, 3, 1562-1566).
It appeared that for the snake venom extract, the quality of the results was not particularly dependent on the strategy used, as all approaches allowed the identification of a certain number of protein families. The genus of the snake was identified in most cases, but the species remained ambiguous. Surprisingly, the precise identification of the almost pure recombinant polypeptide proved much more complicated than expected, as only one group reported the full sequence. Finally, the SPS'03 meeting reported here included a round table on the difficult and challenging task of "Quantification by Mass Spectrometry", a discussion sustained by four selected oral presentations on the use of stable isotopes, electrospray ionization versus matrix-assisted laser desorption/ionization approaches to quantify peptides and proteins in biological fluids, the handling of differential two-dimensional liquid chromatography tandem mass spectrometry data resulting from high-throughput experiments, and the quantitative analysis of PTMs. During these three events at SPS meetings, the impressive quality and quantity of exchanges between the developers and providers of mass spectrometry equipment and software, expert users and the audience were a key element of the success of these fruitful events and have paved the way for future round tables and challenging exercises at SPS meetings.
Abstract:
The impact of round-the-clock cerebrospinal fluid (CSF) Gram stain on overnight empirical therapy for suspected central nervous system (CNS) infections was investigated. All consecutive overnight CSF Gram stains between 2006 and 2011 were included. The impact of a positive or a negative test on empirical therapy was evaluated and compared to other clinical and biological indications based on institutional guidelines. Bacterial CNS infection was documented in 51/241 suspected cases. The overnight CSF Gram stain was positive in 24/51. Upon validation, there were two false-positive and one false-negative results. The sensitivity and specificity were 41 and 99 %, respectively. All patients but one had indications for empirical therapy other than the Gram stain alone. Upon obtaining the Gram result, empirical therapy was modified in 7/24 cases, including the addition of an appropriate agent (1), the addition of unnecessary agents (3) and the simplification of unnecessary combination therapy (3/11). Among 74 cases with a negative CSF Gram stain and without formal indication for empirical therapy, antibiotics were withheld in only 29. Round-the-clock CSF Gram stain had a low impact on overnight empirical therapy for suspected CNS infections and was associated with several misinterpretation errors. Clinicians showed little confidence in CSF direct examination for simplifying or withholding therapy before definite microbiological results.
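The reported specificity can be reproduced from the abstract's counts (241 suspected cases, 51 documented infections, two false positives upon validation); the full 2x2 table behind the sensitivity figure is not given, so the helpers below only demonstrate the standard definitions:

```python
# Standard diagnostic-test definitions; counts in the demo are taken
# from the abstract where available.
def sensitivity(tp, fn):
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# 241 suspected - 51 documented = 190 cases without bacterial CNS
# infection; with 2 false positives, 188 are true negatives.
print(round(specificity(188, 2) * 100))  # 99, matching the reported value
```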
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. Two new distance transforms for gray-level images are presented. As a new application for distance transforms, they are applied to gray-level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modifications, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (the DTOCS) and a real-valued distance transform (the EDTOCS) on gray-level images. Both distance transforms, the DTOCS and the EDTOCS, require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3-10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance-function algorithms (GRAYMAT etc.) find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way: it gives a weighted version of the chessboard distance map in which the weights are not constant but are the gray-value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and the EDTOCS is that the EDTOCS calculates these gray-level differences in a different way: it propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Distance transforms are commonly used for feature extraction in pattern recognition and learning; their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group compares the DTOCS distance to the binary-image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. of the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are shown. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
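The DTOCS itself is defined in the thesis; as a baseline, the classic binary two-pass algorithm of Rosenfeld and Pfaltz that it extends can be sketched as follows (chessboard metric; the gray-level extension replaces the unit increment with gray-value differences):

```python
# Classic two-pass chessboard distance transform on a binary image
# (Rosenfeld & Pfaltz style). 0 marks feature pixels (distance 0),
# 1 marks background pixels whose distance is to be computed.
INF = 10**9

def chessboard_dt(img):
    h, w = len(img), len(img[0])
    d = [[0 if img[y][x] == 0 else INF for x in range(w)] for y in range(h)]
    # Forward pass: scan top-left to bottom-right, using the four
    # neighbours already visited in this scan order.
    for y in range(h):
        for x in range(w):
            for dy, dx in ((-1, -1), (-1, 0), (-1, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + 1)
    # Backward pass: scan bottom-right to top-left with the mirrored mask.
    for y in range(h - 1, -1, -1):
        for x in range(w - 1, -1, -1):
            for dy, dx in ((1, 1), (1, 0), (1, -1), (0, 1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    d[y][x] = min(d[y][x], d[ny][nx] + 1)
    return d
```

As in the thesis, only two buffers are involved: the input mask and the distance map being updated in place over the two passes.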
Abstract:
Modern sophisticated telecommunication devices require ever more comprehensive testing to ensure quality. The number of test cases needed to ensure sufficient test coverage has increased rapidly, and this demand can no longer be met by manual testing alone. New agile development models also require execution of all test cases with every iteration. This has led manufacturers to use test automation more than ever to achieve adequate testing coverage and quality. This thesis is divided into three parts. The evolution of cellular networks is presented at the beginning of the first part, which also examines software testing, test automation and the influence of the development model on testing. The second part describes the process used to implement a test automation scheme for functional testing of the LTE core network MME element; agile development models and the Robot Framework test automation tool were used in the implementation. In the third part, two alternative models are presented for integrating this test automation scheme into a continuous integration process. As a result, the test automation scheme for functional testing was implemented, and almost all new functional-level test cases can now be automated with this scheme. In addition, two models for integrating the scheme into a wider continuous integration pipeline were introduced. The shift in testing from a traditional waterfall model to a new agile development based model also proved successful.
Abstract:
Round timber is widely used in civil construction, serving as beams, columns, foundations and poles for power distribution, among other applications, with the advantage of not requiring processing, unlike sawn lumber. The structural design of round timber requires determining the elastic properties, mainly the modulus of elasticity. The Brazilian standards governing the determination of the stiffness and strength of round timber have been in effect for over twenty years with no technical review. Because round timber generally presents an axis with non-zero curvature, it may exhibit different values of modulus of elasticity depending on the position of the element in the bending test. This study aims to analyze the effect of the position of Eucalyptus grandis round timber on the flexural modulus of elasticity. The three-point bending test was evaluated in two different positions, based on the longitudinal rotation of the round timber element. The results revealed that at least two different positions of the round timber element are desirable to obtain a significant modulus of elasticity.
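For a centrally loaded three-point bending test of a round element, the flexural modulus follows from elementary beam theory; the load, span, deflection and diameter in the example are illustrative values, not measurements from the study:

```python
# Beam-theory sketch for the three-point bending modulus of a round
# specimen; example inputs are illustrative, not data from the study.
import math

def moe_three_point(load_n, span_m, deflection_m, diameter_m):
    """E = F*L^3 / (48*delta*I), with I = pi*d^4/64 for a circular
    cross-section (neglecting shear deformation and taper)."""
    inertia = math.pi * diameter_m ** 4 / 64.0
    return load_n * span_m ** 3 / (48.0 * deflection_m * inertia)

# e.g. a 0.20 m diameter pole on a 4 m span, 5 kN at midspan,
# 10 mm midspan deflection:
e_pa = moe_three_point(5000.0, 4.0, 0.010, 0.20)  # on the order of 8.5 GPa
```

Because the curvature of the axis changes the effective lever arms, rotating the element between tests, as the study does, can change the measured deflection and hence the apparent E.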
Abstract:
This paper describes the use of a panel of antibodies (CD117, CD3, CD79a, CD45, cytokeratin, vimentin and E-cadherin) on formalin-fixed, paraffin-embedded sections of canine cutaneous round cell tumours. The tumours were diagnosed by histology and histochemical stains and included 107 mast cell tumours, 31 cutaneous histiocytomas, two localized histiocytic sarcomas, 21 cutaneous lymphomas, three plasma cell tumours, one transmissible venereal tumour and seven unclassified round cell tumours. The histologic diagnosis was modified in 39.5% of the 172 neoplasms. Staining for CD45 and E-cadherin was variable; therefore, the final diagnoses of cutaneous histiocytoma and localized histiocytic sarcoma were made based on histology in association with negative results for CD3, CD79a, CD117 and cytokeratin. The cellular origin of the unclassified round cell tumours was defined in all cases. Cutaneous B-cell lymphomas and plasma cell tumours were CD79a-positive and could be distinguished from each other by their morphological characteristics. Mast cell tumours and T-cell lymphomas were CD117- and CD3-positive, respectively. Positive staining for vimentin and negative staining for CD3, CD79a, CD117 and cytokeratin favoured the diagnosis of transmissible venereal tumour. Thus, the final diagnosis of cutaneous round cell tumours should be based on the interpretation of immunohistochemical results together with the cellular morphology observed by histology. More studies to optimize specific markers in formalin-fixed, paraffin-embedded tissues (especially for histiocytes) are required for the definitive diagnosis of round cell tumours in dogs.
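The marker logic described above can be summarized as a hypothetical decision sketch; as the paper stresses, a real diagnosis also requires cellular morphology, which this toy function does not capture:

```python
# Hypothetical decision sketch of the marker panel described above.
# Morphology (and markers like CD45/E-cadherin, which were variable)
# must still be weighed by a pathologist; this is illustration only.
def classify_round_cell_tumour(markers):
    """markers: dict mapping marker name -> bool (positive staining)."""
    if markers.get("CD117"):
        return "mast cell tumour"
    if markers.get("CD3"):
        return "T-cell lymphoma"
    if markers.get("CD79a"):
        return "B-cell lymphoma or plasma cell tumour (morphology decides)"
    if markers.get("vimentin") and not any(
            markers.get(m) for m in ("CD3", "CD79a", "CD117", "cytokeratin")):
        return "transmissible venereal tumour (favoured)"
    return "histiocytic tumour suspected: correlate with histology"
```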
Abstract:
Both atom localization and Raman cooling, considered in this thesis, reflect recent progress in the area of all-optical methods. We focus on the two-dimensional (2D) case, using a four-level tripod-type atomic scheme both for atom localization within an optical half-wavelength and for efficient subrecoil Raman cooling. In the first part, we discuss the principles of 1D atom localization, accompanied by an example based on the measurement of a spontaneously emitted photon. Modifying this example, one achieves sub-wavelength localization of a three-level Λ-type atom by measuring the population in its upper state. We go further and obtain 2D sub-wavelength localization for a four-level tripod-type atom. The upper-state population is classified according to its spatial distribution, which forms structures such as spikes, craters and waves. The second part of the thesis is devoted to Raman cooling. The cooling process is controlled by a sequence of velocity-selective transfers from one ground state to another. So far, 1D deep subrecoil cooling has been carried out with sequences of square or Blackman pulses applied to Λ-type atoms. We instead discuss the transfer of atoms by stimulated Raman adiabatic passage (STIRAP), which provides robustness against variations in the pulse duration as long as the cooling time does not play a critical role. A tripod-type atomic scheme is used for 2D Raman cooling, allowing one to increase the efficiency and simplify the realization of the cooling.
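In STIRAP through a Λ subsystem, transfer proceeds via the dark state; in standard notation with pump and Stokes Rabi frequencies $\Omega_P$, $\Omega_S$ (the usual convention, assumed here rather than taken from the thesis):

```latex
% Dark state of a Lambda system: a superposition of the two ground
% states with no admixture of the lossy excited state.
\[
  \lvert D \rangle
  = \frac{\Omega_S \,\lvert g_1 \rangle - \Omega_P \,\lvert g_2 \rangle}
         {\sqrt{\Omega_P^{2} + \Omega_S^{2}}}.
\]
```

With the counterintuitive pulse ordering (Stokes before pump), $\lvert D\rangle$ rotates adiabatically from $\lvert g_1\rangle$ to $\lvert g_2\rangle$ without populating the excited state, which is the robustness against pulse-duration variations exploited above; a tripod scheme adds a second dark state, enabling the 2D generalization.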