950 results for Symmetric Even Graphs
Abstract:
Complexes of the type {[(pyS)Ru(NH3)4]2-μ-L}^n, where pyS = 4-mercaptopyridine, L = 4,4'-dithiodipyridine (pySSpy), pyrazine (pz) or 1,4-dicyanobenzene (DCB), and n = +4 and +5 for the fully reduced and mixed-valence complexes, respectively, were synthesized and characterized. Electrochemical data showed that there is electronic communication between the metal centers, with comproportionation constants of 33.2, 1.30 × 10^8 and 5.56 × 10^5 for L = pySSpy, pz and DCB, respectively. It was also observed that the electronic coupling between the metal centers is affected by the π-back-bonding interaction toward the pyS ligand. Raman spectroscopy showed a dependence of the intensity of the vibrational modes on the exciting radiation, supporting the assignments of the electronic transitions. The degree of electronic communication between the metal centers through the bridging ligands suggests that these systems can serve as molecular wire materials.
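For context, the comproportionation constant quoted above is conventionally related to the separation ΔE between the two one-electron redox waves; this is a standard mixed-valence relation, not stated in the abstract, and the 59 mV factor assumes T = 298 K:

    % Comproportionation equilibrium: [II,II] + [III,III] <=> 2 [II,III]
    \[
      K_c \;=\; \exp\!\left(\frac{F\,\Delta E}{RT}\right) \;\approx\; 10^{\,\Delta E / 59\ \mathrm{mV}}
    \]

Under this relation, for example, K_c = 1.30 × 10^8 for the pyrazine bridge would correspond to a wave separation of roughly 0.48 V.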
Abstract:
OBJECTIVE: To verify whether the buccal and posterior corridor widths differ between cases treated with extraction of one premolar and cases treated with extraction of four premolars. METHODS: Using posed-smile photographs of 23 Class II subdivision patients treated with extraction of one premolar and 25 Class I and Class II subdivision patients treated with extraction of four premolars, the percentage buccal and posterior corridor widths were calculated. The two extraction protocols were compared with respect to buccal and posterior corridor width by independent t tests. RESULTS: There was no statistically significant difference in buccal or posterior corridor width between patients treated with symmetric and asymmetric extractions. CONCLUSION: The buccal and posterior corridors did not differ between the evaluated extraction protocols.
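A minimal sketch of the group comparison described above, using an independent two-sample t test on hypothetical corridor-width percentages (the study's actual measurements are not reproduced here):

    import numpy as np
    from scipy import stats

    # Hypothetical percentage buccal-corridor widths for the two extraction groups
    one_premolar = np.array([10.2, 11.5, 9.8, 12.1, 10.9, 11.3])
    four_premolars = np.array([10.5, 11.0, 10.1, 11.8, 10.4, 11.1])

    # Independent (two-sample) t test, as used in the study
    t_stat, p_value = stats.ttest_ind(one_premolar, four_premolars)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a p-value above 0.05 would indicate no significant difference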
Abstract:
Although infective endocarditis (IE) has been described in reports dating from the Renaissance, the diagnosis remains challenging and the outcome often surprising. Over time, diagnostic criteria have been updated and validated to reduce misdiagnosis. Some risk factors and the epidemiology have shown dynamic changes as degenerative valvular disease has become more predominant in developed countries and the mean age of the affected population has increased. Although streptococci are well known as etiologic agents, some groups, although rare, have been increasingly reported (e.g., Streptococcus milleri). Intracardiac complications of IE are common, carry a worse prognosis, and frequently require surgical treatment. We report the case of a middle-aged diabetic man who presented with prolonged fever, weight loss, and ultimately severe dyspnea. IE was diagnosed based on a new valvular regurgitation murmur, a positive blood culture for Streptococcus anginosus, an echocardiographic finding of an aortic valve vegetation, fever, and pulmonary thromboembolism. Despite an appropriate antibiotic regimen, the patient died. Autopsy findings showed a vegetation attached to a bicuspid aortic valve with an associated septal abscess and a fistula from the left ventricle and aortic root connecting with the pulmonary artery. A large thrombus was adherent to the pulmonary artery trunk, and pulmonary septic thromboemboli were also identified.
Abstract:
Modern GPUs are well suited to intensive computational tasks and massive parallel computation. Sparse matrix-vector multiplication and the sparse triangular solve are among the most important and heavily used kernels in scientific computing, and several challenges in developing high-performance implementations of these two modules are investigated. The main interest is to solve linear systems derived from elliptic equations discretized with triangular elements. The resulting linear system has a symmetric positive definite matrix. The sparse matrix is stored in the compressed sparse row (CSR) format. A CUDA algorithm is proposed to execute the matrix-vector multiplication directly on the CSR format. A dependence-tree algorithm is used to determine which unknowns the triangular solver can compute in parallel. To increase the number of parallel threads, a graph coloring algorithm is applied to reorder the mesh numbering in a pre-processing phase. The proposed method is compared with available parallel and serial libraries. The results show that the proposed method reduces the computational cost of the matrix-vector multiplication, and the pre-processing associated with the triangular solver needs to be executed only once. The conjugate gradient method was implemented and showed a similar convergence rate for all compared methods, while the proposed method achieved significantly smaller execution times.
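A minimal serial NumPy sketch of the CSR matrix-vector product, illustrating the indexing the format implies (the paper's actual CUDA kernel is not reproduced; on the GPU each row would typically be handled by one thread or warp):

    import numpy as np

    def csr_matvec(values, col_idx, row_ptr, x):
        """y = A @ x with A stored in compressed sparse row (CSR) format.

        values  : nonzero entries of A, row by row
        col_idx : column index of each entry in values
        row_ptr : index into values where each row starts (length n_rows + 1)
        """
        n_rows = len(row_ptr) - 1
        y = np.zeros(n_rows)
        for i in range(n_rows):                      # on a GPU: one thread/warp per row
            start, end = row_ptr[i], row_ptr[i + 1]
            y[i] = np.dot(values[start:end], x[col_idx[start:end]])
        return y

    # 3x3 symmetric positive definite example
    values  = np.array([4.0, 1.0, 1.0, 3.0, 1.0, 1.0, 2.0])
    col_idx = np.array([0,   1,   0,   1,   2,   1,   2])
    row_ptr = np.array([0, 2, 5, 7])
    x = np.array([1.0, 2.0, 3.0])
    print(csr_matvec(values, col_idx, row_ptr, x))   # A @ x = [6, 10, 8]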
Abstract:
The behavior of composed Web services depends on the results of the invoked services; unexpected behavior of one of the invoked services can threaten the correct execution of an entire composition. This paper proposes an event-based approach to black-box testing of Web service compositions based on event sequence graphs, which are extended by facilities to deal not only with service behavior under regular circumstances (i.e., where cooperating services are working as expected) but also with their behavior in undesirable situations (i.e., where cooperating services are not working as expected). Furthermore, the approach can be used independently of artifacts (e.g., Business Process Execution Language) or type of composition (orchestration/choreography). A large case study, based on a commercial Web application, demonstrates the feasibility of the approach and analyzes its characteristics. Test generation and execution are supported by dedicated tools. In particular, the use of an enterprise service bus for test execution is noteworthy and differs from other approaches. The results of the case study suggest that the new approach can detect faults systematically, performing properly even with complex and large compositions. Copyright © 2012 John Wiley & Sons, Ltd.
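A minimal Python sketch of the idea behind event sequence graphs: legal event pairs form the edges, complete entry-to-exit sequences exercise regular behavior, and event pairs absent from the graph are candidates for testing undesirable situations (the events and names are illustrative, not taken from the case study):

    # Event sequence graph as an adjacency map: edges are legal event pairs.
    # Pairs *not* in the graph can be used to provoke undesirable situations.
    ESG = {
        "start":  ["login"],
        "login":  ["search", "logout"],
        "search": ["order", "logout"],
        "order":  ["logout"],
        "logout": [],
    }

    def entry_exit_sequences(graph, entry="start", exit_="logout"):
        """Enumerate simple entry-to-exit paths (for this small graph they also cover every edge)."""
        paths, stack = [], [(entry, [entry])]
        while stack:
            node, path = stack.pop()
            if node == exit_:
                paths.append(path)
                continue
            for nxt in graph[node]:
                if nxt not in path:          # keep paths simple in this sketch
                    stack.append((nxt, path + [nxt]))
        return paths

    for p in entry_exit_sequences(ESG):
        print(" -> ".join(p))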
Abstract:
Homing endonucleases are rare-cutting enzymes that cleave DNA at a site near their own location, preferentially in alleles lacking the homing endonuclease gene (HEG). By cleaving HEG-less alleles, the homing endonuclease can mediate the transfer of its own gene to the cleaved site via a process called homing, which involves double-strand break repair. Via homing, HEGs are efficiently transferred into new genomes when horizontal exchange of DNA occurs between organisms. Group I introns are intervening sequences that can catalyse their own excision from the unprocessed transcript without the need for any proteins. They are widespread, occurring in eukaryotes and prokaryotes as well as in their viruses. Many group I introns encode a HEG within them that also confers mobility to the intron and mediates the combined transfer of the intron/HEG to intronless alleles via homing. Bacteriophage T4 contains three such group I introns and at least 12 freestanding HEGs in its genome. The majority of phages besides T4 do not contain any introns, and freestanding HEGs are also scarcely represented among other phages. In the first paper we looked into why group I introns are so rare in phages related to T4, in spite of the fact that they can spread between phages via homing. We have identified the first phage besides T4 that contains all three T-even introns, and we have shown that homing of at least one of the introns has occurred recently between some of the phages in nature. We also show that intron homing can be highly efficient between related phages if two phages infect the same bacterium, but that there also exist counteracting mechanisms that can restrict the spread of introns between phages. In the second paper we have looked at how the presence of introns can affect gene expression in the phage. We find that the efficiency of splicing can be affected by variation in translation of the upstream exon for all three introns in T4. Furthermore, we find that splicing is also compromised upon infection of stationary-phase bacteria. This is the first time that the efficiency of self-splicing of group I introns has been coupled to environmental conditions, and the potential effect of this on phage viability is discussed. In the third paper we have characterised two novel freestanding homing endonucleases that in some T-even-like phages replace two of the putative HEGs in T4. We also present a new theory on why it is a selective advantage for freestanding phage homing endonucleases to cleave both HEG-containing and HEG-less genomes.
Abstract:
In this work we aim to propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMR) collected in many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies that avoid the bias carried by aggregated analyses. Starting from collecting disease counts and calculating expected disease counts by means of reference population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low either because of the low population underlying the area or the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature under both the classical and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method focused on multiple testing control, without however leaving the preliminary-study perspective that an analysis of SMR indicators is asked to serve. We implement control of the FDR, a quantity largely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value provide weak power in small areas, where the expected number of disease cases is small. Moreover, tests cannot be assumed to be independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous. The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical full Bayesian model for FDR estimation in testing many null hypotheses of absence of risk. We use concepts of Bayesian models for disease mapping, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood, typical of a hierarchical Bayesian model, has the advantage of evaluating a single test (i.e. a test in a single area) by means of all observations in the map under study, rather than just the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model aims to estimate the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of b_i's relative to areas declared at high risk (where the null hypothesis is rejected) by averaging the b_i's themselves. The FDR-hat can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that the FDR-hat does not exceed a prefixed value; we call these FDR-hat based decision (or selection) rules.
The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate: over-estimation of the FDR causes a loss of power, while under-estimation produces a loss of specificity. Moreover, our model has the interesting feature of still being able to provide an estimate of the relative risk values, as in the Besag, York and Mollié model (1991). A simulation study was set up to evaluate the model performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of estimation of the relative risks. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true, and the risk level in the latter areas. In summarizing the simulation results we always consider the FDR estimation in sets constituted by all areas whose b_i falls below a threshold t. We show graphs of the FDR-hat and the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. Varying the threshold, we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (by the closeness between FDR-hat and true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against the FDR-hat we can check the sensitivity and specificity of the corresponding FDR-hat based decision rules. To investigate the over-smoothing of the relative risk estimates we compare box-plots of such estimates in high-risk areas (known by simulation), obtained both with our model and with the classic Besag, York and Mollié model. All the summary tools are worked out for all simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence a conservative FDR control) in the scenarios with small areas, low risk levels and spatially correlated risks, which are our primary focus. In such scenarios we obtain good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat based decision rules is generally low but the specificity is high. In these scenarios a selection rule based on FDR-hat = 0.05 or FDR-hat = 0.10 can be suggested. In cases where the number of true alternative hypotheses (the number of true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and an FDR-hat = 0.15 based decision rule gains power while maintaining high specificity. On the other hand, in scenarios with non-small areas and non-small risk levels the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of an FDR-hat = 0.05 based decision rule. In such scenarios decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.1 cannot be suggested because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in scenarios with non-small areas and large risk levels. A case study is finally presented to show how the method can be used in epidemiology.
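A minimal sketch of the FDR-hat selection rule described above, assuming the posterior null probabilities b_i have already been extracted from the MCMC output (variable names are illustrative):

    import numpy as np

    def fdr_hat_selection(b, alpha=0.05):
        """Select high-risk areas so that the estimated FDR stays at or below alpha.

        b     : posterior probabilities of the null hypothesis (absence of risk),
                one per area, e.g. averaged over MCMC samples
        alpha : target estimated FDR
        Returns the indices of the selected areas and the corresponding FDR estimate.
        """
        order = np.argsort(b)                    # most promising areas first (smallest b_i)
        running_fdr = np.cumsum(b[order]) / np.arange(1, len(b) + 1)   # FDR-hat = mean of selected b_i
        k = np.searchsorted(running_fdr, alpha, side="right")          # largest set with FDR-hat <= alpha
        return order[:k], (running_fdr[k - 1] if k else 0.0)

    b = np.array([0.01, 0.03, 0.20, 0.65, 0.90])   # hypothetical posterior null probabilities
    areas, fdr = fdr_hat_selection(b, alpha=0.10)
    print(areas, round(fdr, 3))                    # areas 0, 1 and 2 are selected, with FDR-hat = 0.08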
Abstract:
Compared with other mature engineering disciplines, fracture mechanics of concrete is still a developing field, and it is very important for structures such as bridges that are subject to dynamic loading. A historical overview of work done in the field is provided, and then the project is presented. The project presents an application of the Digital Image Correlation (DIC) technique for the detection of cracks at the surface of concrete prisms (500 mm x 100 mm x 100 mm) subject to flexural loading conditions (four-point bending test). The technique provides displacement measurements over the region of interest, and from this displacement field information about the crack mouth opening displacement (CMOD) is obtained and related to the applied load. The evolution of the fracture process is shown through graphs and graphical maps of the displacement at selected steps of the loading process. The study shows that the DIC system makes it possible to detect the appearance and evolution of cracks, even before the cracks become visually detectable.
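A minimal sketch of the displacement estimation that underlies DIC: one subset of the reference image is tracked in the deformed image by maximizing the normalized cross-correlation (integer-pixel search only; a real DIC system such as the one used in the project adds sub-pixel refinement and full-field processing):

    import numpy as np

    def track_subset(ref, cur, top_left, size=15, search=5):
        """Integer-pixel displacement of one square subset via normalized cross-correlation.

        ref, cur : reference and deformed grayscale images (2-D arrays)
        top_left : (row, col) of the subset in the reference image
        size     : subset edge length in pixels
        search   : half-width of the search window in the deformed image
        """
        r0, c0 = top_left
        subset = ref[r0:r0 + size, c0:c0 + size]
        subset = (subset - subset.mean()) / subset.std()
        best, best_uv = -np.inf, (0, 0)
        for du in range(-search, search + 1):        # candidate vertical displacement
            for dv in range(-search, search + 1):    # candidate horizontal displacement
                cand = cur[r0 + du:r0 + du + size, c0 + dv:c0 + dv + size]
                cand = (cand - cand.mean()) / cand.std()
                score = np.mean(subset * cand)       # normalized cross-correlation
                if score > best:
                    best, best_uv = score, (du, dv)
        return best_uv                               # (rows down, cols right)

    # Synthetic check: shift a random "speckle pattern" by (2, 3) pixels
    rng = np.random.default_rng(0)
    ref = rng.random((100, 100))
    cur = np.roll(ref, (2, 3), axis=(0, 1))
    print(track_subset(ref, cur, (40, 40)))          # expected (2, 3)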
Abstract:
Data emerging from various studies carried out over recent years in Italy on the problem of school dropout in secondary school show that difficulty in studying mathematics is one of the most frequently reported sources of discomfort among students. Nevertheless, it is definitely unrealistic to think we can do without such knowledge in today's society: mathematics is widely taught in secondary school and is not confined to technical-scientific courses only. It is reasonable to say that, although students may choose academic courses that are apparently far removed from mathematics, all students will have to come to terms with this subject sooner or later in their lives. Among the reasons for discomfort with the study of mathematics, some concern the very nature of the subject and in particular the complex symbolic language through which it is expressed. In fact, mathematics is a multimodal system composed of oral and written verbal texts, symbolic expressions such as formulae and equations, figures and graphs. For this reason, the study of mathematics represents a real challenge for those who suffer from dyslexia: this is a constitutional condition limiting people's performance in the activities of reading and writing and, in particular, in the study of mathematical content. Here the difficulties in working with verbal and symbolic codes entail, in turn, difficulties in the comprehension of the texts from which to derive the operations that, once combined, would lead to the final solution of the problem. Information technologies may support this learning disorder effectively. However, these tools have some implementation limits that restrict their use in the study of scientific subjects. Word processors with speech synthesis are currently used to compensate for reading difficulties in humanities subjects, but they are not used in mathematics. This is because the speech synthesis (or rather the screen reader supporting it) is not able to interpret anything that is not textual, such as symbols, images and graphs. The DISMATH software, which is the subject of this project, would allow dyslexic users to read technical-scientific documents with the help of speech synthesis, to understand the spatial structure of formulae and matrices, and to write documents with technical-scientific content in a format compatible with the main scientific editors. The system uses LaTeX, a markup language for mathematical text, as its mediation system. It is set up as a LaTeX editor whose graphical interface, in line with the main commercial products, offers some additional specific functions designed to support the needs of users who are not able to manage verbal and symbolic codes on their own. LaTeX is translated in real time into standard symbolic notation and is read by the speech synthesis in natural language, in order to increase, through this bimodal representation, the ability to process information. The understanding of a mathematical formula through its reading is made possible by the deconstruction of the formula itself and its "tree" representation, which allows the logical elements composing it to be identified. Users, even without knowing the LaTeX language, are able to write whatever scientific document they need: the symbolic elements are recalled from appropriate menus and automatically translated by the software, which manages the correct syntax.
The final aim of the project, therefore, is to implement an editor enabling dyslexic people (but not only them) to manage mathematical formulae effectively, through the integration of different software tools, thus also allowing better teacher/learner interaction.
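A minimal sketch of the kind of formula deconstruction and natural-language reading described above, for a tiny fragment of LaTeX (hypothetical code, not part of DISMATH, which handles a far richer language):

    import re

    SPOKEN = {"+": "plus", "-": "minus", "=": "equals"}   # spoken names for a few symbols

    def read_aloud(tex):
        """Render a tiny LaTeX fragment as words, recursing into fraction groups."""
        m = re.search(r"\\frac\{([^{}]*)\}\{([^{}]*)\}", tex)
        if m:
            inner = (f"the fraction with numerator {read_aloud(m.group(1))}"
                     f" and denominator {read_aloud(m.group(2))}")
            return read_aloud(tex[:m.start()] + " " + inner + " " + tex[m.end():])
        return " ".join(SPOKEN.get(tok, tok) for tok in tex.split())

    print(read_aloud(r"y = \frac{a + b}{c}"))
    # y equals the fraction with numerator a plus b and denominator c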
Abstract:
Vortex dynamics in two different classes of superconductors with anisotropic unidirected pinning sites was experimentally investigated by magnetoresistivity measurements: YBCO films with unidirected twins and Nb films deposited on faceted $\mathrm{Al_2O_3}$ substrate surfaces. For the interpretation of the experimental results a theoretical model based on the Fokker-Planck equation was used. It was proved by X-ray measurements that YBCO films prepared on (001) $\mathrm{NdGaO_3}$ substrates exhibit only one twin orientation, in contrast to YBCO films grown on (100) $\mathrm{SrTiO_3}$ substrates. The magnetoresistivity measurements of the YBCO films with unidirected twin boundaries revealed the existence of two new magnetoresistivity components, which is a characteristic feature of guided vortex motion: an odd longitudinal component with respect to magnetic field sign reversal and an even transversal component. However, due to the small coherence length in YBCO and the higher density of point-like defects compared to high-quality YBCO single crystals, the strength of the isotropic point pinning was comparable with the strength of the pinning produced by the twins. This smeared out all effects caused by the pinning anisotropy. The behaviour of the odd longitudinal component was found to be independent of the transport current direction with respect to the twin planes. The magnetoresistivity measurements of the faceted Nb films demonstrated the appearance of an odd longitudinal and an even transversal component of the magnetoresistivity. The temperature and magnetic field dependences of all relevant magnetoresistivity components were measured. The angles between the average vortex velocity vector and the transport current direction, calculated from the experimental data for the different transport current orientations with respect to the facet ridges, showed that the vortices indeed moved along the facet ridges. An anomalous Hall effect, i.e. a sign change of the odd transversal magnetoresistivity, was found in the temperature and magnetic field dependences of the Hall resistivity of the samples. The theory developed by V. A. Shklovskij was used for the explanation of the experimental data and shows very good agreement with the experiment. The temperature dependence of the even longitudinal magnetoresistivity component of the samples could be fitted very well within this theoretical approach, using for the isotropic and anisotropic pinning potentials simple potentials with symmetric triangular wells whose depths were estimated from the experimental data.
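For reference, the odd and even components mentioned above are the standard symmetrizations of the measured longitudinal and transversal resistivities with respect to magnetic field reversal (a conventional definition, not spelled out in the abstract):

    \[
      \rho^{\pm}_{\parallel,\perp}(B) \;=\; \tfrac{1}{2}\,\bigl[\rho_{\parallel,\perp}(+B) \pm \rho_{\parallel,\perp}(-B)\bigr]
    \]

Guided vortex motion then shows up as a nonzero odd longitudinal component and a nonzero even transversal component, the two new signals reported here.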
Abstract:
This work deals with structure formation in poor solvent in one- and two-component polymer brushes, in which polymer chains are anchored to a substrate by grafting. Such systems show lateral structure formation, which gives rise to interesting applications. The motion of the polymers is realized by continuum Monte Carlo simulations based on CBMC algorithms and local monomer displacements. A newly developed variant of the CBMC algorithm allows the movement of inner chain segments, since the existing algorithm does not relax well the monomers close to the grafting monomer. To investigate the phase behaviour, several analysis methods are developed and adapted: these include Minkowski measures for the structural analysis of binary brushes and grafting correlations for studying the influence of grafting patterns. In one-component brushes, structure formation occurs only in the weakly grafted system; dense grafting leads to closed brushes without lateral structure. For the gradual transition between the closed and the ruptured brush, a temperature range is determined in which the transition takes place. The influence of the grafting pattern (disturbance of the formation of long-range order) on the brush configuration is evaluated with the grafting correlations. With irregular grafting, the structures formed are larger than with regular grafting and also more stable against higher temperatures. In binary systems, structures form even at dense grafting. In addition to the parameters temperature, grafting density and grafting pattern, the composition of the two components comes into play. Thus further structures are possible: with equal proportions of the two components, stripe-like lamellar patterns form, while with unequal proportions the minority component forms clusters embedded in the majority component. Even in regularly grafted systems no long-range order develops. For binary brushes, too, the grafting pattern has a great influence on structure formation. Irregular grafting patterns lead to separation of the components already at higher temperatures, but the structures formed are more irregular and somewhat larger than in regularly grafted systems. In contrast to self-consistent field theory, the simulations take fluctuations in the grafting into account and therefore show better agreement with experiment.
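A minimal sketch of a local-monomer-displacement Metropolis move for a single grafted chain, the kind of update used alongside the CBMC moves (the bead-spring energy and all parameters are illustrative, not the actual model of the thesis):

    import numpy as np

    rng = np.random.default_rng(1)

    N, K_BOND, BOND_L, T, STEP = 10, 50.0, 1.0, 1.0, 0.3   # illustrative parameters

    def bond_energy(chain):
        """Harmonic springs between neighbouring monomers (the grafted bead is chain[0])."""
        d = np.linalg.norm(np.diff(chain, axis=0), axis=1)
        return 0.5 * K_BOND * np.sum((d - BOND_L) ** 2)

    def local_move(chain):
        """Metropolis move: displace one randomly chosen non-grafted monomer."""
        i = rng.integers(1, len(chain))                 # monomer 0 stays on the substrate
        trial = chain.copy()
        trial[i] += rng.uniform(-STEP, STEP, size=3)
        if trial[i, 2] < 0.0:                           # reject moves below the substrate
            return chain
        dE = bond_energy(trial) - bond_energy(chain)
        if dE <= 0 or rng.random() < np.exp(-dE / T):   # Metropolis acceptance criterion
            return trial
        return chain

    # straight chain grafted at the origin, perpendicular to the substrate (z >= 0)
    chain = np.column_stack([np.zeros(N), np.zeros(N), np.arange(N, dtype=float)])
    for _ in range(5000):
        chain = local_move(chain)
    print("end-to-end distance:", np.linalg.norm(chain[-1] - chain[0]))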