32 results for Network Analysis Methods
Abstract:
The TCP/IP architecture has been consolidated as the standard for distributed systems. However, there is ongoing research and discussion about alternatives for the evolution of this architecture, and within this area the present work introduces the Title Model, which supports application needs through a cross-layer ontology and horizontal addressing in a next-generation Internet. From a practical viewpoint, the reduction in network cost is demonstrated for a distributed programming example in networks with layer 2 connectivity. To assess the improvement offered by the Title Model, a network analysis is presented for a Message Passing Interface benchmark that distributes a vector of integers and returns its sum. This analysis confirms that, in this environment, the proposal reduces total network traffic by 15.23%, measured in bytes.
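To make the benchmark above concrete, here is a minimal sketch of an MPI program that distributes a vector of integers across processes and reduces the partial sums, written with mpi4py; the vector length, process count and script name are illustrative assumptions, and the sketch does not implement the Title Model's horizontal addressing itself.

```python
# Minimal sketch (assumed setup): scatter a vector of integers across
# MPI ranks, sum locally, and reduce the partial sums back to rank 0.
# Run with, e.g.: mpiexec -n 4 python vector_sum.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

N = 1_000_000  # assumed vector length, not a value from the paper
if rank == 0:
    data = np.arange(N, dtype='i')       # the vector of integers
    chunks = np.array_split(data, size)  # one chunk per rank
else:
    chunks = None

local = comm.scatter(chunks, root=0)                # N ints cross the network
local_sum = int(local.sum(dtype='int64'))           # purely local computation
total = comm.reduce(local_sum, op=MPI.SUM, root=0)  # `size` partial sums back

if rank == 0:
    print(f"sum = {total}")  # equals N*(N-1)//2 for arange(N)
```

Counting the bytes such a run exchanges under TCP/IP versus a layer-2, horizontally addressed stack is the kind of comparison from which the 15.23% figure is derived.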
Abstract:
For the past half century, Latin American scholars have pointed to the emergence of new social actors as agents of social and political democratization. The first wave was characterized by the emergence of novel agents of social transformation, mainly new popular movements. The second wave, epitomized by nongovernmental organizations (NGOs), was at first celebrated as the upsurge of a new civil society but later became the target of harsh criticism. The literature often portrays this development in Latin American civil society as a trend in which second-wave actors displace those of the first wave ("NGOization") and even denounces the new civil society as rootless, depoliticized, and functional to retrenchment. Thus, supposedly, NGOization encumbers social change. The authors argue that the NGOization diagnosis is a flawed depiction of change within civil society. Rather than NGOization tied to the depoliticization and neoliberalization of civil society, Mexico City and Sao Paulo have seen a modernization of organizational ecologies, changes in the functional status of civil society and, interestingly, specialization aimed at shaping the public agenda. The authors argue that such specialization, instead of encumbering social change, brings about different repertoires of strategies and skills purposively developed for influencing policy and politics. Their argument relies on systematic comparative evidence: through network analysis, they examine the organizational ecology of civil society in Mexico City and Sao Paulo.
Abstract:
Introduction: Perineural invasion is a well-recognized form of cancer dissemination. However, it has been reported in only a few papers concerning cutaneous carcinomas (basal cell, BCC, and squamous cell, SCC), and its incidence is considered to be very low. Niazi and Lambert [Br J Plast Surg 1993; 46: 156-157] reported perineural invasion in only 0.18% of 3,355 BCCs. It is associated with high-risk subtypes, such as morphea-like BCC, as well as with an increased risk of local recurrence. No paper was found in the literature that used immunohistochemical analysis to look for perineural invasion in very aggressive skin cancers with skull-base extension. Methods: This is a retrospective review of 35 very advanced skin carcinomas with skull-base invasion (24 BCCs and 11 SCCs, operated on at a single institution from 1982 to 2000). Representative slides were evaluated immunohistochemically with anti-protein S-100 in order to highlight nerve fibers and detect perineural invasion. The results were compared to 34 controls with tumors with a good outcome, treated in the same time frame at the same institution. Results: Twelve (50.0%) of the BCCs with skull-base invasion had proven perineural invasion, as opposed to only 1 (4.6%) of the controls, a statistically significant difference (p < 0.001). Regarding SCCs, 7 aggressive tumors (63.6%) showed perineural invasion compared to only 1 (10.0%) of the controls, but this difference did not reach significance (p = 0.08) due to the small number of cases. Conclusions: In this series, immunohistochemically detected perineural invasion was shown to be very prevalent in advanced skin carcinomas. In addition, it was statistically associated with extremely aggressive BCCs with skull-base invasion. Copyright (c) 2008 S. Karger AG, Basel
Abstract:
Stimulating neural electrodes are required to deliver charge in an environment that is hostile to them. The electrodes need to maintain their electrical characteristics (charge and impedance) in vivo for neural prostheses to function properly. Here we design an implantable multi-walled carbon nanotube coating for stainless-steel substrate electrodes, targeted at wide-frequency stimulation of deep brain structures. In well-controlled acute experiments with low-frequency stimulation, we show that multi-walled carbon nanotube electrodes maintain their charge storage capacity (CSC) and impedance in vivo. The difference in average CSC (n = 4) between the in vivo (1.111 mC cm(-2)) and in vitro (1.008 mC cm(-2)) models was statistically insignificant (p = 0.715, two-tailed). We also report on the transcription levels of the pro-inflammatory cytokine IL-1 beta and the TLR2 receptor as an immediate response to low-frequency stimulation, using RT-PCR. We show that IL-1 beta is part of the inflammatory response to low-frequency stimulation, whereas TLR2 is not significantly increased in stimulated tissue compared to controls. The early stages of neuroinflammation due to the mechanical and electrical trauma induced by implants can be better understood by detecting pro-inflammatory molecules than by histological studies. Tracking such a quantitative response benefits from better analysis methods across several temporal and spatial scales. Our evaluation of these inflammatory molecules revealed that transcripts for the cytokine IL-1 beta are upregulated in response to low-frequency stimulation, whereas no modulation was observed for TLR2. This result indicates that the early response of the brain to mechanical trauma and low-frequency stimulation activates the IL-1 beta signaling cascade but not that of TLR2.
Abstract:
Objectives: The purpose of the present work was to characterize the pharmacological profile of different L. alba chemotypes and to correlate the obtained data with the presence of chemical constituents detected by phytochemical analysis. Methods: Essential oils from each L. alba chemotype (LP1-LP7) were characterized by gas chromatography-mass spectrometry (GC-MS), and extracted non-volatile compounds were analysed by HPLC and GC-MS. The anticonvulsant actions of the extracted compounds were studied in pentylenetetrazole-induced clonic seizures in mice, and their effect on motor coordination was studied using the rota-rod test in rats. Synaptosomes and synaptic membranes of the rats were examined for the influence of the LP3 chemotype extract on GABA uptake and binding. Key findings: Behavioural parameters encompassed by the pentylenetetrazole test indicated that 80% ethanolic extracts of the LP1, LP3 and LP6 L. alba chemotypes were more effective as anticonvulsant agents. Neurochemical assays using synaptosomes and synaptic membranes showed that the L. alba LP3 chemotype 80% ethanolic extract inhibited GABA uptake and GABA binding in a dose-dependent manner. HPLC analysis showed that the LP1, LP3 and LP6 80% ethanolic extracts presented a similar profile of constituents, differing from those seen in the LP2, LP4, LP5 and LP7 80% ethanolic extracts, which exhibited no anticonvulsant effect. GC-MS analysis indicated the occurrence of phenylpropanoids in methanolic fractions obtained from the LP1, LP3 and LP6 80% ethanolic extracts, and also the accumulation of inositol and flavonoids in hydroalcoholic fractions. Conclusions: Our results suggest that the anticonvulsant properties shown by L. alba might be correlated with the presence of a complex of non-volatile substances (phenylpropanoids, flavonoids and/or inositols), and also with the volatile terpenoids (beta-myrcene, citral, limonene and carvone), which have previously been validated as anticonvulsants.
Abstract:
Introduction: The aim of this study was to evaluate root canal preparation in flat-oval canals treated with either rotary instruments or the self-adjusting file (SAF), using micro-computed tomography analysis. Methods: Forty mandibular incisors were scanned before and after root canal instrumentation with rotary instruments (n = 20) or the SAF (n = 20). Changes in canal volume, surface area, and cross-sectional geometry were compared with preoperative values. Data were compared by independent-sample t test and chi(2) test between groups and by paired-sample t test within groups (alpha = 0.05). Results: Overall, area, perimeter, roundness, and major and minor diameters revealed no statistical difference between groups (P > .05). In the coronal third, the percentage of prepared root canal walls and the mean increases in volume and area were significantly higher with the SAF (92.0%, 1.44 +/- 0.49 mm(3), and 0.40 +/- 0.14 mm(2), respectively) than with rotary instrumentation (62.0%, 0.81 +/- 0.45 mm(3), and 0.23 +/- 0.15 mm(2), respectively) (P < .05). The SAF removed a dentin layer from all around the canal, whereas rotary instrumentation left substantial untouched areas. Conclusions: In the coronal third, the mean increases in area and volume of the canal, as well as the percentage of prepared walls, were significantly higher with the SAF than with rotary instrumentation. With SAF instruments, flat-oval canals were homogeneously and circumferentially prepared. The size of the SAF preparation in the apical third of the canal was equivalent to that of a #40 rotary file with a 0.02 taper. (J Endod 2011;37:1002-1007)
Abstract:
The dynamical processes that lead to the disruption of an open cluster cause its mass to decrease. To investigate such processes from the observational point of view, it is important to identify open cluster remnants (OCRs), which are intrinsically poorly populated. Because of this, distinguishing them from field-star fluctuations remains an unresolved issue. In this work, we developed a statistical diagnostic tool to distinguish poorly populated star concentrations from background field fluctuations. We use 2MASS photometry to explore one of the conditions required for a stellar group to be a physical group: producing distinct sequences in a colour-magnitude diagram (CMD). We use automated tools to (i) derive the limiting radius; (ii) decontaminate the field and assign membership probabilities; (iii) fit isochrones; and (iv) compare object and field CMDs, considering the isochrone solution, in order to verify their similarity. If the object cannot be statistically considered a field fluctuation, we derive its probable age, distance modulus, reddening and their uncertainties in a self-consistent way. As a test, we apply the tool to open clusters and comparison fields. Finally, we study the OCR candidates DoDz 6, NGC 272, ESO 435 SC48 and ESO 325 SC15. The tool is optimized to treat these low-statistics objects and to single out the best OCR candidates for studies of kinematics and chemical composition. The study of possible OCRs will certainly provide a deeper understanding of OCR properties and constraints for theoretical models, including insights into the evolution of open clusters and their dissolution rates.
Abstract:
Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hinder the follow-up of the components over time and consequently contribute to a misinterpretation of the data. In order to deal with these limitations, we introduce a very powerful statistical tool for analysing jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession-model parameters that best represent the data. In this work we present a large number of tests to validate this technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, exhaustively varying the quantities associated with the method. Our results show that even in the most challenging tests, the cross-entropy method was able to find the correct parameters to within the 1 per cent level. Even for a non-precessing jet, our optimization method successfully indicated the lack of precession.
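For readers unfamiliar with the optimizer, below is a generic sketch of the cross-entropy method for continuous minimization; the quadratic objective is a stand-in, and none of the precession-model parameters from the paper are reproduced.

```python
# Generic cross-entropy (CE) method for continuous minimization: sample
# candidates from a Gaussian, keep the elite fraction, refit the Gaussian,
# and repeat until the distribution concentrates on a minimum.
import numpy as np

def cross_entropy_minimize(objective, dim, n_samples=200, n_elite=20,
                           n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)           # mean of the sampling distribution
    sigma = np.full(dim, 5.0)    # initial (deliberately broad) spread
    for _ in range(n_iter):
        samples = rng.normal(mu, sigma, size=(n_samples, dim))
        scores = np.array([objective(s) for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]  # best candidates
        mu = elite.mean(axis=0)                        # refit the Gaussian
        sigma = elite.std(axis=0) + 1e-9               # keep sigma > 0
    return mu

# Toy usage: recover the minimum of a shifted quadratic.
best = cross_entropy_minimize(lambda x: np.sum((x - 3.0) ** 2), dim=4)
print(best)  # approaches [3, 3, 3, 3]
```

In the paper's setting, `objective` would measure the mismatch between the observed component offsets and those predicted by the precession model for a trial parameter set.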
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N(s) elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. These images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with accuracy similar to that obtained with the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As with any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
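As an illustration of the model construction described above, here is a minimal sketch of a sum-of-elliptical-Gaussians model image and the squared-difference performance function; the paper's exact six-parameter parameterization is not reproduced, so the semi-axes a and b below stand in for the eccentricity/amplitude pair.

```python
# Sketch: model image as a sum of elliptical Gaussian components, plus the
# performance function (sum of squared residuals) that the CE method minimizes.
import numpy as np

def elliptical_gaussian(x, y, x0, y0, peak, a, b, theta):
    """One component: peak position (x0, y0), peak intensity, semi-axes
    a and b, and orientation angle theta of the major axis."""
    xr = (x - x0) * np.cos(theta) + (y - y0) * np.sin(theta)
    yr = -(x - x0) * np.sin(theta) + (y - y0) * np.cos(theta)
    return peak * np.exp(-(xr / a) ** 2 - (yr / b) ** 2)

def model_image(params, shape):
    """Sum the N_s components; params is a list of 6-tuples."""
    y, x = np.mgrid[:shape[0], :shape[1]]
    return sum(elliptical_gaussian(x, y, *p) for p in params)

def performance(params, observed):
    """Quantity to minimize: squared difference between model and data."""
    return np.sum((model_image(params, observed.shape) - observed) ** 2)

# Toy usage: a two-component "jet"; the true parameters give zero residual.
truth = [(20, 20, 1.0, 4, 2, 0.3), (40, 30, 0.5, 6, 3, 1.0)]
obs = model_image(truth, (64, 64))
print(performance(truth, obs))  # 0.0
```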
Abstract:
Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses principal component analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms a system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we call tomograms. The association of tomograms (images) with eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this is fundamental for their handling and interpretation. When the data cube contains objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, extract noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a type 1 active nucleus, not known before. Furthermore, we show that it is displaced from the centre of its stellar bulge.
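A minimal sketch of the PCA tomography procedure described above, assuming a cube with two spatial axes and one spectral axis; the cube dimensions are illustrative.

```python
# PCA tomography sketch: treat each spaxel's spectrum as one observation,
# diagonalize the spectral covariance matrix, and project the data on to
# each eigenvector to obtain one tomogram (image) per principal component.
import numpy as np

def pca_tomography(cube):
    """cube shape (ny, nx, n_lambda) -> eigenvalues, eigenvectors, tomograms."""
    ny, nx, nl = cube.shape
    data = cube.reshape(ny * nx, nl)            # one spectrum per spaxel
    data = data - data.mean(axis=0)             # subtract the mean spectrum
    cov = data.T @ data / (data.shape[0] - 1)   # spectral covariance matrix
    evals, evecs = np.linalg.eigh(cov)          # symmetric eigendecomposition
    order = np.argsort(evals)[::-1]             # sort by decreasing variance
    evals, evecs = evals[order], evecs[:, order]
    tomograms = (data @ evecs).reshape(ny, nx, nl)  # tomogram k: [:, :, k]
    return evals, evecs, tomograms

# Toy usage on a random cube:
evals, evecs, tomos = pca_tomography(np.random.default_rng(0).random((30, 30, 100)))
print(evals[:3])  # variances of the first three principal components
```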
Abstract:
Introduction: Characterizing the microbial communities infecting the endodontic system in each clinical condition may help establish a correct prognosis and distinct treatment strategies. The purpose of this study was to determine the bacterial diversity in primary endodontic infections by 16S ribosomal RNA (rRNA) sequence analysis. Methods: Samples were obtained from the root canals of untreated asymptomatic teeth (n = 12) exhibiting periapical lesions; 16S rRNA bacterial genomic libraries were constructed and sequenced, and bacterial diversity was estimated. Results: A total of 489 clones were analyzed (mean, 40.7 +/- 8.0 clones per sample). Seventy phylotypes were identified, of which six were novel phylotypes belonging to the family Ruminococcaceae. The mean number of taxa per canal was 10.0, ranging from 3 to 21 per sample; 65.7% of the cloned sequences represented phylotypes for which no cultivated isolates have been reported. The most prevalent taxa were Atopobium rimae (50.0%) and Dialister invisus, Prevotella oris, Pseudoramibacter alactolyticus, and Tannerella forsythia (33.3% each). Conclusions: Although several key species predominate in endodontic samples of asymptomatic cases with periapical lesions, primary endodontic infection is characterized by a wide bacterial diversity, mostly represented by members of the phylum Firmicutes belonging to the class Clostridia, followed by the phylum Bacteroidetes. (J Endod 2011;37:922-926)
Abstract:
Sociable robots are embodied agents that are part of a heterogeneous society of robots and humans. They should be able to recognize human beings and each other, and to engage in social interactions. The use of a robotic architecture may strongly reduce the time and effort required to construct a sociable robot. Such an architecture must have structures and mechanisms that allow social interaction, behavior control, and learning from the environment. Learning processes described in the science of behavior analysis may lead to the development of promising methods and structures for constructing robots able to behave socially and learn through interactions with the environment by a process of contingency learning. In this paper, we present a robotic architecture inspired by behavior analysis. Methods and structures of the proposed architecture, including a hybrid knowledge representation, are presented and discussed. The architecture has been evaluated in the context of a nontrivial real problem: the learning of shared attention, employing an interactive robotic head. The learning capabilities of this architecture have been analyzed by observing the robot interacting with a human and the environment. The obtained results show that the robotic architecture is able to produce appropriate behavior and to learn from social interaction. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
In a recent paper, the hydrodynamic code NEXSPheRIO was used in conjunction with STAR analysis methods to study two-particle correlations as a function of Delta eta and Delta phi. The various structures observed in the data were reproduced. In this work, we discuss the origin of these structures and present new results.
Abstract:
Estimating the sizes of hard-to-count populations is a challenging and important problem that occurs frequently in social science, public health, and public policy. This problem is particularly pressing in HIV/AIDS research because estimates of the sizes of the most at-risk populations (illicit drug users, men who have sex with men, and sex workers) are needed for designing, evaluating, and funding programs to curb the spread of the disease. A promising new approach in this area is the network scale-up method, which uses information about the personal networks of respondents to make population size estimates. However, if the target population has low social visibility, as is likely to be the case in HIV/AIDS research, scale-up estimates will be too low. In this paper we develop a game-like activity, which we call the game of contacts, to estimate the social visibility of groups, and report results from a study of heavy drug users in Curitiba, Brazil (n = 294). The game produced estimates of social visibility that were consistent with qualitative expectations but of surprising magnitude. Further, a number of checks suggest that the data are of high quality. While motivated by the specific problem of population size estimation, our method could be used more broadly and adds to long-standing efforts to combine the richness of social network analysis with the power and scale of sample surveys. (C) 2010 Elsevier B.V. All rights reserved.
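For orientation, the basic scale-up estimator and the visibility correction motivated above can be written in a few lines; the numbers below are made up for illustration and do not come from the Curitiba study.

```python
# Basic network scale-up estimator: the hidden population size is roughly
# (hidden-population members known by respondents / total personal network
# size) * total population, deflated by the fraction of those members a
# respondent is actually aware of (social visibility).
def scale_up_estimate(reported_contacts, network_sizes, total_population,
                      visibility=1.0):
    naive = (sum(reported_contacts) / sum(network_sizes)) * total_population
    return naive / visibility  # visibility < 1 corrects the estimate upward

# Toy usage: 3 respondents with hypothetical counts.
est = scale_up_estimate(reported_contacts=[2, 0, 1],
                        network_sizes=[300, 250, 400],
                        total_population=1_800_000,
                        visibility=0.5)  # assume half of contacts are visible
print(round(est))
```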
Abstract:
Nowadays, the use of noninvasive methods of diagnosis has increased, driven by the demands of a population that requires fast, simple and painless exams. These methods have become possible because of advances in technology that provide the necessary means of collecting and processing signals. New methods of analysis, such as nonlinear dynamics, have been developed to understand the complexity of voice signals and to explore their dynamic nature. The purpose of this paper is to characterize healthy and pathological voice signals with the aid of relative entropy measures. The phase-space reconstruction technique is also used as a way to select interesting regions of the signals. Three groups of samples were used: one from healthy individuals and the other two from people with vocal fold nodules or Reinke's edema. All are recordings of the sustained vowel /a/ in Brazilian Portuguese. The paper shows that nonlinear dynamical methods seem to be a suitable technique for voice signal analysis, owing to the chaotic component of the human voice. Relative entropy is well suited because of its sensitivity to uncertainties, since the pathologies are characterized by an increase in signal complexity and unpredictability. The results showed that the pathological groups had higher entropy values, in accordance with the other vocal acoustic parameters presented. This suggests that these techniques may improve and complement the voice analysis methods currently available to clinicians. (C) 2008 Elsevier Inc. All rights reserved.
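As a sketch of the relative-entropy comparison described above, the following computes the Kullback-Leibler divergence between histogram estimates of two signals' amplitude distributions; the binning scheme and the synthetic signals are illustrative assumptions, not the paper's protocol.

```python
# Relative entropy (Kullback-Leibler divergence) between two signals,
# estimated from amplitude histograms over a common range.
import numpy as np

def relative_entropy(signal_p, signal_q, bins=64):
    """D_KL(P || Q) between histogram estimates of two signals."""
    lo = min(signal_p.min(), signal_q.min())
    hi = max(signal_p.max(), signal_q.max())
    p, _ = np.histogram(signal_p, bins=bins, range=(lo, hi))
    q, _ = np.histogram(signal_q, bins=bins, range=(lo, hi))
    p = p / p.sum() + 1e-12   # normalize; epsilon avoids log(0)
    q = q / q.sum() + 1e-12
    return float(np.sum(p * np.log(p / q)))

# Toy usage: a noisier, less predictable signal diverges more from a clean
# reference tone, mimicking the higher entropy seen in pathological voices.
t = np.linspace(0, 1, 8000)
clean = np.sin(2 * np.pi * 120 * t)  # stand-in for a healthy sustained /a/
noisy = clean + 0.4 * np.random.default_rng(0).standard_normal(t.size)
print(relative_entropy(noisy, clean))
```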