958 results for Graph Query
Abstract:
Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimations because it does not take into account aspects relevant to networking, such as the heterogeneity in link capacity or the differences between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, and yet produces estimations of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application, in which the simple shortest-path routing algorithm is improved in such a way that it outperforms other, more advanced algorithms in terms of blocking ratio.
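For reference, the baseline measure this abstract argues is insufficient is edge betweenness centrality. A minimal sketch of how it is commonly computed, here with networkx and an illustrative inverse-capacity weighting (the topology, capacities and weighting scheme are assumptions for illustration, not the algorithm proposed in the paper):

```python
# Minimal sketch: plain vs. capacity-aware edge betweenness (illustrative only).
import networkx as nx

G = nx.Graph()
# Hypothetical topology: (node, node, capacity in Gb/s).
links = [("A", "B", 10), ("B", "C", 10), ("A", "C", 40), ("C", "D", 100)]
for u, v, cap in links:
    # Use inverse capacity as a length so high-capacity links attract shortest paths.
    G.add_edge(u, v, capacity=cap, length=1.0 / cap)

# Basic definition: counts shortest paths by hop count, ignoring link capacity.
basic = nx.edge_betweenness_centrality(G)

# Capacity-aware variant: shortest paths computed on the inverse-capacity metric.
weighted = nx.edge_betweenness_centrality(G, weight="length")

for edge in G.edges():
    print(edge, round(basic[edge], 3), round(weighted[edge], 3))
```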
Abstract:
Most network operators have considered reducing Label Switched Router (LSR) label spaces (i.e. the number of labels that can be used) as a means of simplifying the management of the underlying Virtual Private Networks (VPNs) and, hence, reducing operational expenditure (OPEX). This letter discusses the problem of reducing the label spaces in Multiprotocol Label Switched (MPLS) networks using label merging - better known as MultiPoint-to-Point (MP2P) connections. Because of their origins in IP, MP2P connections have been considered to have tree shapes with Label Switched Paths (LSPs) as branches. Due to this, previous works by many authors affirm that the problem of minimizing the label space using MP2P in MPLS - the Merging Problem - cannot be solved optimally with a polynomial algorithm (NP-complete), since it involves a hard decision problem. However, in this letter the Merging Problem is analyzed from the perspective of MPLS, and it is deduced that tree shapes in MP2P connections are irrelevant. By overriding this tree-shape consideration, it is possible to perform label merging in polynomial time. Based on how MPLS signaling works, this letter proposes an algorithm to compute the minimum number of labels using label merging: the Full Label Merging algorithm. In conclusion, we reclassify the Merging Problem as polynomial-solvable instead of NP-complete. In addition, simulation experiments confirm that, without the tree-branch selection problem, more labels can be saved.
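To make the idea of label merging concrete, here is a minimal sketch of how merging all LSPs that share the same egress (the same forwarding equivalence class) onto one label shrinks a node's label space. The data layout is an assumption for illustration; this is not the Full Label Merging algorithm of the letter:

```python
# Minimal sketch: label count at one LSR with and without MP2P label merging.
# LSPs are given as (ingress, egress) pairs traversing this LSR.
lsps = [("R1", "R9"), ("R2", "R9"), ("R3", "R9"), ("R1", "R7"), ("R4", "R7")]

# Without merging: one incoming label per LSP.
labels_without_merging = len(lsps)

# With label merging: LSPs towards the same egress (same FEC) share one label,
# so the label space shrinks to the number of distinct egress nodes.
labels_with_merging = len({egress for _, egress in lsps})

print(labels_without_merging, labels_with_merging)  # 5 2
```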
Abstract:
Fault location has been studied in depth for transmission lines due to its importance in power systems. Nowadays the problem of fault location on distribution systems is receiving special attention, mainly because of power quality regulations. In this context, this paper presents an application developed in Matlab that automatically calculates the location of a fault in a distribution power system, starting from the voltages and currents measured at the line terminal and a model of the distribution power system. The application is based on an N-ary tree structure, which is well suited to this application because of the highly branched and non-homogeneous nature of distribution systems, and has been developed for single-phase, two-phase, two-phase-to-ground, and three-phase faults. The implemented application is tested using fault data from a real electrical distribution power system.
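As a rough illustration of why an N-ary tree fits a branched feeder, each node can hold a line section together with any number of downstream branches. The structure and traversal below are assumptions for illustration only, not the paper's Matlab implementation:

```python
# Minimal sketch: an N-ary tree of feeder sections, searched depth-first.
class Section:
    def __init__(self, name, length_km, children=None):
        self.name = name                 # line section identifier
        self.length_km = length_km       # section length (illustrative attribute)
        self.children = children or []   # downstream branches (any number)

def find_section(root, name):
    """Depth-first search for a section by name, mirroring how candidate
    fault locations can be explored branch by branch."""
    if root.name == name:
        return root
    for child in root.children:
        hit = find_section(child, name)
        if hit is not None:
            return hit
    return None

feeder = Section("S0", 1.2, [Section("S1", 0.8, [Section("S3", 0.5)]),
                             Section("S2", 2.0)])
print(find_section(feeder, "S3").length_km)  # 0.5
```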
Abstract:
Development of a computer application based on a computer-vision system that can return information in response to a query image containing a particular scene or object: it recognizes the objects that appear in the image and then provides the user who made the query with information about the image content. In short, the aim is to analyse, design and build a computer-vision system capable of recognizing objects of interest in images.
Abstract:
Cross-reactivity of plant foods is an important phenomenon in allergy, with geographical variations with respect to the number and prevalence of the allergens involved in this process, whose complexity requires detailed studies. We have addressed the role of thaumatin-like proteins (TLPs) in cross-reactivity between fruit and pollen allergies. A representative panel of 16 purified TLPs was printed onto an allergen microarray. The proteins selected belonged to the sources most frequently associated with peach allergy in representative regions of Spain. Sera from two groups of well characterized patients, one with allergy to Rosaceae fruit (FAG) and another against pollens but tolerant to food-plant allergens (PAG), were obtained from seven geographical areas with different environmental pollen profiles. Cross-reactivity between members of this family was demonstrated by inhibition assays. Only 6 out of 16 purified TLPs showed noticeable allergenic activity in the studied populations. Pru p 2.0201, the peach TLP (41%), chestnut TLP (24%) and plane pollen TLP (22%) proved to be allergens of probable relevance to fruit allergy, being mainly associated with pollen sensitization, and strongly linked to specific geographical areas such as Barcelona, Bilbao, the Canary Islands and Madrid. The patients exhibited >50% positive response to Pru p 2.0201 and to chestnut TLP in these specific areas. Therefore, their recognition patterns were associated with the geographical area, suggesting a role for pollen in the sensitization of these allergens. Finally, the co-sensitizations of patients considering pairs of TLP allergens were analyzed by using the co-sensitization graph associated with an allergen microarray immunoassay. Our data indicate that TLPs are significant allergens in plant food allergy and should be considered when diagnosing and treating pollen-food allergy.
Abstract:
For some time the Andalusian Public Health System (SSPA) has been considering the management of urgent care processes through triage consultations, in both the hospital and the primary care setting, and it describes situations in which the nurse responsible for these consultations can reach a final resolution, for which she alone is directly responsible, through independent intervention and referral (Advanced Triage). At the same time, and consistently with the idea of teamwork, it raises the question of where the limits of this finalist intervention lie and which care circuits should be followed. This paper proposes a way of defining one of those situations using universally tested triage concepts, taking full advantage of the advanced-practice profile offered by the nurses of the SSPA Critical Care and Emergency Units (DCCU) and of the emerging legal and regulatory framework on standardized collaborative prescription, of which they are known to be legitimate recipients. This work stems from the vision of the professionals themselves and is our contribution to a line of institutional work that must be built on consensus.
Abstract:
The program transforms simple lines carrying information into more visual graphs, defining lanes, lane symbologies and section-dividing lines.
Abstract:
The EVS4CSCL project arises in the context of a Computer Supported Collaborative Learning (CSCL) environment. Previous UOC projects created a generic CSCL platform (CLPL) to facilitate the development of CSCL applications. A discussion forum (DF) was the first application developed over the framework. This discussion forum differed from other products on the marketplace because of its focus on the learning process. The DF covered the specification and elaboration phases of the discussion-based learning process, but the consensus phase was missing. In a learning environment, consensus is not something to be achieved but something to be tested. Such tests are commonly carried out with Electronic Voting System (EVS) tools, but a consensus test is not an assessment test: we are not evaluating our students by their answers but by their discussion activity. Our educational EVS can be used as a discussion catalyst, proposing a discussion about the results after an initial query, or it can be used after a discussion period to show how the discussion changed the students' minds (consensus). It can also be used by the teacher as a quick way to find out where a student needs reinforcement. That is important in a distance-learning environment, where there is no direct contact between teacher and student and it is difficult to detect learning gaps. In an educational environment assessment is a must, and the EVS provides direct assessment through peer usefulness evaluation and teacher marks on every query created, and indirect assessment from statistics on user activity.
Abstract:
Study of the likelihood and prevalence of patients with COPD over one year in a family medicine practice, during 2012 and the first two months of 2013. In a health-centre practice of about 1500 patients, the probabilistic evolution was studied every six months according to Laplace's theory of probability. We analyse COPD, its symptoms, etiology, clinical consultation and treatment in Family Medicine.
Abstract:
The Andalusian Public Health System Virtual Library (Biblioteca Virtual del Sistema Sanitario Público de Andalucía, BV-SSPA) provides access to health information resources and services to healthcare professionals through its Website. This virtual environment demands better knowledge of our users, both digital natives and digital immigrants, in order to satisfy their information needs while improving communication with all of them. Our objectives are: 1. to collect clients' views and expectations according to their nature as digital natives or immigrants; 2. to know our online reputation. A Collecting User Expectation Questionnaire will be built, taking into account the segmentation of the BV-SSPA users' professional groups within the Andalusian Public Health System. A pilot test will be run to check the survey dimensions and items about the practices, attitudes and knowledge of our users. Two Quality Function Deployment (QFD) matrices will enable the BV-SSPA services to be targeted to digital natives or digital immigrants according to their nature, finding the best way to satisfy their information needs. We provide feedback channels on the BV-SSPA: users have the opportunity to post feedback about the site via the 'Contact us' section and to comment on their experience. Web 2.0 acts as a shop window, providing the opportunity to show these comments; over time our online reputation will be built, but the BV-SSPA must manage its own personal branding. Web 2.0 tools are a driver of improvement because they provide a key source of insight into people's attitudes. Besides, the BV-SSPA digital identity will be analyzed through indicators such as the breakdown of major search engine referrals, top referring sites (non search engines), or top search engine referral phrases, among others. The definition of the digital native and digital immigrant profiles of the BV-SSPA, and the differences between them, will be explained by their expectations. The design of the two QFD matrices will illustrate in a single graph the requirements of both groups for tackling digital abilities and inequalities. The BV-SSPA could deliver information and services through alternative channels. On the other hand, we are developing a strategy to identify, measure and manage a digital identity, to communicate with the user and to find out our online reputation. With the use of different tools from quantitative and qualitative methodology, and the opportunities offered by Web 2.0 tools, the BV-SSPA will learn the expectations of its users as a first step towards satisfying their needs. Personalization is pivotal to the success of the site, delivering tailored content to individuals based on their recorded preferences. The valuable user research can be used during new product development and redesign. Moreover, positive interaction lets us build trust, show authenticity and foster loyalty: we improve through effort, communication and visibility.
Abstract:
Motivated by the work of Mateu, Orobitg, Pérez and Verdera, who proved inequalities of the form $T_*f\lesssim M(Tf)$ or $T_*f\lesssim M^2(Tf)$ for certain singular integral operators $T$, such as the Hilbert or the Beurling transforms, we study the possibility of establishing this type of control for the Cauchy transform along a Lipschitz graph. We show that this is not possible in general, and we give a partial positive result when the graph is substituted by a Jordan curve.
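The operators in these inequalities are not defined in the abstract; under the usual conventions (an assumption on our part, stated for orientation), $T_*f(x)=\sup_{\varepsilon>0}\big|\int_{|x-y|>\varepsilon}K(x,y)\,f(y)\,dy\big|$ is the maximal truncated singular integral associated with $T$, $Mf(x)=\sup_{r>0}\frac{1}{|B(x,r)|}\int_{B(x,r)}|f(y)|\,dy$ is the Hardy-Littlewood maximal function, and $M^2=M\circ M$ is its iteration.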
Abstract:
Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, some effort has been devoted to developing update functions in Boolean models that include recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. The results of this validation hint at an increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos into the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research. The update function confers additional realism on the model, while reducing the complexity and the solution space, thus making it easier to investigate.
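For concreteness, a threshold-based Boolean update of the general kind described above can be sketched as follows. The signed-sum rule, tie handling and threshold value are assumptions for illustration; the paper's exact function may differ:

```python
# Minimal sketch: threshold-based Boolean update on a signed regulatory network.
# edges: (regulator, target, sign) with sign +1 for promotion, -1 for repression.
edges = [("A", "B", +1), ("C", "B", -1), ("B", "C", +1), ("A", "C", +1)]
state = {"A": 1, "B": 0, "C": 1}
THRESHOLD = 0  # illustrative threshold

def step(state):
    """Synchronously update every gene from the signed sum of its active inputs."""
    incoming = {g: [] for g in state}
    for reg, tgt, sign in edges:
        incoming[tgt].append((reg, sign))
    new_state = {}
    for gene, inputs in incoming.items():
        if not inputs:                      # no regulators: keep current value
            new_state[gene] = state[gene]
            continue
        total = sum(sign * state[reg] for reg, sign in inputs)
        new_state[gene] = 1 if total > THRESHOLD else 0
    return new_state

print(step(state))  # {'A': 1, 'B': 0, 'C': 1}
```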
Abstract:
In epidemiologic studies, measurement error in dietary variables often attenuates the association between dietary intake and disease occurrence. To adjust for the attenuation caused by error in dietary intake, regression calibration is commonly used. To apply regression calibration, unbiased reference measurements are required. Short-term reference measurements for foods that are not consumed daily contain excess zeroes that pose challenges in the calibration model. We adapted the two-part regression calibration model, initially developed for multiple replicates of reference measurements per individual, to a single-replicate setting. We showed how to handle excess zero reference measurements with a two-step modeling approach, how to explore heteroscedasticity in the consumed amount with a variance-mean graph, how to explore nonlinearity with generalized additive modeling (GAM) and the empirical logit approach, and how to select covariates in the calibration model. The performance of the two-part calibration model was compared with its one-part counterpart. We used vegetable intake and mortality data from the European Prospective Investigation into Cancer and Nutrition (EPIC) study, in which the reference measurements were taken with 24-hour recalls. For each of the three vegetable subgroups assessed separately, correcting for error with an appropriately specified two-part calibration model resulted in about a three-fold increase in the strength of association with all-cause mortality, as measured by the log hazard ratio. We further found that the standard way of including covariates in the calibration model can lead to overfitting of the two-part calibration model. Moreover, the extent of adjustment for error is influenced by the number and form of the covariates in the calibration model. For episodically consumed foods, we advise researchers to pay special attention to the response distribution, nonlinearity, and covariate inclusion when specifying the calibration model.
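A two-part calibration of an episodically consumed food is typically built from a probability part and an amount part; the calibrated intake is their product. The sketch below shows that general structure under assumed, purely illustrative variable names and synthetic data, not the authors' exact EPIC specification:

```python
# Minimal sketch: two-part regression calibration for an episodically consumed food.
# Assumed columns: r24h (24-hour recall, many zeroes), ffq (questionnaire intake),
# and a covariate (age); names and data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({"ffq": rng.gamma(2.0, 50.0, n), "age": rng.integers(35, 70, n)})
prob = 1 / (1 + np.exp(-(0.01 * df["ffq"] - 1)))             # synthetic data
amount = rng.gamma(2.0, 0.5 * df["ffq"] + 10)
df["r24h"] = np.where(rng.random(n) < prob, amount, 0.0)

# Part 1: probability of any consumption on the recall day (logistic model).
df["any"] = (df["r24h"] > 0).astype(int)
part1 = smf.logit("any ~ ffq + age", data=df).fit(disp=0)

# Part 2: amount consumed, modeled on consumption days only (log-linear here).
consumers = df[df["r24h"] > 0].copy()
consumers["log_amt"] = np.log(consumers["r24h"])
part2 = smf.ols("log_amt ~ ffq + age", data=consumers).fit()

# Calibrated (expected) intake = P(consume) * E[amount | consume].
p = part1.predict(df)
e_amt = np.exp(part2.predict(df) + 0.5 * part2.mse_resid)    # lognormal back-transform
df["calibrated_intake"] = p * e_amt
print(df["calibrated_intake"].head())
```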
Abstract:
INTRODUCTION: Weight gain after transplantation is relatively common; it tends to be multifactorial, can be influenced by glucocorticoids, immunosuppressive medication and delayed graft function, and can cause serious health complications. OBJECTIVES: To assess changes in weight, degree of obesity and body mass index, as well as the effect of immunosuppressive treatment, over the 5 years following kidney transplantation. METHODS: The sample comprised 119 kidney transplant recipients, 70 men and 49 women, who attended the post-transplant consultation for five years. For all patients, weight, height and body mass index, calculated as weight/height2 (kg/m2), were recorded before transplantation and each year afterwards (from the 1st to the 5th year) and related to the immunosuppressive treatment taken. RESULTS: Body mass index, weight and degree of obesity increase considerably in the first year after transplantation and more slowly over the following four years. The type of immunosuppressive treatment influences the weight and degree of obesity reached in this period. CONCLUSIONS: There is a high prevalence of overweight and obesity after transplantation, especially during the first year. In one year patients gain an average of 6.6 kg in weight and 2.5 kg/m2 in BMI. Treatment should minimize steroid doses and include dietary management and adequate physical exercise.
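As a rough illustrative check of the BMI formula quoted above (our own arithmetic, not a figure from the study): for a hypothetical patient of height 1.60 m, a gain of 6.6 kg corresponds to 6.6 / 1.60^2 ≈ 2.6 kg/m2, which is of the same order as the reported average BMI increase of 2.5 kg/m2.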
Abstract:
During the Early Toarcian, major paleoenvironmental and paleoceanographical changes occurred, leading to an oceanic anoxic event (OAE) and to a perturbation of the carbon isotope cycle. Although the standard biochronology of the Lower Jurassic is essentially based upon ammonites, in recent years biostratigraphy based on calcareous nannofossils and dinoflagellate cysts has increasingly been used to date Jurassic rocks. However, the precise dating and correlation of the Early Toarcian OAE, and of the associated delta C-13 anomaly in different settings of the western Tethys, are still partly problematic, and it is still unclear whether these events are synchronous or not. In order to allow more accurate correlations of the organic-rich levels recorded in the Lower Toarcian OAE, this account proposes a new biozonation based on a quantitative biochronology approach, the Unitary Associations (UA), applied to calcareous nannofossils. This study represents the first attempt to apply the UA method to Jurassic nannofossils. The study incorporates eighteen sections distributed across the western Tethys and ranging from the Pliensbachian to the Aalenian, comprising 1220 samples and 72 calcareous nannofossil taxa. The BioGraph [Savary, J., Guex, J., 1999. Discrete biochronological scales and unitary associations: description of the Biograph Computer program. Memoires de Geologie de Lausanne 34, 282 pp.] and UA-Graph (Copyright Hammer O., Guex and Savary, 2002) programs provide a discrete biochronological framework based upon multi-taxa concurrent range zones in the different sections. The optimized dataset generates nine UAs using the co-occurrences of 56 taxa. These UAs are grouped into six Unitary Association Zones (UA-Z), which constitute a robust biostratigraphic synthesis of all the observed or deduced biostratigraphic relationships between the analysed taxa. The UA zonation proposed here is compared to "classic" calcareous nannofossil biozonations, which are commonly used for the southern and northern sides of Tethys. The biostratigraphic resolution of the UA-Zones varies from one nannofossil subzone, or part of it, to several subzones, and can be related to the pattern of calcareous nannoplankton originations and extinctions during the studied time interval. The Late Pliensbachian - Early Toarcian interval (corresponding to UA-Z II) represents a major step in the Jurassic nannoplankton radiation. The recognized UA-Zones are also compared to the negative carbon isotopic excursion and the TOC maximum in five sections of central Italy, Germany and England, with the aim of providing a more reliable correlation tool for the Early Toarcian OAE, and the associated isotopic anomaly, between the southern and northern parts of the western Tethys. The results of this work show that the TOC maximum and the delta C-13 negative excursion correspond to the upper part of UA-Z II (i.e., UA 3) in the sections analysed. This suggests that the Early Toarcian OAE was a synchronous event within the western Tethys.
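The core idea behind Unitary Associations can be pictured as maximal sets of mutually co-occurring (or compatible) taxa. The toy sketch below builds a taxon co-occurrence graph from presence data and lists its maximal cliques; the sample data and the reduction to cliques are a deliberate simplification for illustration, not the BioGraph/UA-Graph implementation:

```python
# Toy sketch: taxon co-occurrence graph and its maximal cliques, a simplified
# analogue of Unitary Associations (stratigraphic ordering is ignored here).
import networkx as nx
from itertools import combinations

samples = {
    "s1": {"TaxonA", "TaxonB"},
    "s2": {"TaxonB", "TaxonC", "TaxonD"},
    "s3": {"TaxonC", "TaxonD"},
    "s4": {"TaxonA", "TaxonD"},
}

G = nx.Graph()
for taxa in samples.values():
    G.add_nodes_from(taxa)
    # Connect every pair of taxa observed together in the same sample.
    G.add_edges_from(combinations(sorted(taxa), 2))

# Maximal cliques = maximal sets of pairwise co-occurring taxa.
for clique in nx.find_cliques(G):
    print(sorted(clique))
```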