918 results for Node-Depth Encoding
Abstract:
The power loss reduction in distribution systems (DSs) is a nonlinear, multiobjective problem. Service restoration in DSs is even harder since it additionally requires a real-time solution. Both DS problems are computationally complex. For large-scale networks, the usual problem formulation has thousands of constraint equations. The node-depth encoding (NDE) enables a model of DS problems that eliminates several constraint equations from the usual formulation, making the solution simpler. On the other hand, a multiobjective evolutionary algorithm (EA) based on subpopulation tables adequately models several objectives and constraints, enabling a better exploration of the search space. The combination of the multiobjective EA with NDE (MEAN) results in the proposed approach for solving DS problems in large-scale networks. Simulation results show that the MEAN is able to find adequate restoration plans for a real DS with 3860 buses and 632 switches in a running time of 0.68 s. Moreover, the MEAN exhibits a sublinear running time as a function of system size. Tests with networks ranging from 632 to 5166 switches indicate that the MEAN can find network configurations corresponding to a power loss reduction of 27.64% for very large networks while requiring relatively low running time.
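The node-depth encoding mentioned above can be illustrated with a short sketch: each tree of the network forest is stored as a list of (node, depth) pairs in depth-first order, so every configuration produced this way is radial by construction. The toy graph, function names and traversal below are illustrative assumptions, not code from the paper.

```python
def encode_tree(adj, root):
    """DFS traversal producing the node-depth list for one tree.
    Any such list describes exactly one rooted tree, so radiality
    constraints hold implicitly instead of as explicit equations."""
    ndlist, stack, seen = [], [(root, 0)], set()
    while stack:
        node, depth = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        ndlist.append((node, depth))
        # push neighbours one level deeper in the traversal
        for nxt in reversed(adj.get(node, [])):
            if nxt not in seen:
                stack.append((nxt, depth + 1))
    return ndlist

# Toy radial feeder: substation 0 feeding buses 1..4
adj = {0: [1, 2], 1: [3], 2: [4], 3: [], 4: []}
print(encode_tree(adj, 0))  # [(0, 0), (1, 1), (3, 2), (2, 1), (4, 2)]
```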
Abstract:
Network reconfiguration for service restoration (SR) in distribution systems is a complex optimization problem. For large-scale distribution systems, it is computationally hard to find adequate SR plans in real time since the problem is combinatorial and nonlinear, involving several constraints and objectives. Two multi-objective evolutionary algorithms that use Node-Depth Encoding (NDE) have proved able to efficiently generate adequate SR plans for large distribution systems: (i) the hybridization of the Non-Dominated Sorting Genetic Algorithm-II (NSGA-II) with NDE, named NSGA-N; and (ii) a multi-objective evolutionary algorithm based on subpopulation tables that uses NDE, named MEAN. Two further challenges are faced now: designing SR plans for larger systems that are as good as those for relatively smaller ones, and plans for multiple faults that are as good as those for a single fault. To tackle both challenges, this paper proposes a method that combines NSGA-N, MEAN and a new heuristic. The heuristic focuses the application of NDE operators on network zones flagged as critical according to technical constraints. The method generates SR plans of similar quality in distribution systems of significantly different sizes (from 3860 to 30,880 buses). Moreover, the number of switching operations required to implement the SR plans generated by the proposed method increases only moderately with the number of faults.
Abstract:
In this paper, we consider a scenario where 3D scenes are modeled through a View+Depth representation. This representation is used at the rendering side to generate synthetic views for free viewpoint video. The encoding of both types of data (view and depth) is carried out using two H.264/AVC encoders. In this scenario, we address the reduction of the encoding complexity of the depth data. First, an analysis of the Mode Decision and Motion Estimation processes was conducted for both view and depth sequences in order to capture the correlation between them. Taking advantage of this correlation, we propose a fast mode decision and motion estimation algorithm for depth encoding. Results show that the proposed algorithm reduces the computational burden with a negligible loss in the quality of the rendered synthetic views. Quality measurements were conducted using the Video Quality Metric.
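The correlation-driven pruning idea described above can be sketched abstractly: the mode chosen for the collocated macroblock in the view sequence restricts the candidate modes the depth encoder must evaluate. The mode names, pruning table and cost function below are hypothetical illustrations of the idea, not the paper's algorithm nor actual H.264/AVC internals.

```python
MODES = ["SKIP", "INTER16x16", "INTER8x8", "INTRA"]

# Assumed pruning table (illustrative): the view's mode suggests which
# depth modes are worth evaluating, skipping the rest of the search.
PRUNED = {
    "SKIP":       ["SKIP", "INTER16x16"],
    "INTER16x16": ["SKIP", "INTER16x16", "INTER8x8"],
    "INTER8x8":   ["INTER16x16", "INTER8x8", "INTRA"],
    "INTRA":      MODES,
}

def fast_mode_decision(view_mode, rd_cost):
    """Evaluate only the pruned candidate list instead of all modes.
    rd_cost maps a mode name to its (toy) rate-distortion cost."""
    return min(PRUNED[view_mode], key=rd_cost)

# toy rate-distortion costs: pretend SKIP is cheapest for this block
cost = {"SKIP": 1.0, "INTER16x16": 2.0, "INTER8x8": 3.0, "INTRA": 4.0}
print(fast_mode_decision("SKIP", cost.get))  # SKIP
```

The saving comes from evaluating two candidates instead of four whenever the collocated view macroblock was coded as SKIP.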
Abstract:
Graduate Program in Electrical Engineering - FEIS
Abstract:
The design of a network is a solution to several engineering and science problems. Several network design problems are known to be NP-hard, and population-based metaheuristics like evolutionary algorithms (EAs) have been widely investigated for such problems. Such optimization methods simultaneously generate a large number of potential solutions to investigate the search space in breadth and, consequently, to avoid local optima. Obtaining a potential solution usually involves the construction and maintenance of several spanning trees or, more generally, spanning forests. To efficiently explore the search space, special data structures have been developed to provide operations that manipulate a set of spanning trees (a population). For a tree with n nodes, the most efficient data structures available in the literature require time O(n) to generate a new spanning tree that modifies an existing one and to store the new solution. We propose a new data structure, called the node-depth-degree representation (NDDR), and we demonstrate that, using this encoding, generating a new spanning forest requires average time O(√n). Experiments with an EA based on the NDDR applied to large-scale instances of the degree-constrained minimum spanning tree problem have shown that the implementation adds only small constants and lower-order terms to the theoretical bound.
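The kind of spanning-forest modification such encodings support can be sketched on plain node-depth lists: in depth-first order, a subtree is the contiguous block whose depths stay greater than its root's, so it can be cut out and grafted below a node of another tree by shifting depths. This is a simplified illustration of the prune-and-graft idea, not the NDDR itself (which additionally tracks node degrees to reach the O(√n) average bound); the names are ours.

```python
def subtree_span(nd, i):
    """Half-open index range [i, j) of the subtree rooted at position i
    of a node-depth list: the block where depths exceed nd[i]'s depth."""
    d = nd[i][1]
    j = i + 1
    while j < len(nd) and nd[j][1] > d:
        j += 1
    return i, j

def prune_and_graft(nd_src, i, nd_dst, k):
    """Move the subtree at position i of nd_src below position k of nd_dst,
    returning the two updated node-depth lists."""
    i, j = subtree_span(nd_src, i)
    block = nd_src[i:j]
    # re-root the block: its root becomes a child of nd_dst[k]
    shift = nd_dst[k][1] + 1 - block[0][1]
    shifted = [(n, d + shift) for n, d in block]
    return (nd_src[:i] + nd_src[j:],
            nd_dst[:k + 1] + shifted + nd_dst[k + 1:])

src = [(0, 0), (1, 1), (3, 2), (2, 1), (4, 2)]
dst = [(5, 0), (6, 1)]
print(prune_and_graft(src, 1, dst, 1))
# ([(0, 0), (2, 1), (4, 2)], [(5, 0), (6, 1), (1, 2), (3, 3)])
```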
Abstract:
In this paper, a scatter search, which is a metaheuristic-based algorithm, is proposed to solve the reconfiguration problem of radial distribution systems. In the codification process of this algorithm, a structure called the node-depth representation is used; through its operators, and from the electrical power system point of view, only radial topologies are generated. To show the effectiveness, usefulness, and efficiency of the proposed method, tests are conducted on a commonly used 135-bus test system and on a practical system with 7052 buses, part of São Paulo state's distribution network. The results confirm the efficiency of the proposed algorithm, which finds high-quality solutions satisfying all the physical and operational constraints of the problem.
Abstract:
The evolution and population dynamics of avian coronaviruses (AvCoVs) remain underexplored. In the present study, in-depth phylogenetic and Bayesian phylogeographic studies were conducted to investigate the evolutionary dynamics of AvCoVs detected in wild and synanthropic birds. A total of 500 samples, including tracheal and cloacal swabs collected from 312 wild birds belonging to 42 species, were analysed using molecular assays. A total of 65 samples (13%) from 22 bird species were positive for AvCoV. Molecular evolution analyses revealed that the sequences from samples collected in Brazil did not cluster with any of the AvCoV S1 gene sequences deposited in the GenBank database. Bayesian framework analysis estimated an AvCoV strain from Sweden (1999) as the most recent common ancestor of the AvCoVs detected in this study. Furthermore, the analysis inferred an expansion of the AvCoV population across different wild and synanthropic bird species, suggesting that birds may be potential new hosts responsible for spreading this virus.
Abstract:
Objectives: To evaluate risk factors for recurrence of carcinoma of the uterine cervix among women who had undergone radical hysterectomy without pelvic lymph node metastasis, taking into consideration not only the classical histopathological factors but also sociodemographic, clinical and treatment-related factors. Study design: This was an exploratory analysis on 233 women with carcinoma of the uterine cervix (stages IB and IIA) who were treated by means of radical hysterectomy and pelvic lymphadenectomy, with free surgical margins and without lymph node metastases on conventional histopathological examination. Women with histologically normal lymph nodes but with micrometastases in the immunohistochemical analysis (AE1/AE3) were excluded. Disease-free survival for sociodemographic, clinical and histopathological variables was calculated using the Kaplan-Meier method. The Cox proportional hazards model was used to identify the independent risk factors for recurrence. Results: Twenty-seven recurrences were recorded (11.6%), of which 18 were pelvic, four were distant, four were pelvic + distant and one was of unknown location. The five-year disease-free survival rate among the study population was 88.4%. The independent risk factors for recurrence in the multivariate analysis were: postmenopausal status (HR 14.1; 95% CI: 3.7-53.6; P < 0.001), absence of or slight inflammatory reaction (HR 7.9; 95% CI: 1.7-36.5; P = 0.008) and invasion of the deepest third of the cervix (HR 6.1; 95% CI: 1.3-29.1; P = 0.021). Postoperative radiotherapy was identified as a protective factor against recurrence (HR 0.02; 95% CI: 0.001-0.25; P = 0.003). Conclusion: Postmenopausal status is a possible independent risk factor for recurrence even when adjusted for classical prognostic factors (such as tumour size, depth of tumour invasion, capillary embolisation) and treatment-related factors (period of treatment and postoperative radiotherapy status).
Abstract:
Background: Depth of tumor invasion (T-category) and the number of metastatic lymph nodes (N-category) are the most important prognostic factors in patients with gastric cancer. Recently, the ratio between metastatic and dissected lymph nodes (N-ratio) has been established as a prognostic factor as well. The aim of this study is to evaluate the impact of the N-ratio and its interaction with the N-category as a prognostic factor in gastric cancer. Methods: This was a retrospective study in which we reviewed clinical and pathological data of 165 patients who had undergone curative surgery at our institution over a 9-year period. The exclusion criteria included metastases, gastric stump tumors and gastrectomy with fewer than 15 lymph nodes dissected. Results: The median age of the patients was 63 years and most of them were male. Total gastrectomy was the most common procedure and 92.1% of the patients had a D2-lymphadenectomy. Their 5-year overall survival was 57.7%. T-category, N-category, extended gastrectomy, and N-ratio were prognostic factors for overall and disease-free survival according to univariate analysis. Within TNM staging, N1 patients with NR1 had a 5-year survival rate of 75.5%, whereas in the NR2 group only 33% of the cases survived 5 years. In the multivariate analysis, the interaction between N-category and N-ratio was an independent prognostic factor. Conclusion: Our findings confirm the role of the N-ratio as a prognostic factor for survival in patients with gastric cancer treated surgically with at least 15 lymph nodes dissected. The interaction between N-category and N-ratio is a better predictor than lymph node metastasis staging alone.
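The N-ratio itself is simple arithmetic: metastatic lymph nodes divided by total dissected lymph nodes. A minimal sketch, with the study's minimum of 15 dissected nodes as a guard (the function name is ours; the NR1/NR2 cut-offs are not given in the abstract, so none are assumed here):

```python
def n_ratio(metastatic, dissected):
    """N-ratio: metastatic lymph nodes over total dissected lymph nodes.
    The study excluded gastrectomies with fewer than 15 dissected nodes."""
    if dissected < 15:
        raise ValueError("fewer than 15 dissected nodes: excluded by the study")
    if not 0 <= metastatic <= dissected:
        raise ValueError("metastatic count must lie between 0 and dissected")
    return metastatic / dissected

print(n_ratio(4, 20))  # 0.2
```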
Abstract:
Charcot-Marie-Tooth disease type 4C (CMT4C) is an early-onset, autosomal recessive form of demyelinating neuropathy. The clinical manifestations include progressive scoliosis, delayed age of walking, muscular atrophy, distal weakness, and reduced nerve conduction velocity. The gene mutated in CMT4C disease, SH3TC2/KIAA1985, was recently identified; however, the function of the protein it encodes remains unknown. We have generated knockout mice where the first exon of the Sh3tc2 gene is replaced with an enhanced GFP cassette. The Sh3tc2(ΔEx1/ΔEx1) knockout animals develop progressive peripheral neuropathy manifested by decreased motor and sensory nerve conduction velocity and hypomyelination. We show that Sh3tc2 is specifically expressed in Schwann cells and localizes to the plasma membrane and to the perinuclear endocytic recycling compartment, concordant with its possible function in myelination and/or in regions of axoglial interactions. Concomitantly, transcriptional profiling performed on the endoneurial compartment of peripheral nerves isolated from control and Sh3tc2(ΔEx1/ΔEx1) animals uncovered changes in transcripts encoding genes involved in myelination and cell adhesion. Finally, detailed analyses of the structures composed of compact and noncompact myelin in the peripheral nerve of Sh3tc2(ΔEx1/ΔEx1) animals revealed abnormal organization of the node of Ranvier, a phenotype that we confirmed in CMT4C patient nerve biopsies. The generated Sh3tc2 knockout mice thus present a reliable model of CMT4C neuropathy that was instrumental in establishing a role for Sh3tc2 in myelination and in the integrity of the node of Ranvier, a morphological phenotype that can be used as an additional CMT4C diagnostic marker.
Abstract:
The atrioventricular (AV) node is permanently damaged in approximately 3% of congenital heart surgery operations, requiring implantation of a permanent pacemaker. Improvements in pacemaker design and in alternative treatment modalities require an effective in vivo model of complete heart block (CHB) before testing can be performed in humans. Such a model should enable accurate, reliable, and detectable induction of the surgical pathology. Through our laboratory’s efforts in developing a tissue engineering therapy for CHB, we describe here an improved in vivo model for inducing chronic AV block. The method employs a right thoracotomy in the adult rabbit, from which the right atrial appendage may be retracted to expose an access channel for the AV node. A novel injection device was designed, which both physically restricts needle depth and provides electrical information via electrocardiogram interface. This combination of features provides real-time guidance to the researcher for confirming contact with the AV node, and documents its ablation upon formalin injection. While all animals tested could be induced to acute AV block, those with ECG guidance were more likely to maintain chronic heart block >12 h. Our model enables the researcher to reproduce both CHB and the associated peripheral fibrosis that would be present in an open congenital heart surgery, and which would inevitably impact the design and utility of a tissue engineered AV node replacement.
Abstract:
Complex networks have recently attracted a significant amount of research attention due to their ability to model real-world phenomena. One important problem often encountered is limiting diffusive processes spreading over the network, for example mitigating pandemic disease or the spread of computer viruses. A number of problem formulations have been proposed that aim to solve such problems based on desired network characteristics, such as maintaining the largest network component after node removal. The recently formulated critical node detection problem aims to remove a small subset of vertices from the network such that the residual network has minimum pairwise connectivity. Unfortunately, the problem is NP-hard, and the number of constraints is cubic in the number of vertices, making very large scale instances impossible to solve with traditional mathematical programming techniques. Even many approximation strategies, such as dynamic programming and evolutionary algorithms, are unusable for networks that contain thousands to millions of vertices. A computationally efficient and simple approach is required in such circumstances, but none currently exists. In this thesis, such an algorithm is proposed. The methodology is based on a depth-first search traversal of the network and a specially designed ranking function that considers information local to each vertex. Due to the variety of network structures, a number of characteristics must be taken into consideration and combined into a single rank that measures the utility of removing each vertex. Since removing a vertex in sequential fashion impacts the network structure, an efficient post-processing algorithm is also proposed to quickly re-rank vertices. Experiments on a range of common complex network models with varying numbers of vertices are considered, in addition to real-world networks.
The proposed algorithm, DFSH, is shown to be highly competitive and often outperforms existing strategies such as Google PageRank for minimizing pairwise connectivity.
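The objective these methods minimize, pairwise connectivity of the residual network, can be computed directly: sum |C|·(|C|−1)/2 over the connected components left after removing a vertex set. A minimal sketch using an iterative DFS (the function names and toy graph are ours, not from the thesis):

```python
def pairwise_connectivity(adj, removed=frozenset()):
    """Sum of |C| * (|C| - 1) / 2 over connected components of the
    graph after deleting the vertices in `removed`."""
    seen, total = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        # iterative depth-first search collecting one component's size
        stack, comp = [start], 0
        seen.add(start)
        while stack:
            node = stack.pop()
            comp += 1
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        total += comp * (comp - 1) // 2
    return total

# Path graph 0-1-2-3-4: removing the middle vertex splits it in two
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(pairwise_connectivity(path))       # 10  (all C(5, 2) pairs connected)
print(pairwise_connectivity(path, {2}))  # 2   (two components of size 2)
```

A critical-node heuristic such as DFSH would call this (or an incremental variant) to score candidate removals; here vertex 2 drops pairwise connectivity from 10 to 2.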
Abstract:
The Bloom filter is a space-efficient randomized data structure for representing a set and supporting membership queries. Bloom filters intrinsically allow false positives; however, the space savings they offer outweigh this disadvantage if the false positive rate is kept sufficiently low. Inspired by the recent application of the Bloom filter in a novel multicast forwarding fabric, this paper proposes a variant of the Bloom filter, the optihash. The optihash introduces an optimization of the false positive rate at the stage of Bloom filter formation, using the same amount of space at the cost of slightly more processing than the classic Bloom filter. Bloom filters are often used in situations where a fixed amount of space is a primary constraint. We present the optihash as a good alternative to Bloom filters, since the amount of space is the same and the improvement in false positives can justify the additional processing. Specifically, we show via simulations and numerical analysis that with the optihash false positive occurrences can be reduced and controlled at the cost of a small amount of additional processing. The simulations are carried out for in-packet forwarding. In this framework, the Bloom filter is used as a compact link/route identifier and is placed in the packet header to encode the route. At each node, the Bloom filter is queried for membership in order to make forwarding decisions. A false positive in the forwarding decision translates into packets forwarded along an unintended outgoing link. By using the optihash, false positives can be reduced. The optimization processing is carried out in an entity termed the Topology Manager, which is part of the control plane of the multicast forwarding fabric. This processing is carried out only on a per-session basis, not for every packet.
The aim of this paper is to present the optihash and evaluate its false positive performance via simulations, in order to measure the influence of different parameters on the false positive rate. The false positive rate of the optihash is then compared with the false positive probability of the classic Bloom filter.
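For reference, the classic Bloom filter that the optihash builds on can be sketched in a few lines: k hash positions per item over m bits, with membership queries that may return false positives but never false negatives. The hashing scheme and parameters below are illustrative choices, not the paper's construction.

```python
import hashlib

class BloomFilter:
    """Classic Bloom filter: k hash positions per item over m bits."""
    def __init__(self, m=128, k=3):
        self.m, self.k, self.bits = m, k, 0

    def _positions(self, item):
        # derive k indices by salting a cryptographic hash (illustrative)
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item):
        # all k bits set -> "probably in the set"; any bit clear -> definitely not
        return all(self.bits >> p & 1 for p in self._positions(item))

# In-packet forwarding flavour: the filter encodes a set of link identifiers
bf = BloomFilter()
for link in ["link-A", "link-B"]:
    bf.add(link)
print("link-A" in bf)  # True: members always match
```

A false positive here would be a link identifier that was never added but whose k bit positions happen to all be set, which is exactly the event the optihash's formation-time optimization aims to make rarer.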
Abstract:
Objectives: To evaluate risk factors for recurrence of carcinoma of the uterine cervix among women who had undergone radical hysterectomy without pelvic lymph node metastasis, while taking into consideration not only the classical histopathological factors but also sociodemographic, clinical and treatment-related factors. Study design: This was an exploratory analysis on 233 women with carcinoma of the uterine cervix (stages IB and IIA) who were treated by means of radical hysterectomy and pelvic lymphadenectomy, with free surgical margins and without lymph node metastases on conventional histopathological examination. Women with histologically normal lymph nodes but with micrometastases in the immunohistochemical analysis (AE1/AE3) were excluded. Disease-free survival for sociodemographic, clinical and histopathological variables was calculated using the Kaplan-Meier method. The Cox proportional hazards model was used to identify the independent risk factors for recurrence. Results: Twenty-seven recurrences were recorded (11.6%), of which 18 were pelvic, four were distant, four were pelvic + distant and one was of unknown location. The five-year disease-free survival rate among the study population was 88.4%. The independent risk factors for recurrence in the multivariate analysis were: postmenopausal status (HR 14.1; 95% CI: 3.7-53.6; P < 0.001), absence of or slight inflammatory reaction (HR 7.9; 95% CI: 1.7-36.5; P = 0.008) and invasion of the deepest third of the cervix (HR 6.1; 95% CI: 1.3-29.1; P = 0.021). Postoperative radiotherapy was identified as a protective factor against recurrence (HR 0.02; 95% CI: 0.001-0.25; P = 0.003). Conclusion: Postmenopausal status is a possible independent risk factor for recurrence even when adjusted for classical prognostic factors (such as tumour size, depth of tumour invasion, capillary embolisation) and treatment-related factors (period of treatment and postoperative radiotherapy status).
Abstract:
We seek to determine the relationship between threshold and suprathreshold perception for position offset and stereoscopic depth perception under conditions that elevate their respective thresholds. Two threshold-elevating conditions were used: (1) increasing the interline gap and (2) dioptric blur. Although increasing the interline gap increases position (Vernier) offset and stereoscopic disparity thresholds substantially, the perception of suprathreshold position offset and stereoscopic depth remains unchanged. Perception of suprathreshold position offset also remains unchanged when the Vernier threshold is elevated by dioptric blur. We show that such normalization of suprathreshold position offset can be attributed to the topographical-map-based encoding of position. On the other hand, dioptric blur increases the stereoscopic disparity thresholds and reduces the perceived suprathreshold stereoscopic depth, which can be accounted for by a disparity-computation model in which the activities of absolute disparity encoders are multiplied by a Gaussian weighting function that is centered on the horopter. Overall, the statement "equal suprathreshold perception occurs in threshold-elevated and unelevated conditions when the stimuli are equally above their corresponding thresholds" describes the results better than the statement "suprathreshold stimuli are perceived as equal when they are equal multiples of their respective threshold values."
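The disparity-computation model described in the last sentences can be sketched numerically: an absolute-disparity encoder's response is multiplied by a Gaussian weight centered on the horopter (zero disparity), so large disparities are attenuated. The functional form and the sigma value below are illustrative assumptions for exposition, not fitted parameters from the study.

```python
import math

def weighted_disparity(disparity, sigma):
    """Illustrative model response: absolute disparity multiplied by a
    Gaussian weighting function centered on the horopter (disparity 0).
    sigma is a free parameter of this sketch, not a value from the study."""
    weight = math.exp(-disparity ** 2 / (2 * sigma ** 2))
    return disparity * weight

# Near the horopter the weight is ~1; far from it the response is attenuated.
print(round(weighted_disparity(0.1, 1.0), 4))  # 0.0995
print(round(weighted_disparity(2.0, 1.0), 4))  # 0.2707
```

Under such a model, blur that broadens or shallows the weighting would shrink perceived suprathreshold depth even when the stimulus is equally far above its (elevated) threshold, matching the asymmetry the abstract reports between position offset and stereoscopic depth.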