985 results for self-deployment algorithms


Relevance: 30.00%

Abstract:

A wireless sensor network (WSN) is a technology that can be used to monitor and actuate on environments in a non-intrusive way. The main difference between WSNs and traditional sensor networks is the low dependability of WSN nodes: WSN solutions are based on a huge number of cheap, tiny nodes that can present faults in hardware, software, and wireless communication. Deploying hundreds of nodes can overcome the low dependability of individual nodes, but this strategy introduces many challenges regarding network management, real-time requirements, and self-optimization. In this paper we present a simulated annealing approach that self-optimizes large-scale WSNs. Simulation results indicate that our approach can achieve self-optimization in a dynamic WSN. © 2012 IEEE.
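
The abstract does not detail the annealing scheme, so the following is only a minimal, self-contained sketch of a simulated annealing loop applied to a hypothetical WSN tuning task; the cost function, the transmit-power encoding, and every parameter value are illustrative assumptions, not the paper's method.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=1000):
    """Generic simulated annealing: worse solutions are accepted with
    probability exp(-delta/T), letting the search escape local optima."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x)
        fy = cost(y)
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Hypothetical self-optimization task: pick per-node transmit powers that
# trade total energy use against a coverage penalty for weak links.
def wsn_cost(powers):
    energy = sum(powers)
    coverage_penalty = sum(max(0.0, 0.5 - p) ** 2 for p in powers)
    return energy + 100.0 * coverage_penalty

def perturb(powers):
    out = list(powers)
    i = random.randrange(len(out))
    out[i] = min(1.0, max(0.0, out[i] + random.uniform(-0.1, 0.1)))
    return out

best, val = simulated_annealing(wsn_cost, perturb, [1.0] * 20)
```

In a dynamic WSN, such a loop would simply be re-run (or kept running at low temperature) as nodes fail and link qualities change.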

Relevance: 30.00%

Abstract:

Consider a one-dimensional environment with N randomly distributed sites. An agent explores this random medium, moving deterministically with a spatial memory μ. A crossover from local to global exploration occurs in one dimension at a well-defined memory value μ1 = log2 N. In the stochastic version, the dynamics is ruled by the memory and by a temperature T, which affects the hopping displacement. This dynamics also shows a crossover in one dimension, obtained computationally, between exploration schemes, further characterized by the trajectory size Np (an aging effect). In this paper we provide an analytical approach to a modified stochastic version in which the parameter T plays the role of a maximum hopping distance. This modification allows us to obtain a general analytical expression for the crossover as a function of the parameters μ, T, and Np. In contrast to what has been proposed by previous studies, we find that the crossover occurs in any dimension d. These results have been validated by numerical experiments and may be of great value for fixing optimal parameters in search algorithms. © 2013 American Physical Society.
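
A sketch of the exploration dynamics, assuming the deterministic "tourist walk" rule (hop to the nearest site not visited within the last μ steps) with T reinterpreted as a maximum hopping distance, as in the modified model; everything beyond that reading is our assumption.

```python
import random

def explore(sites, mu, T, n_steps):
    """Walker on randomly placed 1D sites: at each step it hops to the
    nearest site not among the last mu visited, ignoring candidates
    farther away than the maximum hopping distance T."""
    sites = sorted(sites)
    pos = 0                      # index of the current site
    recent = [pos]               # memory window
    trajectory = [sites[pos]]
    for _ in range(n_steps):
        cur = sites[pos]
        candidates = [i for i, s in enumerate(sites)
                      if i not in recent and abs(s - cur) <= T]
        if not candidates:
            break                # walker is trapped
        pos = min(candidates, key=lambda i: abs(sites[i] - cur))
        recent.append(pos)
        recent = recent[-mu:]    # keep only the last mu visited sites
        trajectory.append(sites[pos])
    return trajectory

N = 512
sites = [random.random() for _ in range(N)]
path = explore(sites, mu=9, T=0.05, n_steps=200)
```

Varying μ, T, and the trajectory size Np in such a simulation is what reveals the crossover between local and global exploration schemes.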

Relevance: 30.00%

Abstract:

Given the competitiveness seen in today's market, developing new strategies to retain existing customers and win new ones is crucial to the success of a business. To ensure the success of a particular service or product, the key is to continually meet the wishes and demands of the customers, who are central to the business, through innovation, variety, and quality assurance. To achieve this goal, managers should be aware of every process that exists in the company, as they are primarily responsible for, and interested in, service quality, customer satisfaction and, consequently, favorable financial results. One tool used to ensure good business results is Quality Function Deployment (QFD), which seeks to hear and interpret customer requirements and turn them into essential features of a project.
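
To make the QFD mechanics concrete, here is a minimal sketch of the core house-of-quality computation; the customer requirements, weights, technical characteristics, and relationship scores are invented for illustration.

```python
# Customer requirements with importance weights (1-5), and a relationship
# matrix scoring how strongly each technical characteristic supports
# each requirement on the conventional 0/1/3/9 scale.
customer_weights = {"fast delivery": 5, "low price": 4, "durability": 3}

relationships = {
    "fast delivery": {"logistics automation": 9, "cheaper materials": 0, "QA testing": 1},
    "low price":     {"logistics automation": 3, "cheaper materials": 9, "QA testing": 0},
    "durability":    {"logistics automation": 0, "cheaper materials": 1, "QA testing": 9},
}

# Priority of a technical characteristic = sum over requirements of
# (customer weight x relationship strength); the team focuses on the top ones.
priorities = {}
for req, weight in customer_weights.items():
    for tech, strength in relationships[req].items():
        priorities[tech] = priorities.get(tech, 0) + weight * strength

for tech, score in sorted(priorities.items(), key=lambda kv: -kv[1]):
    print(tech, score)
```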

Relevance: 30.00%

Abstract:

A self-learning simulated annealing algorithm is developed by combining the characteristics of the simulated annealing and domain elimination methods. The algorithm is validated on a standard mathematical test function and by optimizing the end region of a practical power transformer. The numerical results show that the CPU time required by the proposed method is about one third of that required by the conventional simulated annealing algorithm.
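
The abstract does not say how the two methods are interleaved, so the following is only a plausible sketch under our own assumptions: short annealing passes alternate with elimination of the outer parts of the search domain around the incumbent, so later passes search a much smaller region.

```python
import math
import random

def sa_with_domain_elimination(cost, lo, hi, rounds=5, steps=200):
    """Hypothetical combination: after each short annealing pass, shrink
    the search interval around the best point found so far (domain
    elimination), which cuts the work of subsequent passes."""
    best = (lo + hi) / 2
    fbest = cost(best)
    for _ in range(rounds):
        t, x, fx = 1.0, best, fbest
        for _ in range(steps):
            y = min(hi, max(lo, x + random.uniform(-1, 1) * (hi - lo) * 0.1))
            fy = cost(y)
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
            t *= 0.97
        # eliminate everything outside a quarter-width neighborhood
        # of the incumbent
        width = (hi - lo) / 4
        lo, hi = max(lo, best - width), min(hi, best + width)
    return best, fbest

# Standard multimodal test function with its minimum at x = 0.
best, val = sa_with_domain_elimination(
    lambda x: x * x + 10 * (1 - math.cos(x)), -10, 10)
```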

Relevance: 30.00%

Abstract:

Background: Warfarin-dosing pharmacogenetic algorithms have shown different performances across ethnicities, and their impact in admixed populations is not fully known. Aims: To evaluate the CYP2C9 and VKORC1 polymorphisms and warfarin-predicted metabolic phenotypes according to both self-declared ethnicity and genetic ancestry in a Brazilian general population plus Amerindian groups. Methods: Two hundred twenty-two Amerindians (Tupinikin and Guarani) were enrolled, along with 1038 individuals from the Brazilian general population who self-declared as White, Intermediate (Brown; Pardo in Portuguese), or Black. Samples from 274 Brazilian subjects from Sao Paulo were analyzed for genetic ancestry using an Affymetrix 6.0® genotyping platform. The CYP2C9*2 (rs1799853), CYP2C9*3 (rs1057910), and VKORC1 g.-1639G>A (rs9923231) polymorphisms were genotyped in all studied individuals. Results: The allelic frequency of the VKORC1 polymorphism was distributed differently according to self-declared ethnicity: White (50.5%), Intermediate (46.0%), Black (39.3%), Tupinikin (40.1%), and Guarani (37.3%) (p < 0.001). The frequency of intermediate plus poor metabolizers (IM + PM) was higher in White (28.3%) than in Intermediate (22.7%), Black (20.5%), Tupinikin (12.9%), and Guarani (5.3%) (p < 0.001). For the samples with determined ancestry, subjects carrying the GG genotype for VKORC1 had higher African ancestry and lower European ancestry (0.14 ± 0.02 and 0.62 ± 0.02) than subjects carrying AA (0.05 ± 0.01 and 0.73 ± 0.03) (p = 0.009 and 0.03, respectively). Subjects classified as IM + PM had lower African ancestry (0.08 ± 0.01) than extensive metabolizers (0.12 ± 0.01) (p = 0.02). Conclusions: The CYP2C9 and VKORC1 polymorphisms are distributed differently according to self-declared ethnicity or genetic ancestry in the Brazilian general population plus Amerindians. This information is an initial step toward clinical pharmacogenetic implementation, and it could be very useful in strategic planning aiming at an individualized therapeutic approach and adverse drug effect profile prediction in an admixed population.

Relevance: 30.00%

Abstract:

The peer-to-peer (P2P) network paradigm is drawing the attention of both end users and researchers for its features. P2P networks shift from the classic client-server approach to a high level of decentralization where there is no central control and all nodes should be able not only to request services but to provide them to other peers as well. While on one hand such a high level of decentralization can lead to desirable properties like scalability and fault tolerance, on the other hand it introduces many new problems. A key feature of many P2P systems is openness, meaning that everybody is potentially able to join the network with no need for subscription or payment. The combination of openness and lack of central control makes it feasible for a user to free-ride, that is, to increase their own benefit by using services without allocating resources to satisfy other peers' requests. One of the main goals when designing a P2P system is therefore to achieve cooperation between users. Given that P2P systems are based on simple local interactions among many peers, each with partial knowledge of the whole system, an interesting way to achieve desired properties at system scale is to obtain them as emergent properties of the many interactions occurring at the local node level. Two methods are typically used to address cooperation in P2P networks: 1) engineering emergent properties when designing the protocol; 2) studying the system as a game and applying game-theoretic techniques, especially to find Nash equilibria and to reach them, making the system stable against deviant behaviors. In this work we present an evolutionary framework to enforce cooperative behaviour in P2P networks that is an alternative to both of the methods mentioned above. Our approach is based on an evolutionary algorithm inspired by computational sociology and evolutionary game theory, in which each peer periodically tries to copy another peer that is performing better. The proposed algorithms, called SLAC and SLACER, draw inspiration from tag systems originating in computational sociology; the main idea is to have low-performance nodes copy high-performance ones. The algorithm is run locally by every node and leads to an evolution of the network from both the topology and the node-strategy points of view. Initial tests with a simple Prisoner's Dilemma application show how SLAC is able to bring the network to a state of high cooperation independently of the initial network conditions. Interesting results are obtained when studying the effect of cheating nodes on the SLAC algorithm: in some cases selfish nodes rationally exploiting the system for their own benefit can actually improve system performance from the cooperation-formation point of view. The final step is to apply our results to more realistic scenarios. We focused our efforts on studying and improving the BitTorrent protocol. BitTorrent was chosen not only for its popularity but also because it has many points in common with the SLAC and SLACER algorithms, ranging from its game-theoretic inspiration (a tit-for-tat-like mechanism) to its swarm topology.
We found fairness, understood as the ratio between uploaded and downloaded data, to be a weakness of the original BitTorrent protocol, and we drew on the knowledge of cooperation formation and maintenance mechanisms derived from the development and analysis of SLAC and SLACER to improve fairness and tackle free-riding and cheating in BitTorrent. We produced an extension of BitTorrent, called BitFair, which has been evaluated through simulation and has shown the ability to enforce fairness and counter free-riding and cheating nodes.
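
A minimal sketch of a SLAC-style round as described above: each node compares its utility with that of a random peer and, when the peer does better, copies its strategy and moves into its neighborhood, with occasional mutations. The payoff matrix, schedule, and parameter values are illustrative assumptions, not the exact published algorithm.

```python
import random

class Node:
    def __init__(self):
        self.strategy = random.choice(["C", "D"])  # cooperate / defect
        self.links = set()
        self.utility = 0.0

PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def play(nodes):
    # Each node plays a one-shot Prisoner's Dilemma with its neighbors;
    # utility is the average payoff.
    for n in nodes:
        n.utility = (sum(PAYOFF[(n.strategy, m.strategy)] for m in n.links)
                     / len(n.links)) if n.links else 0.0

def slac_round(nodes, mutate=0.01):
    # Copy better-performing peers (strategy and neighborhood), then
    # apply small mutations to keep the population exploring.
    for n in nodes:
        other = random.choice(nodes)
        if other is not n and other.utility > n.utility:
            n.strategy = other.strategy
            n.links = (set(other.links) | {other}) - {n}
        if random.random() < mutate:
            n.strategy = random.choice(["C", "D"])
        if random.random() < mutate:
            n.links = {random.choice([m for m in nodes if m is not n])}

nodes = [Node() for _ in range(100)]
for n in nodes:
    n.links = {random.choice(nodes)} - {n}
for _ in range(500):
    play(nodes)
    slac_round(nodes)
print(sum(n.strategy == "C" for n in nodes), "cooperators out of", len(nodes))
```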

Relevance: 30.00%

Abstract:

One of the most interesting challenges of the coming years will be the automation of airspace systems. This process will involve different aspects such as air traffic management, aircraft and airport operations, and guidance and navigation systems. The use of UAS (Uninhabited Aerial Systems) for civil missions will be one of the most important steps in this automation process. In civil airspace, Air Traffic Controllers (ATC) manage the air traffic, ensuring that a minimum separation between controlled aircraft is always maintained. For this purpose ATCs use several operative avoidance techniques, such as holding patterns or rerouting. The use of UAS in this context will require the definition of strategies for a common management of piloted and unpiloted air traffic that allow the UAS to self-separate. As a first employment in civil airspace we consider a UAS surveillance mission that consists in departing from a ground base, taking pictures over a set of mission targets, and returning to the same ground base. Throughout the mission a set of piloted aircraft fly in the same airspace, so the UAS has to self-separate using the ATC avoidance techniques mentioned above. We consider two objectives: the first is to minimize the impact of the air traffic on the mission; the second is to minimize the impact of the mission on the air traffic. A particular version of the well-known Travelling Salesman Problem (TSP), called the Time-Dependent TSP (TDTSP), has been studied to deal with traffic problems in large urban areas. Its basic idea is that the cost of the route between two clients depends on the period of the day in which it is traversed. Our thesis argues that the same idea can be applied to air traffic as well, using a suitable time horizon compatible with aircraft operations. The cost of a UAS sub-route will depend on the air traffic that it will meet when starting that route at a specific moment, and consequently on the avoidance maneuver that it will use to resolve the conflict. Conflict avoidance is a topic that has been widely studied in past years using different approaches. In this thesis we propose a new approach, based on the use of ATC operative techniques, that makes it possible both to model the UAS problem in a TDTSP framework and to adopt an air traffic management perspective. Starting from this kind of mission, the problem of UAS insertion in civil airspace is formalized as the UAS Routing Problem (URP). To this end we introduce a new structure, called the Conflict Graph, that makes it possible to model the avoidance maneuvers and to define the arc cost as a function of the departure time. Two integer linear programming formulations of the problem are proposed. The first is based on a TDTSP formulation which, unfortunately, is weaker than the TSP formulation. A new formulation based on a TSP variation that uses specific penalties to model the holdings is therefore proposed. Different algorithms are presented: exact algorithms, simple heuristics used as upper bounds on the number of time steps used, and metaheuristics such as a Genetic Algorithm and Simulated Annealing. Finally, an air traffic scenario has been simulated using real air traffic data in order to test our algorithms. Graphical tools have been used to represent the Milano Linate airspace and its air traffic during different days. These data have been provided by ENAV S.p.A. (the Italian Agency for Air Navigation Services).
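
To illustrate the time-dependent arc cost at the heart of the TDTSP view of the problem, here is a small sketch in which the cost of a leg depends on the departure time step (a conflict with traffic forces a holding-pattern penalty), plus a greedy nearest-neighbor heuristic of the kind usable as an upper bound. The data, penalty model, and function names are invented for illustration.

```python
HOLDING_PENALTY = 10  # extra time steps spent in a holding pattern

def leg_cost(base, i, j, t, conflicts):
    """base[i][j] is the unobstructed flight time of leg (i, j);
    conflicts is a set of (i, j, t) triples for which the leg is blocked
    at departure time t and an avoidance maneuver must be flown."""
    return base[i][j] + (HOLDING_PENALTY if (i, j, t) in conflicts else 0)

def nearest_neighbor_tdtsp(base, conflicts, start=0):
    """Greedy heuristic: always fly next to the target that is cheapest
    to reach given the current departure time. Only an upper bound,
    not an exact solution."""
    n = len(base)
    tour, t, total = [start], 0, 0
    unvisited = set(range(n)) - {start}
    while unvisited:
        cur = tour[-1]
        nxt = min(unvisited, key=lambda j: leg_cost(base, cur, j, t, conflicts))
        c = leg_cost(base, cur, nxt, t, conflicts)
        total += c
        t += c  # when the next leg starts depends on the route so far
        tour.append(nxt)
        unvisited.discard(nxt)
    return tour, total

base = [[0, 4, 7], [4, 0, 3], [7, 3, 0]]
conflicts = {(0, 1, 0)}           # leg 0 -> 1 is blocked at t = 0
print(nearest_neighbor_tdtsp(base, conflicts))  # ([0, 2, 1], 10)
```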

Relevance: 30.00%

Abstract:

Finite element techniques for solving the problem of fluid-structure interaction of an elastic solid in a laminar incompressible viscous flow are described. The mathematical problem consists of the Navier-Stokes equations in the Arbitrary Lagrangian-Eulerian (ALE) formulation coupled with a non-linear structure model, treating the problem as one continuum. The coupling between the structure and the fluid is enforced inside a monolithic framework which solves simultaneously for the fluid and structure unknowns within a single solver. We use the well-known Crouzeix-Raviart finite element pair for discretization in space and the method of lines for discretization in time. A stability result using the backward Euler time-stepping scheme for both the fluid and solid parts and the finite element method for the space discretization has been proved. The resulting linear system is solved by multilevel domain decomposition techniques. Our strategy is to solve several local subproblems over subdomain patches using the Schur-complement or GMRES smoother within a multigrid iterative solver. For validation and evaluation of the accuracy of the proposed methodology, we present corresponding results for a set of two FSI benchmark configurations which describe the self-induced elastic deformation of a beam attached to a cylinder in a laminar channel flow, allowing stationary as well as periodically oscillating deformations, and for a benchmark proposed by COMSOL Multiphysics in which a narrow vertical structure attached to the bottom wall of a channel bends under the force due to both viscous drag and pressure. Then, as an example of fluid-structure interaction in biomedical problems, we consider an academic numerical test which consists in simulating pressure wave propagation through a straight compliant vessel. All the tests show the applicability and the numerical efficiency of our approach for both two-dimensional and three-dimensional problems.
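
For reference, a standard statement of the incompressible Navier-Stokes equations in ALE form (a textbook formulation, not quoted from the paper), where w is the mesh velocity and the time derivative is taken in the moving reference frame:

```latex
\rho_f \left( \left.\frac{\partial \mathbf{u}}{\partial t}\right|_{\chi}
  + \big((\mathbf{u}-\mathbf{w})\cdot\nabla\big)\mathbf{u} \right)
  = \nabla\cdot\boldsymbol{\sigma}_f , \qquad
\nabla\cdot\mathbf{u} = 0 , \qquad
\boldsymbol{\sigma}_f = -p\,\mathbf{I}
  + \mu_f \left( \nabla\mathbf{u} + \nabla\mathbf{u}^{\mathsf T} \right) .
```

Continuity of velocity and traction is imposed on the fluid-solid interface, and the monolithic approach assembles these equations together with the structure model into a single system solved for all unknowns at once.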

Relevance: 30.00%

Abstract:

This thesis aimed at addressing some of the issues that, at the state of the art, prevent P300-based brain-computer interface (BCI) systems from moving from research laboratories to end users' homes. An innovative asynchronous classifier has been defined and validated. It relies on the introduction of a set of thresholds into the classifier; these thresholds have been assessed from the distributions of score values relating to target stimuli, non-target stimuli, and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, in order to make long-term use of BCI possible it is important to track changes in ongoing EEG activity and to adapt the BCI model parameters accordingly. To this aim, the asynchronous classifier has been subsequently improved by introducing a self-calibration algorithm for the continuous and unsupervised recalibration of the subjective control parameters. Finally, an index for the online monitoring of EEG quality has been defined and validated in order to detect potential problems and system failures. The thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Following the user-centered design approach, the phases relating to the design, development, and validation of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e., the degree of motor impairment). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, head tracker), up to a P300-based BCI.
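
A minimal sketch of the threshold idea (not the thesis' actual classifier): a selection is emitted only when the winning score clears a subject-specific threshold; otherwise the epoch is treated as voluntary no-control and output is suspended. The margin test and all values are illustrative assumptions.

```python
def asynchronous_decision(scores, threshold, margin=0.0):
    """scores maps each stimulus/item to a classifier score (e.g. scores
    accumulated over stimulation repetitions). Returns the selected item,
    or None when the evidence does not support a confident selection."""
    ranked = sorted(scores.items(), key=lambda kv: -kv[1])
    best_item, best_score = ranked[0]
    runner_up = ranked[1][1] if len(ranked) > 1 else float("-inf")
    if best_score >= threshold and best_score - runner_up >= margin:
        return best_item
    return None  # treat as no-control: suspend rather than force an output

print(asynchronous_decision({"A": 2.7, "B": 0.4, "C": 0.1},
                            threshold=2.0, margin=1.0))   # "A"
print(asynchronous_decision({"A": 0.9, "B": 0.8, "C": 0.7},
                            threshold=2.0))               # None
```

A self-calibration step would then re-estimate `threshold` (and `margin`) continuously from the recent score distributions rather than fixing them at calibration time.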

Relevance: 30.00%

Abstract:

Costly on-site node repairs in wireless mesh networks (WMNs) can be required due to misconfiguration, corrupt software updates, or unavailability during updates. We propose ADAM as a novel management framework that guarantees accessibility of individual nodes in these situations. ADAM uses a decentralised distribution mechanism and self-healing mechanisms for safe configuration and software updates. In order to implement the ADAM management and self-healing mechanisms, an easy-to-learn and extendable build system for a small footprint embedded Linux distribution for WMNs has been developed. The paper presents the ADAM concept, the build system for the Linux distribution and the management architecture.
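
As an illustration of the self-healing idea (a generic commit-or-rollback pattern, not ADAM's actual implementation; all paths and values are hypothetical): after booting a candidate image, the node must prove management connectivity within a timeout, otherwise it reverts to the last known-good image so it never becomes unreachable.

```python
import shutil
import subprocess
import time

STATE = "/firmware/state"            # "trial" while a candidate is unproven
GOOD = "/firmware/known_good.img"
ACTIVE = "/firmware/active.img"
COMMIT_TIMEOUT = 300                 # seconds to prove the update works

def on_boot(reachable):
    """Runs at every boot. reachable() should return True once the node
    can contact its management peers again."""
    with open(STATE) as f:
        if f.read().strip() != "trial":
            return                   # nothing to prove
    deadline = time.time() + COMMIT_TIMEOUT
    while time.time() < deadline:
        if reachable():
            shutil.copy(ACTIVE, GOOD)    # commit: candidate is now known-good
            with open(STATE, "w") as f:
                f.write("committed")
            return
        time.sleep(10)
    shutil.copy(GOOD, ACTIVE)            # self-heal: revert to known-good
    with open(STATE, "w") as f:
        f.write("committed")
    subprocess.run(["reboot"])
```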

Relevance: 30.00%

Abstract:

OBJECTIVES: We sought to determine both the procedural performance and safety of percutaneous implantation of the second-generation (21-French [F]) and third-generation (18-F) CoreValve aortic valve prosthesis (CoreValve Inc., Irvine, California). BACKGROUND: Percutaneous aortic valve replacement represents an emerging alternative therapy for high-risk and inoperable patients with severe symptomatic aortic valve stenosis. METHODS: Patients with: 1) symptomatic, severe aortic valve stenosis (area <1 cm²); 2) age ≥80 years with a logistic EuroSCORE ≥20% (21-F group) or age ≥75 years with a logistic EuroSCORE ≥15% (18-F group); or 3) age ≥65 years plus additional prespecified risk factors were included. Introduction of the 18-F device enabled the transition from a multidisciplinary approach involving general anesthesia, surgical cut-down, and cardiopulmonary bypass to a truly percutaneous approach under local anesthesia without hemodynamic support. RESULTS: A total of 86 patients (21-F, n = 50; 18-F, n = 36) with a mean valve area of 0.66 ± 0.19 cm² (21-F) and 0.54 ± 0.15 cm² (18-F), a mean age of 81.3 ± 5.2 years (21-F) and 83.4 ± 6.7 years (18-F), and a mean logistic EuroSCORE of 23.4 ± 13.5% (21-F) and 19.1 ± 11.1% (18-F) were recruited. Acute device success was 88%. Successful device implantation resulted in a marked reduction of aortic transvalvular gradients (mean 43.7 mm Hg pre vs. 9.0 mm Hg post, p < 0.001) with the aortic regurgitation grade remaining unchanged. The acute procedural success rate was 74% (21-F: 78%; 18-F: 69%). Procedural mortality was 6%. The overall 30-day mortality rate was 12%; the combined rate of death, stroke, and myocardial infarction was 22%. CONCLUSIONS: Treatment of severe aortic valve stenosis in high-risk patients with percutaneous implantation of the CoreValve prosthesis is feasible and associated with a lower mortality rate than predicted by risk algorithms.

Relevance: 30.00%

Abstract:

The numerical solution of the incompressible Navier-Stokes equations offers an effective alternative to the experimental analysis of fluid-structure interaction, i.e. the dynamical coupling between a fluid and a solid, which otherwise is very complex, time consuming, and very expensive. A method which can accurately model these types of mechanical systems by numerical solution is therefore a very attractive option, and the advantages are even more obvious when considering huge structures like bridges, high-rise buildings, or wind turbine blades with diameters as large as 200 meters. The modeling of such processes, however, involves complex multiphysics problems along with complex geometries. This thesis focuses on a novel vorticity-velocity formulation called the KLE to solve the incompressible Navier-Stokes equations for such FSI problems. This scheme allows for the implementation of robust adaptive ODE time-integration schemes and thus allows us to tackle the various multiphysics problems as separate modules. The current algorithm for the KLE employs a structured or unstructured mesh for spatial discretization and allows the use of a self-adaptive or fixed time-step ODE solver when dealing with unsteady problems. This research deals with the analysis of the effects of the Courant-Friedrichs-Lewy (CFL) condition for the KLE when applied to the unsteady Stokes problem. The objective is to conduct a numerical analysis for stability and, hence, for convergence. Our results confirm that the time step Δt is constrained by a CFL-like condition Δt ≤ C·h^α, where h denotes the spatial mesh size.
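
Written out, the reported constraint is

```latex
\Delta t \;\le\; C\, h^{\alpha},
```

with C a mesh-independent constant and α the exponent obtained from the stability analysis, so refining the mesh forces a smaller time step; for instance, if α were 2, halving h would require shrinking Δt by a factor of four.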

Relevance: 30.00%

Abstract:

Background: To detect attention deficit hyperactivity disorder (ADHD) in treatment-seeking substance use disorder (SUD) patients, a valid screening instrument is needed. Objectives: To test the performance of the Adult ADHD Self-Report Scale V1.1 (ASRS) for adult ADHD in an international sample of treatment-seeking SUD patients: for DSM-IV-TR; for the proposed DSM-5 criteria; in different subpopulations; at intake and 1–2 weeks after intake; using different scoring algorithms; and using different externalizing disorders as the external criterion (including adult ADHD, bipolar disorder, and antisocial and borderline personality disorder). Methods: In 1138 treatment-seeking SUD subjects, ASRS performance was determined using diagnoses based on the Conners' Adult ADHD Diagnostic Interview for DSM-IV (CAADID) as the gold standard. Results: The prevalence of adult ADHD was 13.0% (95% CI: 11.0–15.0%). The overall positive predictive value (PPV) of the ASRS was 0.26 (95% CI: 0.22–0.30); the negative predictive value (NPV) was 0.97 (95% CI: 0.96–0.98). The sensitivity (0.84, 95% CI: 0.76–0.88) and specificity (0.66, 95% CI: 0.63–0.69) measured at admission were similar to the sensitivity (0.88, 95% CI: 0.83–0.93) and specificity (0.67, 95% CI: 0.64–0.70) measured 2 weeks after admission. Sensitivity was similar, but specificity was significantly better, in patients with alcohol rather than (illicit) drugs as the primary substance of abuse (0.76 vs. 0.56). The ASRS was not a good screener for externalizing disorders other than ADHD. Conclusions: The ASRS is a sensitive screener for identifying possible ADHD cases, with very few missed cases among those screening negative in this population.
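
As a consistency check, the reported PPV and NPV follow from prevalence, sensitivity, and specificity via Bayes' rule; with the at-admission values (prevalence 0.13, sensitivity 0.84, specificity 0.66):

```latex
\mathrm{PPV} = \frac{0.84 \times 0.13}{0.84 \times 0.13 + (1 - 0.66)(1 - 0.13)} \approx 0.27,
\qquad
\mathrm{NPV} = \frac{0.66 \times (1 - 0.13)}{0.66 \times (1 - 0.13) + (1 - 0.84) \times 0.13} \approx 0.97,
```

in line with the reported 0.26 and 0.97.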

Relevance: 30.00%

Abstract:

Voluntary control of information processing is crucial to allocate resources and prioritize the processes that are most important under a given situation; the algorithms underlying such control, however, are often not clear. We investigated possible algorithms of control for the performance of the majority function, in which participants searched for and identified one of two alternative categories (left or right pointing arrows) as composing the majority in each stimulus set. We manipulated the amount (set size of 1, 3, and 5) and content (ratio of left and right pointing arrows within a set) of the inputs to test competing hypotheses regarding mental operations for information processing. Using a novel measure based on computational load, we found that reaction time was best predicted by a grouping search algorithm as compared to alternative algorithms (i.e., exhaustive or self-terminating search). The grouping search algorithm involves sampling and resampling of the inputs before a decision is reached. These findings highlight the importance of investigating the implications of voluntary control via algorithms of mental operations.
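
For concreteness, here is a sketch of the three search schemes compared; the operational definitions below are our illustrative reading, not the paper's exact procedures.

```python
import random

def exhaustive(arrows):
    # Inspect every item before deciding.
    left = sum(a == "L" for a in arrows)
    return "L" if left > len(arrows) - left else "R"

def self_terminating(arrows):
    # Stop as soon as one category reaches a majority quorum.
    need = len(arrows) // 2 + 1
    left = right = 0
    for a in arrows:
        left += a == "L"
        right += a == "R"
        if left >= need:
            return "L"
        if right >= need:
            return "R"

def grouping(arrows, k=3, max_resamples=100):
    # Sample a small subgroup; answer if it is unanimous, otherwise
    # resample -- the scheme whose computational load best predicted
    # the observed reaction times.
    sample = random.sample(arrows, min(k, len(arrows)))
    for _ in range(max_resamples):
        if len(set(sample)) == 1:
            return sample[0]
        sample = random.sample(arrows, min(k, len(arrows)))
    return max(set(sample), key=sample.count)  # fallback: count the sample

trial = ["L", "L", "R", "L", "R"]  # set size 5, ratio 3:2
print(exhaustive(trial), self_terminating(trial), grouping(trial))
```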

Relevance: 30.00%

Abstract:

Objectives: The aim of this study was to quantify potential differences in the count, frequency, and pattern of high-intensity transient signals (HITS) during transapical transcatheter aortic valve implantation (TA-TAVI) by comparing the Symetis Acurate TA (SA) with the balloon-expandable Edwards Sapien XT (ES) system. Background: Recently, the Symetis Acurate TA revalving system has been introduced for TA-TAVI. The Symetis Acurate TA aortic bioprosthesis is self-expanding and is deployed by a specific two-step implantation technique. Whether this novel method increases the load of intraprocedural emboli, detected by transcranial Doppler ultrasound (TCD) as HITS, is unclear. Methods: Twenty-two patients (n = 11 in each study arm, median logistic EuroSCORE 20%, median STS score 7%) displayed continuous TCD signals of good quality throughout the entire TA-TAVI procedure and were included in the final analysis. Data are presented as medians with interquartile ranges. Results: No significant differences were detected in total procedural or interval-related HITS load (SA: 303 [200; 594]; ES: 499 [285; 941]; p = 0.16). With both devices, HITS peaked during prosthesis deployment (PD), whereas significantly fewer HITS occurred during instrumentation (SA: p = 0.002; ES: p < 0.001) or post-implantation (PI) (SA: p = 0.007; ES: p < 0.001). PD-associated HITS amounted to almost half of the total HITS load. One patient suffered a new disabling stroke at 30 days. Thirty-day mortality was 13.6% (3 of 22 patients). Conclusions: Simplified transapical delivery using the self-expanding SA device does not increase HITS, despite a two-step deployment technique with more interactions with the native aortic valve, when compared to the balloon-expandable ES valve. The similarity in HITS count, frequency, and pattern with the two systems suggests a common mechanism for the release of cerebral microemboli.