943 results for multi-user setting
Abstract:
Background: Oncologic outcomes in men with radiation-recurrent prostate cancer (PCa) treated with salvage radical prostatectomy (SRP) are poorly defined. Objective: To identify predictors of biochemical recurrence (BCR), metastasis, and death following SRP to help select patients who may benefit from SRP. Design, setting, and participants: This is a retrospective, international, multi-institutional cohort analysis. There was a median follow-up of 4.4 yr following SRP performed on 404 men with radiation-recurrent PCa from 1985 to 2009 in tertiary centers. Intervention: Open SRP. Measurements: BCR after SRP was defined as a serum prostate-specific antigen (PSA) >= 0.1 or >= 0.2 ng/ml (depending on the institution). Secondary end points included progression to metastasis and cancer-specific death. Results and limitations: Median age at SRP was 65 yr, and median pre-SRP PSA was 4.5 ng/ml. Following SRP, 195 patients experienced BCR, 64 developed metastases, and 40 died from PCa. At 10 yr after SRP, BCR-free survival, metastasis-free survival, and cancer-specific survival (CSS) probabilities were 37% (95% confidence interval [CI], 31-43), 77% (95% CI, 71-82), and 83% (95% CI, 76-88), respectively. On preoperative multivariable analysis, pre-SRP PSA and Gleason score at postradiation prostate biopsy predicted BCR (p = 0.022; global p < 0.001) and metastasis (p = 0.022; global p < 0.001). On postoperative multivariable analysis, pre-SRP PSA and pathologic Gleason score at SRP predicted BCR (p = 0.014; global p < 0.001) and metastasis (p < 0.001; global p < 0.001). Lymph node involvement (LNI) also predicted metastasis (p = 0.017). The main limitations of this study are its retrospective design and the follow-up period. Conclusions: In a select group of patients who underwent SRP for radiation-recurrent PCa, freedom from clinical metastasis was observed in > 75% of patients 10 yr after surgery. Patients with lower pre-SRP PSA levels and lower postradiation prostate biopsy Gleason score have the highest probability of cure from SRP. (C) 2011 European Association of Urology. Published by Elsevier B.V. All rights reserved.
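The 10-yr survival probabilities and confidence intervals quoted above are the kind of figures a Kaplan-Meier analysis produces. Below is a minimal sketch of that computation, assuming the Python lifelines library and entirely hypothetical follow-up data (only the cohort size is taken from the abstract; the durations and event flags are simulated, not the study's data):

```python
# Minimal Kaplan-Meier sketch (illustrative data, not the study's dataset).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(0)
n = 404                                   # cohort size from the abstract
followup_yr = rng.exponential(8.0, n)     # hypothetical time to BCR or censoring
had_bcr = rng.random(n) < 0.48            # hypothetical event indicator

kmf = KaplanMeierFitter()
kmf.fit(durations=followup_yr, event_observed=had_bcr, label="BCR-free survival")

# Point estimate and 95% CI at 10 years, analogous to the abstract's figures.
print(kmf.survival_function_at_times(10.0))
print(kmf.confidence_interval_.loc[kmf.confidence_interval_.index <= 10.0].tail(1))
```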
Abstract:
This paper presents an agent-based simulator designed for analyzing agent market strategies based on a complete understanding of buyer and seller behaviours, preference models and pricing algorithms, considering user risk preferences. The system includes agents that are capable of improving their performance with their own experience, by adapting to the market conditions. In the simulated market, agents interact in several different ways and may join together to form coalitions. In this paper we address multi-agent coalitions to analyse Distributed Generation in Electricity Markets.
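To make the adaptive-agent idea concrete, here is a minimal sketch of sellers that adjust prices from their own experience against buyers with price limits. The class names, the learning rule and all figures are illustrative assumptions, not the simulator's actual design:

```python
# Sketch of adaptive seller agents in a toy market round (illustrative only).
import random

class SellerAgent:
    def __init__(self, price, step=0.05):
        self.price = price      # current asking price
        self.step = step        # adaptation rate applied to each experience

    def adapt(self, sold):
        # Raise the price after a sale, lower it after failing to sell.
        self.price *= (1 + self.step) if sold else (1 - self.step)

class BuyerAgent:
    def __init__(self, limit):
        self.limit = limit      # maximum acceptable price (risk preference proxy)

    def accepts(self, price):
        return price <= self.limit

sellers = [SellerAgent(random.uniform(40, 60)) for _ in range(5)]
buyers = [BuyerAgent(random.uniform(45, 55)) for _ in range(5)]

for _ in range(100):
    # Random pairings each round; sellers learn only from their own outcomes.
    for seller, buyer in zip(sellers, random.sample(buyers, len(buyers))):
        seller.adapt(sold=buyer.accepts(seller.price))

print(sorted(round(s.price, 2) for s in sellers))  # prices drift toward buyer limits
```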
Abstract:
This paper describes a Multi-agent Scheduling System that assumes the existence of several Machine Agents (which are decision-making entities) distributed inside the Manufacturing System that interact and cooperate with other agents in order to obtain optimal or near-optimal global performance. Agents have to manage their internal behaviors and their relationships with other agents via cooperative negotiation in accordance with business policies defined by the user manager. Some Multi-Agent Systems (MAS) organizational aspects are considered. An original Cooperation Mechanism for a Team-work-based Architecture is proposed to address dynamic scheduling using Meta-Heuristics.
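One plausible concrete form of such cooperative negotiation between machine agents is a contract-net-style bidding round, sketched below with assumed names and a toy completion-time bid; the paper's actual cooperation mechanism is not reproduced here:

```python
# Contract-net-style sketch: machine agents bid for jobs with their expected
# completion time; the lowest bid wins. Illustrative assumptions throughout.

class MachineAgent:
    def __init__(self, name, speed):
        self.name = name
        self.speed = speed      # processing units per hour
        self.backlog = 0.0      # hours of work already queued

    def bid(self, job_size):
        # Bid = time until this machine could finish the new job.
        return self.backlog + job_size / self.speed

    def award(self, job_size):
        self.backlog += job_size / self.speed

machines = [MachineAgent("M1", 10), MachineAgent("M2", 8), MachineAgent("M3", 12)]
jobs = [30, 45, 20, 60, 25]     # job sizes in processing units

for job in jobs:
    winner = min(machines, key=lambda m: m.bid(job))
    winner.award(job)
    print(f"job({job}) -> {winner.name}, done at t={winner.backlog:.1f}h")
```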
Abstract:
Doctoral thesis, Marine Sciences (Marine Biology)
Abstract:
Master's degree in Electrical and Computer Engineering - Specialization Area of Telecommunications
Abstract:
The current ubiquitous network access and increase in network bandwidth are driving the sales of mobile location-aware user devices and, consequently, the development of context-aware applications, namely location-based services. The goal of this project is to provide consumers of location-based services with a richer end-user experience by means of service composition, personalization, device adaptation and continuity of service. Our approach relies on a multi-agent system composed of proxy agents that act as mediators and providers of personalization meta-services, device adaptation and continuity of service for consumers of pre-existing location-based services. These proxy agents, which have Web services interfaces to ensure a high level of interoperability, perform service composition and take into consideration the preferences of the users and the limitations of the user devices, making the usage of different types of devices seamless for the end-user. To validate and evaluate the performance of this approach, use cases were defined, tests were conducted and results were gathered, which demonstrated that the initial goals were successfully fulfilled.
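A minimal sketch of what such a proxy agent might do, with invented service names, response fields and adaptation rules (the project's real Web services interfaces are not shown):

```python
# Proxy agent sketch: compose two hypothetical location-based services,
# personalize by user preferences, and adapt the result to the device.

def find_restaurants(lat, lon):          # stand-in for a pre-existing LBS
    return [{"name": "Tasca do Porto", "img": "hi-res.jpg", "rating": 4.5},
            {"name": "Cafe Central", "img": "hi-res.jpg", "rating": 3.9}]

def get_route(lat, lon, dest):           # stand-in for a second LBS
    return {"dest": dest, "eta_min": 12}

def proxy_compose(lat, lon, prefs, device):
    # Personalization: filter by the user's minimum-rating preference.
    places = [p for p in find_restaurants(lat, lon)
              if p["rating"] >= prefs["min_rating"]]
    # Device adaptation: drop heavy images on small screens.
    if device["screen"] == "small":
        for p in places:
            p.pop("img", None)
    # Composition: chain the second service on the top-ranked result.
    top = max(places, key=lambda p: p["rating"])
    return {"place": top, "route": get_route(lat, lon, top["name"])}

print(proxy_compose(41.15, -8.61, prefs={"min_rating": 4.0},
                    device={"screen": "small"}))
```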
Abstract:
4th International Conference, SIMPAR 2014, Bergamo, Italy, October 20-23, 2014
Abstract:
The paper presents a multi-robot cooperative framework to estimate the 3D position of dynamic targets, based on bearing-only vision measurements. The uncertainty of the observation provided by each robot equipped with a bearing-only vision system is effectively addressed for cooperative triangulation purposes by weighting the contribution of each monocular bearing ray in a probabilistic manner. The envisioned framework is evaluated in an outdoor scenario with a team of heterogeneous robots composed of an Unmanned Ground Vehicle and an Unmanned Aerial Vehicle.
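A standard way to realize weighted cooperative triangulation of bearing rays is weighted least squares over the orthogonal distances to each ray. The sketch below illustrates that idea with assumed robot poses and weights; it should not be read as the paper's exact estimator:

```python
# Weighted least-squares triangulation from bearing-only measurements:
# find the 3D point x minimizing sum_i w_i * ||(I - d_i d_i^T)(x - p_i)||^2,
# where p_i is robot i's position, d_i its unit bearing, w_i ~ 1/variance.
import numpy as np

def triangulate(origins, bearings, weights):
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d, w in zip(origins, bearings, weights):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += w * P
        b += w * P @ p
    return np.linalg.solve(A, b)         # normal equations of the weighted LS

# Assumed poses: UGV at the origin, UAV 5 m east and 10 m up, both sighting
# the same target; the aerial bearing is weighted as noisier.
origins = [np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 10.0])]
bearings = [np.array([1.0, 1.0, 0.2]), np.array([0.5, 1.0, -0.8])]
weights = [1.0, 0.25]

print(triangulate(origins, bearings, weights))
```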
Abstract:
Dissertation for obtaining the Master's degree in Biomedical Engineering
Abstract:
Traffic Engineering (TE) approaches are increasingly important in network management to allow an optimized configuration and resource allocation. In link-state routing, the task of setting appropriate weights to the links is both an important and a challenging optimization task. A number of different approaches have been put forward towards this aim, including the successful use of Evolutionary Algorithms (EAs). In this context, this work addresses the evaluation of three distinct EAs, a single-objective and two multi-objective EAs, in two tasks related to weight setting optimization towards optimal intra-domain routing, knowing the network topology and aggregated traffic demands and seeking to minimize network congestion. In both tasks, the optimization considers scenarios where there is a dynamic alteration in the state of the system: the first considers changes in the traffic demand matrices and the latter considers the possibility of link failures. The methods thus need to simultaneously optimize for both conditions, the normal and the altered one, following a preventive TE approach towards robust configurations. Since this can be formulated as a bi-objective function, the use of multi-objective EAs, such as SPEA2 and NSGA-II, came naturally, and these are compared to a single-objective EA. The results show a remarkable behavior of NSGA-II in all proposed tasks, scaling well for harder instances, and it thus presents itself as the most promising option for TE in these scenarios.
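To make the bi-objective formulation concrete, the sketch below evaluates one weight vector under the normal state and under a single link failure on a toy topology, using networkx shortest paths. The topology, demands and congestion measure are assumptions; an EA such as NSGA-II would minimize both objectives over the weight vector:

```python
# Bi-objective evaluation behind preventive TE weight setting: one congestion
# score for the normal network, one after an assumed link failure.
import networkx as nx

EDGES = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d"), ("b", "d")]
DEMANDS = {("a", "d"): 8.0, ("a", "c"): 4.0}   # toy aggregated traffic matrix
CAPACITY = 10.0                                 # uniform link capacity

def congestion(weights, failed_edge=None):
    g = nx.Graph()
    for edge, w in zip(EDGES, weights):
        if edge != failed_edge:
            g.add_edge(*edge, weight=w)
    load = {}
    for (src, dst), demand in DEMANDS.items():
        path = nx.shortest_path(g, src, dst, weight="weight")
        for u, v in zip(path, path[1:]):
            load[frozenset((u, v))] = load.get(frozenset((u, v)), 0.0) + demand
    return max(load.values()) / CAPACITY        # worst-link utilization

def evaluate(weights):
    # Fitness pair: (normal congestion, congestion after failing link a-c).
    return congestion(weights), congestion(weights, failed_edge=("a", "c"))

print(evaluate([1, 1, 1, 1, 1]))
print(evaluate([1, 2, 5, 1, 2]))
```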
Abstract:
We consider a principal who deals with a privately informed agent protected by limited liability in a correlated information setting. The agent's technology is such that the fixed cost declines with the marginal cost (the type), so that countervailing incentives may arise. We show that, with high liability, the first-best outcome can be effected for any type if (1) the fixed cost is non-concave in type, under the contract that yields the smallest feasible loss to the agent; (2) the fixed cost is not very concave in type, under the contract that yields the maximum sustainable loss to the agent. We further show that, with low liability, the first-best outcome is still implemented for a non-degenerate range of types if the fixed cost is less concave in type than some given threshold, which tightens as the liability reduces. The optimal contract entails pooling otherwise.
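For readers who want the structure of the problem, here is one conventional way to write the program the abstract describes, in standard mechanism-design notation chosen purely for illustration ($\theta$ the marginal-cost type, $F$ the fixed cost with $F' < 0$, $\ell \ge 0$ the liability bound); the correlated signal on which transfers can additionally be conditioned is left implicit, and none of the symbols are the paper's own:

```latex
% Illustrative formalization (not the paper's notation). The principal with
% surplus S(q) procures q from an agent of type \theta via transfer t(\theta).
\[
  \max_{q(\cdot),\,t(\cdot)} \;
  \mathbb{E}_{\theta}\!\left[\, S\big(q(\theta)\big) - t(\theta) \,\right]
\]
% Agent rent: countervailing incentives arise because the information rent
% from a low marginal cost \theta is offset by the declining fixed cost F.
\[
  U(\theta) \;=\; t(\theta) - \theta\, q(\theta) - F(\theta),
  \qquad F'(\theta) < 0
\]
% Truth-telling and limited liability, for all types \theta, \hat{\theta}:
\[
  U(\theta) \;\ge\; t(\hat{\theta}) - \theta\, q(\hat{\theta}) - F(\theta)
  \quad \text{(IC)}, \qquad
  U(\theta) \;\ge\; -\ell \quad \text{(LL)} .
\]
```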
Abstract:
Nowadays, service providers in the Cloud offer complex services ready to be used, as if they were a commodity like water or electricity, without any extra effort for their customers. However, providing these services implies a high management effort which requires a lot of human interaction. Furthermore, an efficient resource management mechanism considering only the provider's resources is, though necessary, not enough, because the provider's profit is limited by the amount of resources it owns. Dynamically outsourcing resources to other providers in response to demand variation avoids this problem and allows the provider to earn more profit. A key technology for achieving these goals is virtualization, which facilitates the provider's management and provides on-demand virtual environments, which are isolated and consolidated in order to achieve a better utilization of the provider's resources. Nevertheless, exploiting some virtualization capabilities requires an effort from the user. To avoid this problem, we contribute to the research community a virtualized environment manager which aims to provide virtual machines that fulfil the user requirements. Another challenge is sharing resources among different federated Cloud providers while exploiting the features of virtualization in a new approach for facilitating providers' management. This project aims to reduce the provider's costs while fulfilling the quality of service agreed with the customers and maximizing the provider's revenue. It considers resource management at several layers, namely locally at each node in the provider, among different nodes in the provider, and among different federated providers. This latter layer supports the novel capabilities of outsourcing when the local resources are not enough to fulfil the users' demand, and of offering resources to other providers when the local resources are underused.
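A minimal sketch of the federation-layer decision described above: serve demand locally, outsource the excess to the cheapest partner when that preserves a margin, and offer idle capacity otherwise. Prices, capacities and the partner list are invented for illustration:

```python
# Federation-layer placement sketch (all figures are illustrative assumptions).

LOCAL_CAPACITY = 100          # VMs the provider can host itself
LOCAL_COST = 0.05             # cost per VM-hour on own infrastructure
FEDERATION = {"provB": 0.09, "provC": 0.07}   # partners' quoted VM-hour prices

def place_demand(vms_requested, price_per_vm=0.12):
    local = min(vms_requested, LOCAL_CAPACITY)
    plan = [("local", local, local * LOCAL_COST)]
    remainder = vms_requested - local
    if remainder > 0:
        # Outsource only while the partner's price leaves a margin.
        partner, cost = min(FEDERATION.items(), key=lambda kv: kv[1])
        if cost < price_per_vm:
            plan.append((partner, remainder, remainder * cost))
    elif local < LOCAL_CAPACITY:
        # Underused: offer spare capacity to the federation.
        plan.append(("offer", LOCAL_CAPACITY - local, 0.0))
    return plan

print(place_demand(140))      # part local, part outsourced
print(place_demand(60))       # local only, spare capacity offered
```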
Abstract:
Digitalization gives the Internet its power by allowing several virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium to secure digital identities. Identity functionality is increasingly delivered as sets of services, rather than monolithic applications. So, an identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available to on-demand calls could be a more realistic and acceptable situation. Identity and privacy should be interoperable and distributed through the adoption of service-orientation and implementation based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable user-centric digital identity-related privacy in response to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards and protocols are not enough to guarantee a resolution for the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason, they should be apprehended within a global perspective, through an integrated and multidisciplinary approach. The approach dictates that privacy law, policies, regulations and technologies are to be crafted together from the start, rather than attached to digital identity after the fact. Thus, we draw Digital Identity-Related Privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies. The requirements take the shape of business interoperability. We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that would help organizations' security teams turn business interoperability into technical interoperability in the form of a set of services that could accommodate a Service-Oriented Architecture (SOA): a Privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework will serve as a basis for vital understanding between business management and technical managers on digital-identity-related privacy initiatives. The layered DigIdeRP framework presents five practical layers as an ordered sequence forming the basis of the DigIdeRP project roadmap; in practice, however, there is an iterative process to assure that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which determine a roadmap that a security team could follow to successfully implement PaaSS. Several blocks' descriptions are based on the OMG SoaML modeling language and BPMN process descriptions. We identified, designed and implemented seven services that form PaaSS and described their consumption. The PaaSS (Java EE project), WSDL, and XSD code are given and explained.
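To illustrate the "privacy as a set of services" idea, here is a minimal sketch of loosely coupled, individually invocable privacy services behind a common interface; the service names and behaviour are placeholders, not the seven services the project actually defines:

```python
# Privacy-as-a-set-of-services sketch: each service is independently callable
# and composable on demand, rather than bundled into one application.
from abc import ABC, abstractmethod

class PrivacyService(ABC):
    @abstractmethod
    def invoke(self, identity: dict) -> dict: ...

class Pseudonymizer(PrivacyService):
    def invoke(self, identity):
        out = dict(identity)
        out["name"] = f"user-{abs(hash(identity['name'])) % 10_000}"
        return out

class ConsentChecker(PrivacyService):
    def __init__(self, allowed):
        self.allowed = set(allowed)

    def invoke(self, identity):
        # Release only the attributes the user consented to share.
        return {k: v for k, v in identity.items() if k in self.allowed}

# Compose services into a pipeline per request.
pipeline = [ConsentChecker(allowed={"name", "country"}), Pseudonymizer()]
record = {"name": "Alice", "country": "CH", "ssn": "123-45-6789"}
for service in pipeline:
    record = service.invoke(record)
print(record)     # e.g. {'name': 'user-1234', 'country': 'CH'}
```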
Abstract:
Aims: To provide a multi-component weight management service that supports sustainable behaviour change and weight loss in adults aged 16 years and over with a BMI of 28 or above. To enable patients to develop the necessary personal attributes for their own long-term weight management and to understand the impact of their weight on their health and co-morbidities.
Objectives:
• To provide an evidence-based, multi-component tier 2 weight management service that improves patients' knowledge and skills for effective and sustainable weight loss, helps patients identify their own facilitators for positive behaviour change and address underlying barriers to long-term behaviour change, and increases patients' self-efficacy and confidence in their ability to address their weight.
• To be an integral part of the tiered approach to weight management services for the population of Stockton.
• To ensure equitable service provision across Stockton-on-Tees.
• To provide an intensive group-based service, one-to-one support and maintenance support.
• To support the service user to develop and review a personalised goal-setting plan during phase 2 and at discharge after phase 2.
• To ensure a smooth transition from the service (tier 2) to tier 1 services, ensuring continuity of care for service users.
• To recruit referrals using a variety of appropriate methods.
• To establish a single point of contact for referrals into the service.
• To continually promote the service across a range of mediums, and to liaise and work in partnership with key interdependencies (refer to 2.4).
• To establish a robust database and data collection system in line with information governance.
• To ensure the access criteria, care pathway and referral process are clearly understood by all health care professionals and those who may refer into the service.
• To establish close links with, and signpost and/or enable service users to access, suitable services where patient needs indicate this. This may include access to Tees Time to Talk (IAPT) for psychological therapies; the Specialist Weight Management Service; physical activity programmes; tier 1 services; and primary care.
• To provide the necessary venues, equipment and assets needed to deliver the programme, ensuring due regard is given to the quality and safety of all materials used.
• To collect and provide data in quarterly reports to the Commissioner to allow for continued monitoring and evaluation of the service in line with the Standard Evaluation Framework (available at www.noo.org.uk/core/SEF) and as specified by the Commissioner.
Abstract:
NanoImpactNet (NIN) is a multidisciplinary European Commission funded network on the environmental, health and safety (EHS) impact of nanomaterials. The 24 founding scientific institutes are leading European research groups active in the fields of nanosafety, nanorisk assessment and nanotoxicology. This 4-year project is the new focal point for information exchange within the research community. Contact with other stakeholders is vital and their needs are being surveyed. NIN is communicating with hundreds of stakeholders: businesses; internet platforms; industry associations; regulators; policy makers; national ministries; international agencies; standard-setting bodies and NGOs concerned by labour rights, EHS or animal welfare. To improve this communication, internet research, a questionnaire distributed via partners and targeted phone calls were used to identify stakeholders' interests and needs. Knowledge gaps and the necessity for further data mentioned by representatives of all stakeholder groups in the targeted phone calls concerned: • the potential toxic and safety hazards of nanomaterials throughout their lifecycles; • the fate and persistence of nanoparticles in humans, animals and the environment; • the associated risks of nanoparticle exposure; • greater participation in the preparation of nomenclature, standards, methodologies, protocols and benchmarks; • the development of best practice guidelines; • voluntary schemes on responsibility; • databases of materials, research topics and themes, but also of expertise. These findings suggested that stakeholders and NIN researchers share very similar knowledge needs, and that open communication and free movement of knowledge will benefit both researchers and industry. Subsequently a workshop was organised by NIN focused on building a sustainable multi-stakeholder dialogue. Specific questions were asked of different stakeholder groups to encourage discussion and open communication.
1. What information do stakeholders need from researchers and why? The discussions about this question confirmed the needs identified in the targeted phone calls.
2. How to communicate information? While it was agreed that reporting should be enhanced, commercial confidentiality and economic competition were identified as major obstacles. It was recognised that expertise was needed in the areas of commercial law and economics for a well-informed treatment of this communication issue.
3. Can engineered nanomaterials be used safely? The idea that nanomaterials are probably safe because some of them have been produced 'for a long time' was questioned, since many materials in common use have been proved to be unsafe. The question of safety is also about whether the public has confidence. New legislation like REACH could help with this issue. Hazards do not materialise if exposure can be avoided or at least significantly reduced. Thus, there is a need for information on what can be regarded as acceptable levels of exposure. Finally, it was noted that there is no such thing as a perfectly safe material, only boundaries, and at this moment we do not know where these boundaries lie. The matter of labelling of products containing nanomaterials was raised, as in the public mind safety and labelling are connected. This may need to be addressed, since the issue of nanomaterials in food, drink and food packaging may be the first safety issue to attract public and media attention, and this may have an impact on nanotechnology as a whole.
4. Do we need more or other regulation? Any decision-making process should accommodate the changing level of uncertainty. To address the uncertainties, adaptations of frameworks such as REACH may be indicated for nanomaterials. Regulation is often needed even if voluntary measures are welcome, because it mitigates the effects of competition between industries; data cannot be collected on a voluntary basis, for example. NIN will continue with an active stakeholder dialogue to further build on interdisciplinary relationships towards a healthy future with nanotechnology.