978 results for multi-issue bargaining


Relevance: 30.00%

Abstract:

Twenty years after the discovery of the first planets outside our solar system, the current exoplanetary population includes more than 700 confirmed planets around main-sequence stars. Approximately 50% of them belong to multiple-planet systems in very diverse dynamical configurations, from two-planet hierarchical systems to multiple resonances that could only have been attained as the consequence of a smooth large-scale orbital migration. The first part of this paper reviews the main techniques employed for the detection and orbital characterization of multiple-planet systems, from the (now) classical radial velocity (RV) method to the use of transit timing variations (TTV) for the identification of additional planetary bodies orbiting the same star. In the second part we discuss the dynamical evolution of multi-planet systems due to their mutual gravitational interactions. We analyze possible modes of motion for hierarchical, secular or resonant configurations, and the stability criteria that can be defined in each case. In some cases the dynamics can be well approximated by simple analytical expressions for the Hamiltonian function, while other configurations can only be studied with semi-analytical or numerical tools. In particular, we show how mean-motion resonances generate complex structures in phase space, where different libration islands and circulation domains are separated by chaotic layers. In all cases we use real exoplanetary systems as working examples.
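
As a minimal illustration of the resonance bookkeeping discussed above, the following sketch flags planet pairs whose orbital period ratio lies near a low-order p:q commensurability. The tolerance and the example periods are illustrative, not taken from the abstract; near-commensurability is only a necessary hint of a mean-motion resonance, which strictly requires libration of the resonant angle.

```python
from math import gcd

def near_resonances(periods, max_order=3, tol=0.02):
    """Flag planet pairs whose period ratio lies close to a low-order p:q
    commensurability. This is a necessary hint, not proof, of a mean-motion
    resonance; actual resonance must be confirmed dynamically."""
    hits = []
    n = len(periods)
    for i in range(n):
        for j in range(i + 1, n):
            ratio = max(periods[i], periods[j]) / min(periods[i], periods[j])
            for q in range(1, max_order + 1):
                for p in range(q + 1, q + max_order + 1):
                    if gcd(p, q) != 1:  # skip non-reduced ratios like 4:2
                        continue
                    if abs(ratio - p / q) / (p / q) < tol:
                        hits.append((i, j, f"{p}:{q}", round(ratio, 3)))
    return hits

# Illustrative periods (days), loosely resembling a 2:1 resonant chain.
print(near_resonances([5.0, 10.1, 20.4]))
# [(0, 1, '2:1', 2.02), (1, 2, '2:1', 2.02)]
```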

Relevance: 30.00%

Abstract:

Current SoC design trends are characterized by the integration of a growing number of IPs targeting a wide range of application fields. Such multi-application systems are constrained by a set of requirements. In this scenario, networks-on-chip (NoCs) are becoming more important as the on-chip communication structure. Designing an optimal NoC that satisfies the requirements of each individual application requires the specification of a large set of configuration parameters, leading to a wide solution space. It has been shown that IP mapping is one of the most critical parameters in NoC design, strongly influencing SoC performance. IP mapping has been solved for single-application systems using single- and multi-objective optimization algorithms. In this paper we propose the use of a multi-objective adaptive immune algorithm (M(2)AIA), an evolutionary approach, to solve the multi-application NoC mapping problem. Latency and power consumption were adopted as the target objective functions. To assess the efficiency of our approach, our results are compared with those of genetic and branch-and-bound multi-objective mapping algorithms. We tested 11 well-known benchmarks, including random and real applications, combining up to 8 applications on the same SoC. The experimental results show that M(2)AIA reduces average power consumption and latency by 27.3% and 42.1% compared to the branch-and-bound approach, and by 29.3% and 36.1% compared to the genetic approach.
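
The abstract does not give M(2)AIA's internals, so the sketch below only illustrates the Pareto bookkeeping that any multi-objective NoC mapper (immune, genetic or branch-and-bound) relies on: candidate IP-to-core mappings are scored on latency and power, and dominated candidates are discarded. The two cost functions and the 4x4 mesh are hypothetical placeholders, not the paper's models.

```python
import random

def dominates(a, b):
    """Pareto dominance for minimization: a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates, evaluate):
    """Keep only the non-dominated (mapping, objectives) pairs."""
    scored = [(m, evaluate(m)) for m in candidates]
    return [(m, s) for m, s in scored
            if not any(dominates(t, s) for _, t in scored if t != s)]

def evaluate(mapping):
    # Hypothetical stand-ins for the paper's latency and power models.
    latency = sum(abs(core - ip) for ip, core in enumerate(mapping))
    power = sum((core % 4) + 1 for core in mapping)
    return (latency, power)

# Random mappings of 8 IPs onto a 4x4 mesh (16 cores).
population = [random.sample(range(16), 8) for _ in range(50)]
for mapping, objectives in pareto_front(population, evaluate):
    print(mapping, objectives)
```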

Relevance: 30.00%

Abstract:

It is well known that the deposition of gaseous pollutants and aerosols plays a major role in the deterioration of monuments and built cultural heritage in European cities. Despite the many studies dedicated to environmental damage to cultural heritage, in the case of cement mortars, commonly used in 20th-century architecture, the deterioration due to the impact of airborne multi-pollutants, especially the formation of black crusts, is still not well explored, making this issue a challenging area of research. This work centers on cement mortar-environment interactions, focusing on the diagnosis of damage to the modern built heritage due to air multi-pollutants. For this purpose three sites, exposed to different urban areas in Europe, were selected for sampling and subsequent laboratory analyses: Centennial Hall, Wroclaw (Poland); Chiesa dell'Autostrada del Sole, Florence (Italy); and Casa Galleria Vichi, Florence (Italy). The sampling sessions took into account the height from ground level and the degree of protection from rain run-off (sheltered, partly sheltered and exposed areas). The complete characterization of the collected damage layers and underlying materials was performed using a range of analytical techniques: optical and scanning electron microscopy, X-ray diffractometry, differential and gravimetric thermal analysis, ion chromatography, flash combustion/gas chromatographic analysis, and inductively coupled plasma-optical emission spectrometry. The data were elaborated using statistical methods (i.e. principal component analysis), and the enrichment factor for cement mortars was calculated for the first time. The results obtained from the damage layers indicate that gypsum, formed by the deposition of atmospheric sulphur compounds, is the main damage product at surfaces sheltered from rain run-off at Centennial Hall and Casa Galleria Vichi. By contrast, gypsum was not identified in the samples collected at Chiesa dell'Autostrada del Sole. This is connected to the restoration works, particularly surface cleaning, regularly performed for the maintenance of the building. Moreover, the results demonstrate a correlation between the location of a building and the composition of its damage layer: Centennial Hall is mainly subject to the impact of pollutants emitted by nearby coal power stations, whilst Casa Galleria Vichi is principally affected by pollutants from vehicular exhaust in front of the building.
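
The abstract does not spell out the enrichment factor formulation it introduces for cement mortars. A common definition, sketched below under the assumption that the unweathered substrate serves as the reference composition and calcium as the reference element (both assumptions, not taken from the abstract), is EF = (C_x/C_ref)_damage layer / (C_x/C_ref)_substrate.

```python
def enrichment_factor(damage_layer, substrate, element, ref_element="Ca"):
    """EF = (C_x / C_ref)_damage-layer / (C_x / C_ref)_substrate.
    EF >> 1 suggests the element was deposited from the atmosphere rather
    than inherited from the mortar itself. The choice of Ca as reference
    element is an assumption for illustration."""
    return (damage_layer[element] / damage_layer[ref_element]) / \
           (substrate[element] / substrate[ref_element])

# Illustrative concentrations (wt%), not measured values from the paper.
crust = {"Ca": 20.0, "S": 6.0}
mortar = {"Ca": 25.0, "S": 0.5}
print(f"EF(S) = {enrichment_factor(crust, mortar, 'S'):.1f}")  # 15.0: atmospheric sulphur
```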

Relevance: 30.00%

Abstract:

This thesis deals with distributed control strategies for the cooperative control of multi-robot systems. Specifically, distributed coordination strategies are presented for groups of mobile robots. The formation control problem is initially solved by exploiting artificial potential fields. The purpose of the presented formation control algorithm is to drive a group of mobile robots into an arbitrarily shaped formation. Robots are initially controlled to create a regular polygon formation. A bijective coordinate transformation is then exploited to extend the scope of this strategy and obtain arbitrarily shaped formations. For this purpose, artificial potential fields are specifically designed, and robots are driven to follow their negative gradient. Artificial potential fields are subsequently exploited to solve the coordinated path-tracking problem, making the robots autonomously spread along predefined paths and move along them in a coordinated way. The formation control problem is then solved with a consensus-based approach. Specifically, weighted graphs are used both to define the desired formation and to implement collision avoidance. As expected for consensus-based algorithms, this control strategy is experimentally shown to be robust to communication delays. The global connectivity maintenance issue is then considered. Specifically, an estimation procedure is introduced that allows each agent to compute, in a distributed manner, its own estimate of the algebraic connectivity of the communication graph. This estimate is then exploited to develop a gradient-based control strategy that ensures that the communication graph remains connected as the system evolves. The proposed control strategy is developed initially for single-integrator kinematic agents and is then extended to Lagrangian dynamical systems.
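
The quantity at the heart of the connectivity maintenance strategy is the algebraic connectivity (Fiedler value) of the communication graph, which is positive exactly when the graph is connected. The thesis estimates it in a distributed manner; the sketch below computes it centrally from the graph Laplacian, for intuition only.

```python
import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest eigenvalue (Fiedler value) of the graph Laplacian
    L = D - A; it is positive if and only if the graph is connected."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    return np.sort(np.linalg.eigvalsh(L))[1]

# Path graph on 4 robots: connected, but only weakly so.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(algebraic_connectivity(A))  # ~0.586 > 0: graph is connected
```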

Relevance: 30.00%

Abstract:

The theme of “Belongingness” has a central role in the current debate on community psychology and in daily life. To understand the consequences of these changes, the research focuses on the concept of Sense of Community. Sense of Community has long been a central tool (Sarason, 1974; McMillan & Chavis, 1986) for studying communities (McMillan, 2011; Nowell & Boyd, 2011) and for a long time has represented positive values and something to aspire to. However, current debates indicate that Sense of Community is an outmoded concept and underline the problematic issue of promoting Sense of Community in multicultural contexts. The aim of the present research is to analyze Sense of Community in a multicultural context, as we consider that it can still be a fundamental tool for studying and understanding communities. In particular, we are interested in understanding the role of Multiple Sense of Community (Brodsky, 2009) in identity and wellbeing (and its dimensions). We focused on a specific context, the Station Zone in Reggio Emilia, which is characterized by high levels of cultural diversity and various social problems (Giovannini & Vezzali, 2011). The research is divided into two parts. The first part consists of an exploratory qualitative study that analyzes meanings of community among leaders of different ethnic groups living in the Station Zone. The second part consists of a “General Model” study and four parallel studies analyzing Multiple Sense of Community in different ethnic groups (Albanians, Moroccans, Chinese and Italians). The results indicate the different roles of Multiple SOC in the relation between identity and wellbeing, in particular the relevance of relational SOC and its different implications. Moreover, the factor “culture” represents a significant element when considering differences among ethnic groups.

Relevance: 30.00%

Abstract:

Modern imaging technologies, such as computed tomography (CT), represent a great challenge for forensic pathology. The field of forensics has experienced a rapid increase in the use of these new techniques to support investigations of critical cases, as indicated by the implementation of CT scanning by different forensic institutions worldwide. Advances in CT imaging techniques over the past few decades have led some authors to propose that virtual autopsy, a radiological method applied to post-mortem analysis, is a reliable alternative to traditional autopsy, at least in certain cases. We investigate the occurrence and causes of errors and mistakes in diagnostic imaging applied to virtual autopsy. A case of suicide by gunshot wound was submitted to full-body CT scanning before autopsy. Comparing the first examination of the sectional images with the autopsy findings, we found a preliminary misdiagnosis, a missed peritoneal gunshot lesion, due to radiologist error. We then discuss an emerging issue: the risk of diagnostic failure in virtual autopsy due to radiologist error, similar to what occurs in clinical radiology practice.

Relevance: 30.00%

Abstract:

Entire issue (large PDF file). Articles include:
- What Have We Learned from Articles Published in the Family Preservation Journal? Michael J. Holosko and D. Ann Holosko
- Family Reunion Services: An Examination of a Process Used to Successfully Reunite Families. Lois Pierce and Vince Geremia
- A Multi-Dimensional Approach to Evaluating Family Preservation Programs. Cynthia A. Ford and Felix A. Okojie
- Ordinary Families -- Extraordinary Care Giving. John P. Ronnau

Relevance: 30.00%

Abstract:

Entire issue (large PDF file). Articles include:
- A Model for Family Preservation Case Assessment. Kam-fong Monit Cheung, Patrick Leung and Sharon Alpert
- Behavioral Outcomes of Home-Based Services for Children and Adolescents with Serious Emotional Disorders. Edwin Morris, Lourdes Suarez and John C. Reid
- A Multi-Faceted, Intensive Family Preservation Program Evaluation. Michael Raschick
- Targeting Families to Receive Intensive Family Preservation Services: Assessing the Use of Imminent Risk of Placement as a Service Criterion. Elaine Walton and Ramona W. Denby

Relevance: 30.00%

Abstract:

Correct prediction of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed that combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction, cARX, and a recurrent neural network, RNN). Data fusion techniques based on i) Dempster-Shafer Evidential Theory (DST), ii) Genetic Algorithms (GA), and iii) Genetic Programming (GP) were used to merge the complementary performance of the prediction models. The fused output feeds a warning algorithm that issues alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance, with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before the occurrence of events were 13.0 and 12.1 min for hypo- and hyperglycemic events, respectively. Compared to the cARX and RNN models, and to a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
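
As a point of reference for the fusion schemes, the sketch below implements only the linear fusion baseline mentioned at the end of the abstract: least-squares weights for combining two predictor outputs. The DST/GA/GP schemes themselves are not reproduced, and all signals are synthetic, not patient data.

```python
import numpy as np

def linear_fusion_weights(pred_a, pred_b, reference):
    """Least-squares weights for linearly fusing two glucose predictors.
    This is the linear-fusion baseline the abstract compares against,
    not the DST/GA/GP schemes. All signals in mg/dL."""
    X = np.column_stack([pred_a, pred_b])
    w, *_ = np.linalg.lstsq(X, reference, rcond=None)
    return w

# Synthetic 30-min-ahead predictions around a slowly varying glucose trace.
rng = np.random.default_rng(0)
truth = 120 + 30 * np.sin(np.linspace(0, 6, 200))
carx = truth + rng.normal(0, 12, 200)  # hypothetical cARX output
rnn = truth + rng.normal(0, 9, 200)    # hypothetical RNN output

w = linear_fusion_weights(carx, rnn, truth)
fused = np.column_stack([carx, rnn]) @ w
rmse = lambda e: float(np.sqrt(np.mean(e ** 2)))
print(rmse(carx - truth), rmse(rnn - truth), rmse(fused - truth))  # fused is lowest
```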

Relevance: 30.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement.

This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality, and the quality of the good is known only to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade in high-quality goods. When repeated offers are allowed, however, both types of goods trade with probability one at equilibrium. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information; these findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on i) the degree to which players can renegotiate and gradually build up agreements and ii) the absence of a certain type of externalities that can loosely be described as incentives to free-ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
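
To make the Core connection in Chapter 3 concrete, the following sketch checks core membership in the simpler characteristic-function case, where a coalition's worth does not depend on how the outsiders are organized; partition functions generalize exactly this point. The three-player game and its payoffs are invented for illustration.

```python
from itertools import combinations

def in_core(payoffs, worth):
    """Core membership for a characteristic-function game. `worth` maps
    frozensets of players to coalitional worth. Partition functions (used in
    Chapter 3) generalize this: worth there also depends on how the *other*
    players are organized into coalitions."""
    players = range(len(payoffs))
    # Efficiency: the grand coalition's worth is fully distributed.
    if abs(sum(payoffs) - worth[frozenset(players)]) > 1e-9:
        return False
    # Stability: no proper coalition can profitably deviate on its own.
    for size in range(1, len(payoffs)):
        for coalition in combinations(players, size):
            if sum(payoffs[i] for i in coalition) < worth[frozenset(coalition)] - 1e-9:
                return False
    return True

# Illustrative 3-player game (made-up numbers).
v = {frozenset(s): w for s, w in [
    ((0,), 0), ((1,), 0), ((2,), 0),
    ((0, 1), 5), ((0, 2), 5), ((1, 2), 5),
    ((0, 1, 2), 9),
]}
print(in_core([3, 3, 3], v))  # True: every pair gets 6 >= 5
print(in_core([5, 2, 2], v))  # False: coalition {1, 2} gets 4 < 5 and blocks
```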

Relevance: 30.00%

Abstract:

This paper addresses the issue of fully automatic segmentation of hip CT images with the goal of preserving the joint structure for clinical applications in hip disease diagnosis and treatment. For this purpose, we propose a Multi-Atlas Segmentation Constrained Graph (MASCG) method. The MASCG method uses multi-atlas-based mesh fusion results to initialize a bone-sheetness-based multi-label graph cut for accurate hip CT segmentation, which has the inherent advantage of automatically separating the pelvic region from the bilateral proximal femoral regions. We then introduce a graph-cut-constrained graph search algorithm to further improve the segmentation accuracy around the bilateral hip joint regions. Taking manual segmentation as the ground truth, we evaluated the present approach on 30 hip CT images (60 hips) with 15-fold cross validation. Compared to manual segmentation, an average surface distance error of 0.30 mm, 0.29 mm, and 0.30 mm was found for the pelvis, the left proximal femur, and the right proximal femur, respectively. A closer look at the bilateral hip joint regions showed an average surface distance error of 0.16 mm, 0.21 mm and 0.20 mm for the acetabulum, the left femoral head, and the right femoral head, respectively.
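
The segmentation pipeline itself (multi-atlas mesh fusion plus graph cuts) is too heavy to reproduce here, but the evaluation metric reported above is simple: a symmetric average surface distance between the automatic and manual surfaces. A minimal sketch, treating each surface as a sampled point set:

```python
import numpy as np
from scipy.spatial import cKDTree

def average_surface_distance(surface_a, surface_b):
    """Symmetric average surface distance between two surfaces given as
    N x 3 point arrays (mm). This mirrors the evaluation metric reported
    in the abstract; the MASCG segmentation itself is not shown."""
    tree_a, tree_b = cKDTree(surface_a), cKDTree(surface_b)
    d_ab, _ = tree_b.query(surface_a)  # each point of A to nearest point of B
    d_ba, _ = tree_a.query(surface_b)  # and vice versa
    return (d_ab.mean() + d_ba.mean()) / 2.0

# Toy check: two unit spheres offset by 0.3 mm along x.
rng = np.random.default_rng(1)
pts = rng.normal(size=(5000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
print(average_surface_distance(pts, pts + [0.3, 0.0, 0.0]))  # close to 0.3
```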

Relevance: 30.00%

Abstract:

The spatial and temporal dynamics of seagrasses have been studied from the leaf to the patch (100 m²) scale. However, landscape-scale (>100 km²) seagrass population dynamics are unresolved in seagrass ecology. Previous remote sensing approaches have lacked the temporal or spatial resolution, or ecologically appropriate mapping, to fully address this issue. This paper presents a robust, semi-automated object-based image analysis approach for mapping dominant seagrass species, percentage cover and above-ground biomass using a time series of field data and coincident high spatial resolution satellite imagery. The study area was a 142 km² shallow, clear-water seagrass habitat (the Eastern Banks, Moreton Bay, Australia). Nine data sets acquired between 2004 and 2013 were used to create seagrass species and percentage cover maps through the integration of seagrass photo transect field data and atmospherically and geometrically corrected high spatial resolution satellite image data (WorldView-2, IKONOS and Quickbird-2) using an object-based image analysis approach. Biomass maps were derived using empirical models trained with in-situ above-ground biomass data per seagrass species. Maps and summary plots identified inter- and intra-annual variation of seagrass species composition, percentage cover level and above-ground biomass. The methods provide a rigorous approach for field and image data collection and pre-processing, a semi-automated approach to extract seagrass species and cover maps and assess accuracy, and the subsequent empirical modelling of seagrass biomass. The resultant maps provide a fundamental data set for understanding landscape-scale seagrass dynamics in a shallow water environment. Our findings provide proof of concept for the use of time-series analysis of remotely sensed seagrass products in seagrass ecology and management.
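
A plausible minimal form of the per-species empirical biomass models described above is a linear regression of above-ground biomass on percentage cover, fitted separately for each species. The paper's actual model form is not stated in the abstract, and the field records below (values and species alike) are illustrative only.

```python
import numpy as np

def fit_biomass_models(records):
    """Fit one biomass-vs-cover linear regression per species, a minimal
    stand-in for the 'empirical models trained with in-situ above ground
    biomass data per seagrass species' mentioned in the abstract."""
    models = {}
    for species in {r["species"] for r in records}:
        pts = [(r["cover"], r["biomass"]) for r in records if r["species"] == species]
        x, y = map(np.array, zip(*pts))
        slope, intercept = np.polyfit(x, y, 1)
        models[species] = (slope, intercept)
    return models

# Made-up field records (cover in %, biomass in g DW/m²).
records = [
    {"species": "Zostera muelleri", "cover": 20, "biomass": 8.0},
    {"species": "Zostera muelleri", "cover": 60, "biomass": 25.0},
    {"species": "Cymodocea serrulata", "cover": 30, "biomass": 18.0},
    {"species": "Cymodocea serrulata", "cover": 70, "biomass": 45.0},
]
models = fit_biomass_models(records)
slope, intercept = models["Zostera muelleri"]
print(f"predicted biomass at 40% cover: {slope * 40 + intercept:.1f} g DW/m²")
```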

Relevance: 30.00%

Abstract:

This article presents a multi-agent expert system (SMAF) that allows the input of incidents occurring in different elements of the telecommunications area. SMAF interacts with experts and general users, and each agent interacts with the whole agent community, recording incidents and their solutions in a knowledge base without analyzing their causes. Incidents are expressed using keywords taken from natural language (originally Spanish), and their main concepts are recorded, together with their severities, as the users express them. The system then searches for the best solution to each incident, aided by a human operator and by a notion of distance between incidents.
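
The abstract mentions a notion of distance between incidents without specifying it. The sketch below assumes a Jaccard distance on keyword sets, which is one simple way to realize the retrieval step; the incident records and solutions are invented.

```python
def jaccard_distance(a, b):
    """1 - |A intersect B| / |A union B| over two keyword sets; an assumed
    metric, since the paper does not specify its distance notion."""
    a, b = set(a), set(b)
    union = a | b
    if not union:
        return 0.0
    return 1.0 - len(a & b) / len(union)

def closest_incident(new_keywords, knowledge_base):
    """Retrieve the recorded incident whose keyword set is nearest to the
    new report; its stored solution is then proposed to the operator."""
    return min(knowledge_base,
               key=lambda rec: jaccard_distance(new_keywords, rec["keywords"]))

# Invented knowledge-base entries.
kb = [
    {"keywords": {"router", "link", "down"}, "solution": "restart interface"},
    {"keywords": {"dns", "timeout"}, "solution": "check resolver config"},
]
print(closest_incident({"link", "down", "alarm"}, kb)["solution"])  # restart interface
```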

Relevance: 30.00%

Abstract:

Distributed parallel execution systems speed up applications by splitting tasks into processes whose execution is assigned to different receiving nodes in a high-bandwidth network. On the distributing side, a fundamental problem is grouping and scheduling such tasks so that each one involves sufficient computational cost compared to the task creation and communication costs and other practical overheads. On the receiving side, an important issue is to have some assurance of the correctness and characteristics of the code received, and also of the kind of load the particular task is going to pose, which can be specified by means of certificates. In this paper we present, in a tutorial way, a number of general solutions to these problems, and illustrate them through their implementation in the Ciao multi-paradigm language and program development environment. This system includes facilities for parallel and distributed execution, an assertion language for specifying complex program properties (including safety and resource-related properties), and compile-time and run-time tools for performing automated parallelization and resource control, as well as certification of programs with resource consumption assurances and efficient checking of such certificates.
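
A rough sketch of the granularity-control decision described above: a task is shipped to a remote node only when its estimated cost sufficiently exceeds the creation and communication overheads. In Ciao these estimates come from compile-time resource analysis and assertions; here they are plain numbers, and the threshold is an invented tuning knob.

```python
def worth_parallelizing(task_cost, spawn_overhead, transfer_cost, threshold=2.0):
    """Ship a task to a remote node only when its estimated computational
    cost sufficiently exceeds the practical overheads. The cost inputs and
    the threshold are illustrative stand-ins for Ciao's inferred bounds."""
    return task_cost > threshold * (spawn_overhead + transfer_cost)

def schedule(tasks, spawn_overhead=1.0, transfer_cost=4.0):
    """Split tasks into those executed remotely and those kept local."""
    remote = [t for t in tasks
              if worth_parallelizing(t["cost"], spawn_overhead, transfer_cost)]
    local = [t for t in tasks if t not in remote]
    return remote, local

tasks = [{"name": "fib(30)", "cost": 800.0}, {"name": "fib(5)", "cost": 0.5}]
remote, local = schedule(tasks)
print([t["name"] for t in remote], [t["name"] for t in local])
# ['fib(30)'] ['fib(5)']: only the coarse-grained task is distributed
```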

Relevance: 30.00%

Abstract:

One major problem of the concurrent multi-path transfer (CMT) scheme in multi-homed mobile networks is that the use of different paths with diverse delays may cause packet reordering among packets of the same flow. In the case of TCP-like traffic, the reordering exacerbates the problem by bringing more timeouts and unnecessary retransmissions, which eventually degrades connection throughput considerably. To address this issue, we first propose Out-of-order Scheduling for In-order Arriving (OSIA), which exploits the sending time discrepancy to preserve in-order packet arrival. We then formulate optimal traffic scheduling as a constrained optimization problem and derive its closed-form solution via our proposed progressive water-filling method. We also present an implementation that enforces the optimal scheduling scheme using cascaded leaky buckets with multiple faucets, which provides simple guidelines on maximizing the utilization of aggregate bandwidth while decreasing the probability of triggering three dupACKs. Compared with previous work, the proposed scheme has lower computational complexity and also allows for dynamic network adaptability and finer-grain load balancing. Simulation results show that our scheme significantly alleviates reordering and enhances transmission performance.
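
A minimal sketch of the OSIA idea: given each packet's path assignment, stagger the sending instants by the path delays so that arrivals come out in sequence even though sends are out of order. The delays and spacing below are invented, and the optimal traffic split via progressive water-filling is not reproduced.

```python
def osia_schedule(assignments, path_delays, spacing=1.0):
    """Out-of-order Scheduling for In-order Arriving, simplest form: offset
    each packet's send time by its path delay so arrivals are sequenced.
    `spacing` is an illustrative per-packet arrival gap; the paper's optimal
    split is derived via progressive water-filling, not shown here."""
    max_delay = max(path_delays[p] for p in assignments)
    schedule = []
    for seq, path in enumerate(assignments):
        target_arrival = max_delay + seq * spacing
        schedule.append((seq, path, target_arrival - path_delays[path]))
    return schedule  # (sequence number, path, send time); sends may be out of order

# Two paths with invented one-way delays of 10 ms and 40 ms; packets alternate.
delays = {"wifi": 10.0, "cellular": 40.0}
for seq, path, send_at in osia_schedule(["wifi", "cellular"] * 3, delays):
    print(f"pkt {seq} via {path:8s} send at t={send_at:5.1f} ms")
# Packets on the slow path are released earlier, so arrivals stay in order.
```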