985 results for Adaptive negotiation agents
Abstract:
Climate change may pose both challenges and opportunities for viticulture, and much research has focused on studying the likely impacts on grape and wine production in different regions worldwide. This study assesses the vulnerability and adaptive capacity of the viticulture sector under changing climate conditions, based on a case study in the El Penedès region, Catalonia. Farm assets, livelihood strategies, farmer-market interactions and perceptions of climate change are analysed through semi-structured interviews with different types of wineries and growers. Both types of actors are equally exposed to biophysical stressors but unevenly affected by socioeconomic changes. While wineries are vulnerable because of the current economic crisis and the lack of diversification of their work, which may affect their income or production, growers are mainly affected by the low prices of their products and the lack of fixed contracts. These socioeconomic stressors strongly condition their capacity to adapt to climate change: growers prioritize their immediate income problems over future socioeconomic or climate threats. Growers therefore undertake reactive adaptation to changing climate conditions, based mainly on traditional knowledge, whilst wineries combine both reactive and anticipatory adaptation practices. These circumstances should be addressed in order to allow better anticipatory adaptation to be implemented, thus avoiding future climate threats.
Abstract:
Helping behavior is any intentional behavior that benefits another living being or group (Hogg & Vaughan, 2010). People tend to underestimate the probability that others will comply with their direct requests for help (Flynn & Lake, 2008). This implies that when people need help, they assess the probability of getting it (De Paulo, 1982, cited in Flynn & Lake, 2008) and tend to estimate a probability that is actually lower than the real chance, so they may not even consider it worth asking. Existing explanations attribute this phenomenon to a mistaken cost computation by the help seeker, who emphasizes the instrumental cost of saying "yes" while ignoring that the potential helper must also take into account the social cost of saying "no". Especially in face-to-face interactions, the discomfort caused by refusing to help can be very high. In short, help seekers tend to fail to realize that refusing a help request may be more costly than accepting it. A similar effect has been observed in estimating the trustworthiness of people: Fetchenhauer and Dunning (2010) showed that people tend to underestimate it as well. This bias is reduced when, instead of asymmetric feedback (received only when deciding to trust the other person), symmetric feedback (always given) is provided. The same cause could apply to help seeking, as people only receive feedback when they actually make a request, and not otherwise. Fazio, Shook, and Eiser (2004) studied something that could be reinforcing these outcomes: learning asymmetries. By means of a computer game called BeanFest, they showed that people learn better about negatively valenced objects (beans, in this case) than about positively valenced ones. This learning asymmetry stemmed from "information gain being contingent on approach behavior" (p. 293), which can be identified with what Fetchenhauer and Dunning call 'asymmetric feedback', and hence also with help requests. Fazio et al. also found a generalization asymmetry in favor of negative attitudes over positive ones, which they attributed to a negativity bias that "weights resemblance to a known negative more heavily than resemblance to a positive" (p. 300). Applied to help-seeking scenarios, this means that when facing an unknown situation, people tend to generalize and infer that a negative outcome is more likely than a positive one; together with the effects described above, people will thus be more inclined to expect a "no" when requesting help. Denrell and Le Mens (2011) present a different perspective on judgment biases in general. They deviate from the classical account of inappropriate information processing (depicted, among others, by Fiske & Taylor, 2007, and Tversky & Kahneman, 1974) and explain such biases in terms of 'adaptive sampling'. Adaptive sampling is a sampling mechanism in which the selection of sample items is conditioned by the previously observed values of the variable of interest (Thompson, 2011). Sampling adaptively allows individuals to safeguard themselves from experiences that once yielded negative outcomes. However, it also prevents them from giving those experiences a second chance and obtaining an updated outcome, which could be positive, more positive, or simply one that regresses to the mean. That, as Denrell and Le Mens (2011) explain, makes sense: if you go to a restaurant and you do not like the food, you do not choose that restaurant again. This is what we think could be happening when asking for help: when we get a "no", we stop asking.
Here, we provide a complementary explanation, based on adaptive sampling, for the underestimation of the probability that others will comply with our direct help requests. First, we develop and explain a model that represents the theory. We then test it empirically by means of experiments and elaborate on the analysis of the results.
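The adaptive-sampling account described above lends itself to a short simulation (a hypothetical illustration under assumed numbers, not the authors' model): each simulated help seeker stops asking after the first refusal, so refusals are over-represented in the sample and the population's average estimate of the compliance rate falls well below the true rate.

```python
import random

def seeker_estimate(true_p=0.8, max_asks=20, rng=random):
    """One help seeker sampling adaptively: ask for help until the
    first refusal (or a fixed horizon), then stop and keep the
    estimate formed from the requests actually made."""
    yes = 0
    for ask in range(1, max_asks + 1):
        if rng.random() < true_p:
            yes += 1            # help received: keep sampling
        else:
            return yes / ask    # a "no": stop sampling for good
    return yes / max_asks       # never refused within the horizon

def mean_estimate(n_seekers=20000, true_p=0.8, seed=42):
    """Average compliance-rate estimate across many adaptive samplers."""
    rng = random.Random(seed)
    estimates = [seeker_estimate(true_p, rng=rng) for _ in range(n_seekers)]
    return sum(estimates) / len(estimates)
```

With a true compliance rate of 0.8, the average estimate settles around 0.6: one "no" ends sampling, so the negative observation is never revised, exactly as in the restaurant example.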
Abstract:
Our work focuses on alleviating the workload of designers of adaptive courses in the complex task of authoring adaptive learning designs adjusted to specific user characteristics and the user context. We propose an adaptation platform consisting of a set of intelligent agents, where each agent carries out an independent adaptation task. The agents apply machine learning techniques to support user modelling for the adaptation process.
Abstract:
[Acte. 1759-01-19. Paris]
Abstract:
The efficacy and safety of anti-infective treatments are associated with the drug blood concentration profile, which is directly dependent on adjusting the dose to the individual patient's condition. The dosing adjustments for renal function recommended in reference books are often imprecise and infrequently applied in clinical practice. The recent generalisation of the KDOQI (Kidney Disease Outcome Quality Initiative) staging of chronically impaired renal function represents an opportunity to review and refine dosing recommendations for patients with renal insufficiency. The literature has been reviewed and compared to a predictive model of the fraction of drug cleared by the kidney based on Dettli's principle. Revised drug dosing recommendations integrating these predictive parameters are proposed.
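Dettli's principle mentioned above reduces to a simple proportional rule: total elimination is split into a nonrenal part and a renal part that scales with renal function. The sketch below uses illustrative numbers, not the study's revised recommendations.

```python
def dettli_dose_fraction(fe, patient_clcr, normal_clcr=100.0):
    """Dettli's proportional dose rule (minimal sketch).

    fe           : fraction of the drug eliminated unchanged by the kidney
                   in subjects with normal renal function (0..1)
    patient_clcr : patient creatinine clearance, mL/min
    normal_clcr  : reference clearance assumed for normal renal function

    Total elimination scales as nonrenal + renal parts:
        Q = (1 - fe) + fe * (patient_clcr / normal_clcr)
    The usual dose is multiplied by Q (dosing interval kept unchanged).
    """
    kf = patient_clcr / normal_clcr  # renal function as a fraction of normal
    return (1.0 - fe) + fe * kf

# Example: a drug 70% renally cleared, patient CLcr of 30 mL/min
q = dettli_dose_fraction(fe=0.7, patient_clcr=30.0)
# q = 0.3 + 0.7 * 0.3 = 0.51 -> roughly half the usual dose
```

Note how the rule behaves at the extremes: a purely nonrenally cleared drug (fe = 0) needs no adjustment, while a fully renally cleared drug (fe = 1) is scaled in direct proportion to the remaining renal function.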
Abstract:
Anti-strip agents can affect the temperature susceptibility of asphalt cement. This concern was expressed at the 33rd Annual Bituminous Conference in St. Paul, Minnesota, by Mr. David Gendell, Director of Highway Operations. This study compares the viscosity-temperature relationships of asphalt cement with and without the addition of an anti-strip agent.
Abstract:
The state of the art for describing image quality in medical imaging is to assess the performance of an observer conducting a task of clinical interest. This can be done using a model observer that yields a figure of merit such as the signal-to-noise ratio (SNR). Using the non-prewhitening (NPW) model observer, we objectively characterised the evolution of this figure of merit under various acquisition conditions. The NPW model observer usually requires the modulation transfer function (MTF) as well as noise power spectra. However, although computing the MTF poses no problem with the traditional filtered back-projection (FBP) algorithm, this is not the case with iterative reconstruction (IR) algorithms such as adaptive statistical iterative reconstruction (ASIR) or model-based iterative reconstruction (MBIR). Given that the target transfer function (TTF) had already been shown to accurately express the system resolution even with non-linear algorithms, we tuned the NPW model observer by replacing the standard MTF with the TTF. The TTF was estimated using a custom-made phantom containing cylindrical inserts surrounded by water. The contrast differences between the inserts and water were plotted for each acquisition condition, and mathematical transformations were then performed to obtain the TTF. As expected, the first results showed that the TTF depends on image contrast and noise levels for both ASIR and MBIR. Moreover, FBP also proved to depend on contrast and noise when using the lung kernel. These results were then introduced into the NPW model observer. We observed an enhancement of the SNR every time we switched from FBP to ASIR to MBIR. IR algorithms greatly improve image quality, especially in low-dose conditions. Based on our results, the use of MBIR could lead to further dose reduction in several clinical applications.
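The substitution at the heart of this approach (the TTF taking the place of the MTF inside the NPW figure of merit) can be written down in a few lines. The task spectrum, TTF, and NPS below are made-up illustrative curves, not the phantom measurements from the study.

```python
import numpy as np

def snr_npw(task_spectrum, ttf, nps, df):
    """Non-prewhitening (NPW) model-observer SNR, with the TTF used in
    place of the usual MTF (as proposed for iterative reconstruction):

        SNR^2 = ( sum |W(f)|^2 df )^2  /  ( sum |W(f)|^2 NPS(f) df ),
        W(f)  = task_spectrum(f) * TTF(f)
    """
    w2 = (np.abs(task_spectrum) * ttf) ** 2
    numerator = (np.sum(w2) * df) ** 2
    denominator = np.sum(w2 * nps) * df
    return np.sqrt(numerator / denominator)

# Illustrative 1-D example: Gaussian task, exponential TTF, white NPS
f = np.linspace(0.0, 2.0, 200)      # spatial frequency axis
df = f[1] - f[0]
task = np.exp(-(f / 0.5) ** 2)      # signal-difference spectrum (assumed)
ttf = np.exp(-f / 1.0)              # resolution at this contrast/noise
nps = 0.01 * np.ones_like(f)        # flat noise power spectrum (assumed)
snr = snr_npw(task, ttf, nps, df)
```

Because the denominator is linear in the NPS, halving the noise power raises the SNR by a factor of sqrt(2), which matches the qualitative observation that IR algorithms (lower noise) improve the NPW figure of merit.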
Abstract:
Empirical analysis of the tourist image of the city of Girona conveyed by organic agents through Internet blogs and, from there, a deduction of the image perceived by potential tourists.
Abstract:
Protease-sensitive macromolecular prodrugs have attracted interest for bio-responsive drug delivery to sites with up-regulated proteolytic activity, such as inflammatory or cancerous lesions. Here we report the development of a novel polymeric photosensitizer prodrug (T-PS) to target thrombin, a protease up-regulated in the synovial tissues of rheumatoid arthritis (RA) patients, for minimally invasive photodynamic synovectomy. In T-PS, multiple photosensitizer units are tethered to a polymeric backbone via short, thrombin-cleavable peptide linkers. Photoactivity of the prodrug is efficiently impaired by energy transfer between neighbouring photosensitizer units. T-PS activation by exogenous and endogenous thrombin induced a 16-fold increase in fluorescence emission after in vitro digestion and a selective fluorescence enhancement in arthritic lesions in vivo, in a collagen-induced arthritis mouse model. In vitro studies on primary human synoviocytes showed a phototoxic effect only after enzymatic digestion of the prodrug and light irradiation, demonstrating the functionality of T-PS-induced PDT. The developed photosensitizer prodrugs combine the passive targeting capacity of macromolecular drug delivery systems with site-selective photosensitizer release and activation: they illuminate lesions with pathologically enhanced proteolytic activity and, upon irradiation, induce cell death.
Abstract:
With six targeted agents approved (sorafenib, sunitinib, temsirolimus, bevacizumab [+interferon], everolimus and pazopanib), many patients with metastatic renal cell carcinoma (mRCC) will receive multiple therapies. However, the optimum sequencing approach has not been defined. A group of European experts reviewed available data and shared their clinical experience to compile an expert agreement on the sequential use of targeted agents in mRCC. To date, there are few prospective studies of sequential therapy. The mammalian target of rapamycin (mTOR) inhibitor everolimus was approved for use in patients who failed treatment with inhibitors of vascular endothelial growth factor (VEGF) and VEGF receptors (VEGFR) based on the results from a Phase III placebo-controlled study; however, until then, the only licensed agents across the spectrum of mRCC were VEGF(R) inhibitors (sorafenib, sunitinib and bevacizumab + interferon), and as such, a large body of evidence has accumulated regarding their use in sequence. Data show that sequential use of VEGF(R) inhibitors may be an effective treatment strategy to achieve prolonged clinical benefit. The optimal place of each targeted agent in the treatment sequence is still unclear, and data from large prospective studies are needed. The Phase III AXIS study of second-line sorafenib vs. axitinib (including post-VEGF(R) inhibitors) has completed, but the data are not yet published; other ongoing studies include the Phase III SWITCH study of sorafenib-sunitinib vs. sunitinib-sorafenib (NCT00732914); the Phase III 404 study of temsirolimus vs. sorafenib post-sunitinib (NCT00474786) and the Phase II RECORD 3 study of sunitinib-everolimus vs. everolimus-sunitinib (NCT00903175). Until additional data are available, consideration of patient response and tolerability to treatment may facilitate current decision-making regarding when to switch and which treatment to switch to in real-life clinical practice.
Abstract:
In previous work we showed that sinusoidal whole-body rotations producing continuous vestibular stimulation affected the timing of motor responses as assessed with a paced finger tapping (PFT) task (Binetti et al. (2010). Neuropsychologia, 48(6), 1842-1852). Here, in two new psychophysical experiments, one purely perceptual and one with both sensory and motor components, we explored the relationship between body motion/vestibular stimulation and the perceived timing of acoustic events. In experiment 1, participants were required to discriminate sequences of acoustic tones with different degrees of acceleration or deceleration. We found that a tone sequence presented during acceleratory whole-body rotations required a progressive increase in rate in order to be considered temporally regular, consistent with the idea of an increase in "clock" frequency and an overestimation of time. In experiment 2, participants produced self-paced taps that entailed acoustic feedback. Tapping frequency in this task was affected by periodic motion through anticipatory and congruent (in-phase) fluctuations, irrespective of the self-generated sensory feedback. On the other hand, synchronizing taps to an external rhythm determined a completely opposite (delayed/counter-phase) modulation. Overall, this study shows that body displacements "remap" our metric of time, affecting not only motor output but also sensory input.
Abstract:
A workshop recently held at the Ecole Polytechnique Federale de Lausanne (EPFL, Switzerland) was dedicated to understanding the genetic basis of adaptive change, taking stock of the different approaches developed in theoretical population genetics and landscape genomics and bringing together the knowledge accumulated in both research fields. An important challenge in theoretical population genetics is to incorporate the effects of demographic history and population structure, but design problems (e.g. a focus on populations as units, a focus on hard selective sweeps, no hypothesis-based framework in the design of the statistical tests) reduce the capability of these methods to detect adaptive genetic variation. In parallel, landscape genomics offers a solution to several of these problems and provides a number of advantages (e.g. fast computation, integration of landscape heterogeneity), but the approach makes several implicit assumptions that should be carefully considered (e.g. that selection has had enough time to create a functional relationship between the allele distribution and the environmental variable, or that this functional relationship is constant). To address the respective strengths and weaknesses mentioned above, the workshop brought together a panel of experts from both disciplines to present their work and discuss the relevance of combining these approaches, possibly resulting in a joint software solution in the future.