991 results for Observed information
Abstract:
The aim of this thesis was to investigate the respective contributions of prior information and sensorimotor constraints to action understanding, and to estimate their consequences for the evolution of human social learning. Even though a large body of literature is dedicated to the study of action understanding and its role in social learning, these issues are still largely debated. Here, I critically describe two main perspectives. The first perspective interprets faithful social learning as an outcome of a fine-grained representation of others’ actions and intentions that requires sophisticated socio-cognitive skills. In contrast, the second perspective highlights the role of simpler decision heuristics, the recruitment of which is determined by individual and ecological constraints. The present thesis aims to show, through four experimental works, that these two contributions are not mutually exclusive. A first study investigates the role of the inferior frontal cortex (IFC), the anterior intraparietal area (AIP) and the primary somatosensory cortex (S1) in the recognition of other people’s actions, using a transcranial magnetic stimulation adaptation paradigm (TMSA). The second work studies whether, and how, higher-order and lower-order prior information (acquired from the probabilistic sampling of past events vs. derived from an estimation of the biomechanical constraints of observed actions) interact during the prediction of other people’s intentions. Using a single-pulse TMS procedure, the third study investigates whether the interaction between these two classes of priors modulates motor system activity. The fourth study tests the extent to which behavioral and ecological constraints influence the emergence of faithful social learning strategies at the population level. The collected data help to elucidate how higher-order and lower-order prior expectations interact during action prediction, and clarify the neural mechanisms underlying this interaction. Finally, these works open promising perspectives for a better understanding of social learning, with possible extensions to animal models.
Abstract:
Holding the major share of the stellar mass in galaxies, and being old and passively evolving, early-type galaxies (ETGs) are the primary probes for investigating these various evolution scenarios, as well as useful means to provide insights on cosmological parameters. In this thesis work I focused specifically on ETGs and on their capability to constrain galaxy formation and evolution; in particular, the principal aims were to derive some of the ETGs' evolutionary parameters, such as age, metallicity and star formation history (SFH), and to study their age-redshift and mass-age relations. In order to infer galaxy physical parameters, I used the public code STARLIGHT: this program provides a best fit to the observed spectrum from a combination of many theoretical models defined in user-made libraries. The comparison between the output and input light-weighted ages shows good agreement starting from SNRs of ∼10, with a bias of ∼2.2% and a dispersion of ∼3%. Metallicities and SFHs are also well reproduced. In the second part of the thesis I performed an analysis on real data, starting from Sloan Digital Sky Survey (SDSS) spectra. I found that galaxies get older with cosmic time and with increasing mass (for a fixed redshift bin); absolute light-weighted ages, instead, turn out to be independent of the fitting parameters or the synthetic models used. The metallicities are very similar to each other and clearly consistent with those derived from the Lick indices. The predicted SFH indicates the presence of a double burst of star formation. Velocity dispersions and extinctions are also well constrained, following the expected behaviours. As a further step, I also fitted single SDSS spectra (with SNR ∼ 20) to verify that stacked spectra gave the same results without introducing any bias: this is an important check if one wants to apply the method at higher z, where stacked spectra are necessary to increase the SNR. Our upcoming aim is to apply this approach also to galaxy spectra obtained from higher-redshift surveys, such as BOSS (z ∼ 0.5), zCOSMOS (z ∼ 1), K20 (z ∼ 1), GMASS (z ∼ 1.5) and, eventually, Euclid (z ∼ 2). Indeed, I am currently carrying out a preliminary study to establish the applicability of the method to lower-resolution, as well as higher-redshift (z ∼ 2), spectra, such as the Euclid ones.
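As an illustration of the kind of full-spectrum fitting a code such as STARLIGHT performs, the sketch below models an observed spectrum as a non-negative combination of template model spectra and derives a light-weighted age from the fitted weights. The wavelength grid, template shapes and ages are hypothetical placeholders, not STARLIGHT's actual base models or algorithm.

```python
# Minimal sketch of full-spectrum fitting: an observed spectrum is modelled as a
# non-negative combination of simple stellar population (SSP) templates, and a
# light-weighted age is derived from the fitted weights. Templates and ages are
# hypothetical placeholders, not STARLIGHT's actual base.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

wave = np.linspace(3800.0, 7000.0, 500)        # wavelength grid (Angstrom)
ssp_ages = np.array([0.1, 1.0, 5.0, 10.0])     # template ages in Gyr (hypothetical)

# Hypothetical SSP templates: smooth continua with an age-dependent slope.
templates = np.array([
    (wave / 5000.0) ** (-1.5 + 0.3 * np.log10(age * 10))
    for age in ssp_ages
])

# Build a fake "observed" spectrum from known weights plus noise (SNR ~ 10).
true_weights = np.array([0.1, 0.2, 0.3, 0.4])
observed = true_weights @ templates
observed += rng.normal(0.0, observed / 10.0)

# Non-negative least-squares fit of the observed spectrum.
weights, residual = nnls(templates.T, observed)

# Light-weighted age: the weights act as light fractions in this toy setup.
light_fractions = weights / weights.sum()
lw_age = np.sum(light_fractions * ssp_ages)
print(f"light-weighted age: {lw_age:.2f} Gyr (input: "
      f"{np.sum(true_weights / true_weights.sum() * ssp_ages):.2f} Gyr)")
```

Comparing the recovered light-weighted age with the input value, as done here, mirrors the bias/dispersion test on simulated spectra described in the abstract.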
Abstract:
This study investigated the feasibility and quality of distributing oral medications in single-dose blister packs per individual dosage form (EVA). The study was conducted as an open, comparative, prospective, multicenter patient study. Diovan®, CoDiovan® and amlodipine were available as study medication in the EVA packaging. The distribution error rate in the EVA and control groups was the primary endpoint. Patient knowledge, patient satisfaction and the practicability of the EVA system, as well as the satisfaction of the nursing staff, were evaluated using questionnaires. A total of 2070 valid tablet administrations in 332 patients at six different hospitals were examined. A distribution error rate of 1.8% was found in the EVA group and 0.7% in the control group. Of the patient questionnaires, a total of 292 could be evaluated. The results showed an insufficient level of patient knowledge about their current oral medications. In the 80 completed nursing staff questionnaires, over 80% stated that errors in preparing medications could be detected more easily with the EVA system. In summary, the higher error rate in the EVA group compared to the control group was due to several confounding factors. Overall, a very positive response to the EVA system was observed among both patients and nursing staff.
Abstract:
Many metabolites in the proton magnetic resonance spectrum undergo magnetization exchange with water, such as those in the downfield region (6.0-8.5 ppm) and the upfield peaks of creatine, which can be measured to reveal additional information about the molecular environment. In addition, these resonances are attenuated by conventional water suppression techniques, complicating detection and quantification. To characterize these metabolites in human skeletal muscle in vivo at 3 T, metabolite-cycled non-water-suppressed spectroscopy was used to conduct a water inversion transfer experiment in both the soleus and tibialis anterior muscles. The resulting median exchange-independent T1 times for the creatine methylene resonances were 1.26 and 1.15 s, and for the methyl resonances were 1.57 and 1.74 s, for the soleus and tibialis anterior muscles, respectively. Magnetization transfer rates from water to the creatine methylene resonances were 0.56 and 0.28 s⁻¹, and to the methyl resonances were 0.39 and 0.30 s⁻¹, with the soleus exhibiting faster transfer rates for both resonances, allowing speculation about possible influences of either muscle fibre orientation or muscle composition on the magnetization transfer process. These water magnetization transfer rates observed without water suppression are in good agreement with earlier reports that used either post-excitation water suppression in rats, or short CHESS sequences in human brain and skeletal muscle.
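As a simplified illustration of how a T1 value is extracted from inversion data, the sketch below fits a mono-exponential inversion-recovery curve with scipy. The actual water inversion transfer analysis relies on a two-pool exchange model; the inversion delays, signal values and fitted parameters here are hypothetical.

```python
# Simplified sketch: fit a mono-exponential inversion-recovery curve to recover T1.
# The real water inversion transfer experiment uses a two-pool exchange model;
# the inversion times and signals below are hypothetical illustrative values.
import numpy as np
from scipy.optimize import curve_fit

def inversion_recovery(ti, m0, t1, inv_eff):
    """Longitudinal signal after inversion: M(TI) = M0 * (1 - 2*f*exp(-TI/T1))."""
    return m0 * (1.0 - 2.0 * inv_eff * np.exp(-ti / t1))

rng = np.random.default_rng(1)
ti = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0, 4.0, 8.0])   # inversion times (s)
signal = inversion_recovery(ti, m0=1.0, t1=1.26, inv_eff=0.95)
signal += rng.normal(0.0, 0.02, size=ti.size)               # add measurement noise

popt, pcov = curve_fit(inversion_recovery, ti, signal, p0=[1.0, 1.0, 0.9])
m0_fit, t1_fit, inv_eff_fit = popt
print(f"fitted T1 = {t1_fit:.2f} s (simulated value: 1.26 s)")
```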
Abstract:
Background: Public information about the prevention of zoonoses should be based on the problem as perceived by the public and should be adapted to regional circumstances. Growing fox populations have led to increasing concern about human alveolar echinococcosis, which is caused by the fox tapeworm Echinococcus multilocularis. In order to plan information campaigns, public knowledge about this zoonotic tapeworm was assessed. Methods: By means of representative telephone interviews (N = 2041), a survey of public knowledge about the risk and the prevention of alveolar echinococcosis was carried out in the Czech Republic, France, Germany and Switzerland in 2004. Results: For all five questions, significant country-specific differences were found. Fewer people had heard of E. multilocularis in the Czech Republic (14%) and France (18%) than in Germany (63%) and Switzerland (70%). The same effect was observed when only highly endemic regions were considered (Czech Republic: 20%, France: 17%, Germany: 77%, Switzerland: 61%). In France, 17% of people who knew the parasite felt reasonably informed. In the other countries, the majority felt reasonably informed (54–60%). The percentage that perceived E. multilocularis as a high risk ranged from 12% (Switzerland) to 43% (France). In some countries, promising measures such as deworming dogs (Czech Republic, Switzerland) were not recognized as prevention options. Conclusion: Our results and the actual epidemiological circumstances of alveolar echinococcosis call for proactive information programs. This communication should enable the public to achieve a realistic risk perception, give clear information on how people can minimize their infection risk, and prevent exaggerated reactions and anxiety.
Abstract:
Currently, more than half of Electronic Health Record (EHR) projects fail. Most of these failures are not due to flawed technology, but rather to the lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatching among users, activities, and systems is a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework called the Functional Framework was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models: the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the user’s perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were followed quietly and observed during their activities. These three models were combined to form a unified model. From the unified model, the work domain ontology was developed by asking users to rate the functions in the unified model (190 in total) along the dimensions of frequency and criticality in a survey. The functional discrepancies, as indicated by the regions of the Venn diagram formed by the three models, were consistent with the survey results, especially with user satisfaction. The survey for the Functional Framework also indicated a preference for one system over the other (R = 0.895). The results of this project show that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and the generalizability of the Functional Framework are discussed.
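The functional discrepancies described above amount to comparing three sets of functions. A minimal sketch of that comparison is shown below; the function names are made-up placeholders rather than any of the 190 functions actually rated in the study.

```python
# Minimal sketch: functional discrepancies as regions of the Venn diagram formed by
# the User, Designer, and Activity models. Function names are made-up placeholders.
user_model = {"enter diagnosis", "view x-ray", "schedule appointment", "print invoice"}
designer_model = {"enter diagnosis", "view x-ray", "audit log", "print invoice"}
activity_model = {"enter diagnosis", "schedule appointment", "call patient"}

# Functions supported by all three models (no discrepancy).
aligned = user_model & designer_model & activity_model

# Needed or performed by users but absent from the system (unmet needs).
unmet = (user_model | activity_model) - designer_model

# Built into the system but neither desired nor used (over-design).
unused = designer_model - (user_model | activity_model)

print("aligned:", sorted(aligned))
print("unmet needs:", sorted(unmet))
print("unused system functions:", sorted(unused))
```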
Abstract:
We investigated attention, encoding and processing of social aspects of complex photographic scenes. Twenty-four high-functioning adolescents (aged 11–16) with ASD and 24 typically developing matched control participants viewed and then described a series of scenes, each containing a person. Analyses of eye movements and verbal descriptions provided converging evidence that both groups displayed general interest in the person in each scene but the salience of the person was reduced for the ASD participants. Nevertheless, the verbal descriptions revealed that participants with ASD frequently processed the observed person’s emotion or mental state without prompting. They also often mentioned eye-gaze direction, and there was evidence from eye movements and verbal descriptions that gaze was followed accurately. The combination of evidence from eye movements and verbal descriptions provides a rich insight into the way stimuli are processed overall. The merits of using these methods within the same paradigm are discussed.
Abstract:
BACKGROUND: Despite the chronic and relapsing nature of inflammatory bowel diseases (IBD), at least 30% to 45% of patients are noncompliant with treatment. IBD patients often seek information about their disease. AIM: To examine the association between information-seeking activity and treatment compliance among IBD patients, and to compare information sources and concerns between compliant and noncompliant patients. METHODS: We used data from the Swiss IBD cohort study and from a qualitative survey conducted to assess information sources and concerns. Crude and adjusted odds ratios (OR) for noncompliance were calculated. Differences in the proportions of information sources and concerns were compared between compliant and noncompliant patients. RESULTS: A total of 512 patients were included. About 18% (n = 99) of patients were reported to be noncompliant with drug treatment and two-thirds (n = 353) were information seekers. The OR for noncompliance among information seekers was 2.44 (95% CI: 1.34-4.41) after adjustment for confounders and major risk factors. General practitioners were 15.2% more often consulted (p = 0.019) among compliant patients, as were books and television (+13.1%; p = 0.048), whereas no difference in proportions was observed for sources such as the internet or gastroenterologists. Information on tips for disease management was 14.2% more often sought among noncompliant patients (p = 0.028). No difference was observed for concerns about research and development on IBD or therapies. CONCLUSION: In Switzerland, IBD patients noncompliant with treatment sought disease-related information more often than compliant patients. Daily management of symptoms and disease seemed to be an important concern of those patients.
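For reference, a crude odds ratio of this kind is computed from a 2x2 table as in the sketch below. The cell counts are chosen to be consistent with the reported margins (512 patients, 99 noncompliant, 353 information seekers) but the exact split is hypothetical, and the study's reported OR of 2.44 is additionally adjusted for confounders.

```python
# Sketch: crude odds ratio and 95% CI (Woolf method) for noncompliance among
# information seekers. Cell counts are hypothetical (only the margins match the
# abstract); the reported OR of 2.44 is also confounder-adjusted.
import math

a = 80   # information seekers, noncompliant
b = 273  # information seekers, compliant
c = 19   # non-seekers, noncompliant
d = 140  # non-seekers, compliant

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"crude OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```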
Abstract:
We present an application- and sample-independent method for the automatic discrimination of noise and signal in optical coherence tomography B-scans. The proposed algorithm models the observed noise probabilistically and allows for a dynamic determination of image noise parameters and the choice of appropriate image rendering parameters. This overcomes observer variability and the need for a priori information about the content of sample images, both of which are challenging to estimate systematically with current systems. As such, our approach has the advantage of automatically determining crucial parameters for evaluating rendered image quality in a systematic and task-independent way. We tested our algorithm on data from four different biological and non-biological samples (index finger, lemon slices, sticky tape, and detector cards) acquired with three different experimental spectral-domain optical coherence tomography (OCT) measurement systems, including a swept-source OCT. The results are compared to parameters determined manually by four experienced OCT users. Overall, our algorithm works reliably regardless of which system and sample are used and estimates noise parameters in all cases within the confidence interval of those found by the observers.
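As a rough illustration of estimating noise parameters from a B-scan, the sketch below fits a Rayleigh model to a noise-only region and derives a display threshold. This is a generic assumed approach with simulated data, not the paper's specific probabilistic model or its rendering-parameter choice.

```python
# Illustrative sketch: estimate noise parameters of an OCT B-scan by fitting a
# Rayleigh model to background pixels and deriving a rendering threshold. This is
# an assumed generic approach applied to simulated data, not the paper's model.
import numpy as np

rng = np.random.default_rng(2)

# Fake B-scan: Rayleigh-distributed noise floor plus a bright "sample" band.
bscan = rng.rayleigh(scale=4.0, size=(512, 1024))
bscan[200:260, :] += rng.normal(60.0, 10.0, size=(60, 1024))

# Use the region above the sample surface as a noise-only reference.
background = bscan[:100, :]

# Maximum-likelihood estimate of the Rayleigh scale parameter.
sigma_hat = np.sqrt(np.mean(background ** 2) / 2.0)

# Display threshold above which a pixel is unlikely (p < 0.001) to be pure noise.
threshold = sigma_hat * np.sqrt(-2.0 * np.log(1e-3))
signal_mask = bscan > threshold
print(f"estimated noise scale: {sigma_hat:.2f}, display threshold: {threshold:.2f}")
print(f"fraction of pixels classified as signal: {signal_mask.mean():.3f}")
```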
Abstract:
Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories’ efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one’s bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker’s quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker’s type. The cost could be pure time cost from delaying agreement or the cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states that they are of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility of making repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, much in line with the model’s qualitative predictions. We also observe a persistent over-delay before trade occurs, and this reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free-ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
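To make the connection to the Core concrete, the sketch below checks core membership for a small three-player game in characteristic-function form. This simplified example uses hypothetical coalition values and ignores the externalities that the partition-function framework of Chapter 3 is designed to capture.

```python
# Sketch: core membership in a 3-player characteristic-function game. Values are
# hypothetical, and unlike the thesis' partition functions this simple form ignores
# externalities (a coalition's worth does not depend on how outsiders are grouped).
from itertools import combinations

players = (1, 2, 3)
v = {  # coalitional worth for every non-empty coalition
    (1,): 0.0, (2,): 0.0, (3,): 0.0,
    (1, 2): 0.6, (1, 3): 0.6, (2, 3): 0.6,
    (1, 2, 3): 1.0,
}

def in_core(allocation, tol=1e-9):
    """An allocation is in the core if it is efficient and no coalition can block it."""
    if abs(sum(allocation.values()) - v[players]) > tol:
        return False  # not efficient: grand-coalition worth not fully distributed
    for size in range(1, len(players) + 1):
        for coalition in combinations(players, size):
            if sum(allocation[i] for i in coalition) < v[coalition] - tol:
                return False  # this coalition could do better on its own
    return True

print(in_core({1: 1 / 3, 2: 1 / 3, 3: 1 / 3}))  # True: the equal split is unblocked
print(in_core({1: 0.7, 2: 0.2, 3: 0.1}))        # False: coalition (2, 3) blocks
```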
Abstract:
Grain legume production in Europe has decreased in recent years, while legume demand has rapidly increased due to the growth of meat production. Therefore, Europe imports grain legumes, principally soybeans, to meet feed protein requirements. Various investigations have identified problems and benefits of local grain legume cultivation. Nevertheless, grain legume cultivation has still not increased in recent years. Studies investigating why farmers do not cultivate grain legumes are missing. Here, we surveyed farmers' knowledge about grain legume cultivation, the problems and constraints of grain legume cultivation, and the barriers faced by and incentives needed by farmers. We sent a questionnaire to 1373 farmers in Luxembourg, with a response rate of 29%. The results show that only 17% of the responding farmers cultivated grain legumes; 88% of the conventional farmers did not cultivate grain legumes, while 85% of the organic farmers did. We observed that Luxembourgish farmers feel badly informed about grain legume cultivation; organic farmers generally feel better informed than their conventional colleagues. The main barrier named by Luxembourgish farmers for not cultivating grain legumes is not economic issues but a lack of knowledge and extension services for these crops. The main incentives needed to start grain legume cultivation in the future are economic. Even though grain legume producers mentioned several negative experiences with grain legume cultivation, they are not discouraged by the poor economic conditions and appreciate the benefits of grain legume cultivation. Overall, our findings show that research results on grain legumes should be better disseminated to extension services and farmers.
Abstract:
We address ethical consumption using a natural field experiment on the actual purchase of Fair Trade (FT) coffee in three supermarkets in Germany. Based on a quasi-experimental before-and-after design, the effects of three different treatments – information, a 20% price reduction, and a moral appeal – are analyzed. Sales data capture actual ethical purchase behavior and avoid problems of social desirability, but they offer only limited insight into the motivations of individual consumers. We therefore complemented the field experiment with a customer survey that allows us to contrast observed (ethical) buying behavior with self-reported FT consumption. Results from the experiment suggest that only the price reduction had the expected positive and statistically significant effect on FT consumption.
Abstract:
The aim of this study was to determine cancer mortality rates for the United Arab Emirates (UAE) and to create an atlas of cancer mortality for the UAE. This atlas is the first of its kind in the Gulf region and the Middle East. Death certificates for the period from January 1, 1990 to December 31, 1999 were reviewed and cancer deaths were identified. Cancer mortality cases were verified by comparison with medical records. Age-adjusted cancer mortality rates were calculated by gender, emirate/medical district and nationality (UAE nationals and the overall UAE population). Individual rates for each emirate were compared to the overall rate of the corresponding population for the same cancer site and gender. Age-adjusted rates were mapped using MapInfo software. High rates of liver, lung and stomach cancer were observed in Abu Dhabi, Dubai and the northern emirates, respectively. Rates for UAE nationals were higher than those for the overall UAE population. Several factors were suggested that may account for the high rates of specific cancers observed in certain emirates. It is hoped that this atlas will provide leads to guide further epidemiologic and public health activities aimed at preventing cancer.
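Age-adjusted rates of the kind mapped in such an atlas are obtained by direct standardization, as in the minimal sketch below. The age groups, death counts, population sizes and standard-population weights are hypothetical placeholders, not values from the UAE death-certificate review.

```python
# Sketch of direct age standardization: age-specific rates are weighted by a
# standard population's age structure. All counts and weights are hypothetical.
age_groups = ["0-14", "15-44", "45-64", "65+"]
deaths = [4, 30, 120, 260]                          # cancer deaths per age group
population = [300_000, 900_000, 400_000, 100_000]   # person-years at risk
std_weights = [0.26, 0.45, 0.19, 0.10]              # standard population fractions

age_specific = [d / p * 100_000 for d, p in zip(deaths, population)]
adjusted_rate = sum(r * w for r, w in zip(age_specific, std_weights))

crude_rate = sum(deaths) / sum(population) * 100_000
print(f"crude rate: {crude_rate:.1f} per 100,000")
print(f"age-adjusted rate: {adjusted_rate:.1f} per 100,000")
```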
Abstract:
In the last two decades, trade liberalization under the GATT/WTO has been partly offset by an increase in antidumping protection. Economists have argued convincingly that this is partly due to the inclusion of sales below cost in the definition of dumping during the GATT Tokyo Round. The introduction of the cost-based dumping definition gives regulating authorities greater latitude to set protection according to their liking. This paper investigates the domestic government's antidumping duty choice in an asymmetric information framework in which the foreign firm's cost is observed by the domestic firm, but not by the government. To induce truthful revelation, the government can design a tariff schedule, contingent on the firms' cost reports, accompanied by a threat to collect additional information for report verification (i.e., auditing) and, in case misreporting is detected, to set penalty duties. We show that, depending on the concrete assumptions, the domestic government may not only be able to extract the true cost information but may also succeed in implementing the full-information, governmental welfare-maximizing duty. In this case, the antidumping framework within the GATT/WTO not only offers the means to pursue strategic trade policy disguised as fair trade policy, but also helps overcome the informational problems involved in correctly determining the optimal strategic trade policy.
Abstract:
Purpose. To examine the association between living in proximity to Toxics Release Inventory (TRI) facilities and the incidence of childhood cancer in the State of Texas. Design. This is a secondary data analysis utilizing the publicly available Toxics Release Inventory (TRI), maintained by the U.S. Environmental Protection Agency, which lists the facilities that release any of the 650 TRI chemicals. Total childhood cancer cases and childhood cancer rates (ages 0-14 years) by county for the years 1995-2003 were taken from the Texas Cancer Registry, available at the Texas Department of State Health Services website. Setting. This study was limited to the child population of the State of Texas. Method. Analysis was done using Stata version 9 and SPSS version 15.0. SaTScan was used for geographical spatial clustering of childhood cancer cases based on county centroids, using the Poisson clustering algorithm, which adjusts for population density. Pictorial maps were created using MapInfo Professional version 8.0. Results. One hundred and twenty-five counties had no TRI facilities in their region, while 129 counties had at least one TRI facility. An increasing trend in the number of facilities and total disposal was observed, except for the highest category, based on cancer rate quartiles. A linear regression analysis using log transformations of the number of facilities and total disposal to predict cancer rates was computed; however, neither variable was found to be a significant predictor. Seven significant geographical spatial clusters of counties with high childhood cancer rates (p < 0.05) were indicated. Binomial logistic regression, categorizing the cancer rate into two groups (<=150 and >150), indicated an odds ratio of 1.58 (CI 1.127, 2.222) for the natural log of the number of facilities. Conclusion. We have used a unique methodology, combining GIS and spatial clustering techniques with existing statistical approaches, to examine the association between living in proximity to TRI facilities and the incidence of childhood cancer in the State of Texas. Although a concrete association was not indicated, further studies examining specific TRI chemicals are required. This information can enable researchers and the public to identify potential concerns, gain a better understanding of potential risks, and work with industry and government to reduce toxic chemical use, disposal or other releases and the risks associated with them. TRI data, in conjunction with other information, can be used as a starting point in evaluating exposures and risks.
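The binomial logistic regression on the log-transformed number of facilities can be reproduced in outline as below. The county-level data are simulated placeholders rather than the Texas Cancer Registry or TRI data, so the fitted odds ratio will not match the reported 1.58.

```python
# Sketch: county-level binomial logistic regression of a high/low cancer-rate
# indicator on the natural log of the number of TRI facilities. The data are
# simulated placeholders, so the fitted odds ratio will not equal the reported 1.58.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n_counties = 254                                           # Texas has 254 counties

n_facilities = rng.poisson(3.0, size=n_counties)           # TRI facilities per county
log_fac = np.log(n_facilities + 1)                         # log transform (offset for zeros)
p_high = 1.0 / (1.0 + np.exp(-(-1.0 + 0.45 * log_fac)))    # simulated true relationship
high_rate = rng.binomial(1, p_high)                        # 1 if cancer rate > 150, else 0

X = sm.add_constant(log_fac)
model = sm.Logit(high_rate, X).fit(disp=0)

odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(f"OR for ln(facilities + 1): {odds_ratios[1]:.2f} "
      f"(95% CI {conf_int[1, 0]:.2f}-{conf_int[1, 1]:.2f})")
```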