30 results for Information Models


Relevance:

30.00%

Publisher:

Abstract:

According to Bandura (1997), efficacy beliefs are a primary determinant of motivation. Still, very little is known about the processes through which people integrate situational factors to form efficacy beliefs (Myers & Feltz, 2007). The aim of this study was to gain insight into the cognitive construction of subjective group-efficacy beliefs. Only with a sound understanding of those processes is there a sufficient base from which to derive psychological interventions aimed at group-efficacy beliefs. According to cognitive theories (e.g., Miller, Galanter, & Pribram, 1973), individual group-efficacy beliefs can be seen as the result of a comparison between the demands of a group task and the resources of the performing group. At the center of this comparison are internally represented structures of the group task and plans to perform it. The empirical plausibility of this notion was tested using functional measurement theory (Anderson, 1981). Twenty-three students (M = 23.30 years; SD = 3.39; 35% female) of the University of Bern repeatedly judged the efficacy of groups in different group tasks. The groups consisted of the subject and one or two further fictive group members, whose task-relevant abilities were manipulated at three levels (low, medium, high). Data obtained from multiple full factorial designs were structured with individuals as second-level units and analyzed using mixed linear models. The task-relevant abilities of group members, specified as fixed factors, all had highly significant effects on subjects' group-efficacy judgments. The effect sizes of the ability factors depended on the respective abilities' importance in a given task. In additive tasks (Steiner, 1972), group resources were integrated in a linear fashion, whereas significant interactions between factors were obtained in interdependent tasks. The results also showed that people take into account other group members' efficacy beliefs when forming their own group-efficacy beliefs. The results support the notion that personal group-efficacy beliefs are obtained by comparing the demands of a task with the performing group's resources. Psychological factors such as other team members' efficacy beliefs are thereby considered task-relevant resources and affect subjective group-efficacy beliefs. This latter finding underlines the adequacy of multidimensional measures. While the validity of collective efficacy measures is usually estimated by how well they predict performance, the results of this study provide an internal validity criterion. It is concluded that Information Integration Theory holds potential to further our understanding of people's cognitive functioning in sport-relevant situations.
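The analysis described above can be sketched with a standard mixed linear model fit. The sketch below is an illustrative reconstruction, not the authors' code; the file name and column names (subject, ability1, ability2, efficacy) are assumptions.

# Illustrative sketch of the analysis outlined above: group-efficacy judgments
# from a full factorial manipulation of two fictive members' abilities,
# with fixed ability factors and participants as second-level (grouping) units.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per judgment.
data = pd.read_csv("efficacy_judgments.csv")  # columns: subject, ability1, ability2, efficacy

# Fixed effects for the ability levels (low/medium/high) and their interaction;
# a random intercept per participant captures the second level of the design.
model = smf.mixedlm("efficacy ~ C(ability1) * C(ability2)",
                    data=data, groups=data["subject"])
result = model.fit()
print(result.summary())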

Relevance:

30.00%

Publisher:

Abstract:

In this article, we perform an extensive study of flavor observables in a two-Higgs-doublet model with generic Yukawa structure (of type III). This model is interesting not only because it is the decoupling limit of the minimal supersymmetric standard model, but also because of its rich flavor phenomenology, which allows for sizable effects both in flavor-changing neutral-current (FCNC) processes and in tauonic B decays. We examine the possible effects in flavor physics and constrain the model both from tree-level processes and from loop observables. The free parameters of the model are the heavy Higgs mass, tan β (the ratio of vacuum expectation values) and the "nonholomorphic" Yukawa couplings ϵ^f_ij (f = u, d, ℓ). In our analysis we constrain the elements ϵ^f_ij in various ways: In a first step, we give order-of-magnitude constraints on ϵ^f_ij from 't Hooft's naturalness criterion, finding that all ϵ^f_ij must be rather small unless the third generation is involved. In a second step, we constrain the Yukawa structure of the type-III two-Higgs-doublet model from tree-level FCNC processes (B_s,d → μ+μ−, K_L → μ+μ−, D̄0 → μ+μ−, ΔF = 2 processes, τ− → μ−μ+μ−, τ− → e−μ+μ− and μ− → e−e+e−) and observe that all flavor off-diagonal elements of these couplings, except ϵ^u_32,31 and ϵ^u_23,13, must be very small in order to satisfy the current experimental bounds. In a third step, we consider Higgs-mediated loop contributions to FCNC processes [b → s(d)γ, B_s,d mixing, K–K̄ mixing and μ → eγ], finding that ϵ^u_13 and ϵ^u_23 must also be very small, while the bounds on ϵ^u_31 and ϵ^u_32 are especially weak. Furthermore, considering the constraints from electric dipole moments, we obtain constraints on some of the parameters ϵ^u,ℓ_ij. Taking into account the constraints from FCNC processes, we study the size of possible effects in the tauonic B decays (B → τν, B → Dτν and B → D*τν) as well as in D_(s) → τν, D_(s) → μν, K(π) → eν, K(π) → μν and τ → K(π)ν, which are all sensitive to tree-level charged-Higgs exchange. Interestingly, the unconstrained ϵ^u_32,31 are just the elements which directly enter the branching ratios for B → τν, B → Dτν and B → D*τν. We show that they can explain the deviations from the SM predictions in these processes without fine-tuning. Furthermore, B → τν, B → Dτν and B → D*τν can even be explained simultaneously. Finally, we give upper limits on the branching ratios of the lepton-flavor-violating neutral B meson decays (B_s,d → μe, B_s,d → τe and B_s,d → τμ) and correlate the radiative lepton decays (τ → μγ, τ → eγ and μ → eγ) to the corresponding neutral-current lepton decays (τ− → μ−μ+μ−, τ− → e−μ+μ− and μ− → e−e+e−). A detailed appendix contains all relevant information for the considered processes for general scalar-fermion-fermion couplings.
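For orientation on why the tauonic decays are sensitive to tree-level charged-Higgs exchange, the familiar type-II limit of the charged-Higgs correction can be written as below. This is a standard reference expression, not the type-III result of the article, in which the nonholomorphic couplings ϵ^u_32,31 enter in addition.

\frac{\mathcal{B}(B^+ \to \tau^+ \nu)}{\mathcal{B}(B^+ \to \tau^+ \nu)_{\mathrm{SM}}}
  = \left( 1 - \tan^2\beta \, \frac{m_{B^+}^2}{m_{H^\pm}^2} \right)^{2}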

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: Anaemia in rheumatoid arthritis (RA) is prototypical of the anaemia of chronic disease and is often neglected in clinical practice. We studied anaemia in relation to disease activity, medications and radiographic progression. METHODS: Data were collected between 1996 and 2007 over a mean follow-up of 2.2 years. Anaemia was defined according to WHO criteria (♀: haemoglobin < 12 g/dl; ♂: haemoglobin < 13 g/dl) or alternative criteria. Anaemia prevalence was studied in relation to disease parameters and pharmacological therapy. Radiographic progression was analysed in 9731 radiograph sets from 2681 patients in crude longitudinal regression models and after adjusting for potential confounding factors, including the clinical disease activity score with the 28-joint count for tender and swollen joints and erythrocyte sedimentation rate (DAS28-ESR) or the clinical disease activity index (CDAI), synthetic antirheumatic drugs and anti-tumour necrosis factor (TNF) therapy. RESULTS: Anaemia prevalence decreased from more than 24% in the years before 2001 to 15% in 2007. Erosions progressed significantly faster in patients with anaemia (p < 0.001). Adjusted models showed these effects to be independent of clinical disease activity and other indicators of disease severity. Radiographic damage progression rates increased with the severity of anaemia, suggesting a 'dose–response' effect. The effect of anaemia on damage progression was maintained in subgroups of patients treated with TNF blockade or corticosteroids, and in those without non-selective nonsteroidal anti-inflammatory drugs (NSAIDs). CONCLUSIONS: Anaemia in RA appears to capture disease processes that remain unmeasured by established disease activity measures in patients with or without TNF blockade, and may help to identify patients with more rapid erosive disease.
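A minimal sketch of the kind of adjusted longitudinal model described above (radiographic progression regressed on anaemia, with a random intercept per patient and adjustment for disease activity and treatment). The file name, column names and coding are assumptions, not the study's actual analysis code.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per radiograph set.
df = pd.read_csv("ra_radiographs.csv")
# assumed columns: patient_id, erosion_progression, anaemia_severity,
#                  das28_esr, anti_tnf, corticosteroid

# Crude model: progression vs. anaemia severity, random intercept per patient.
crude = smf.mixedlm("erosion_progression ~ anaemia_severity",
                    df, groups=df["patient_id"]).fit()

# Adjusted model: add clinical disease activity and treatment as confounders.
adjusted = smf.mixedlm(
    "erosion_progression ~ anaemia_severity + das28_esr + anti_tnf + corticosteroid",
    df, groups=df["patient_id"]).fit()

print(crude.summary())
print(adjusted.summary())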

Relevance:

30.00%

Publisher:

Abstract:

With improving clinical CT scanning technology, the accuracy of CT-based finite element (FE) models of the human skeleton may be enhanced by an improved description of apparent-level bone mechanical properties. Micro-finite element (μFE) modeling can be used to study the apparent elastic behavior of human cancellous bone. In this study, samples from the femur, radius and vertebral body were investigated to evaluate the predictive power of morphology–elasticity relationships and to compare them across different anatomical regions. μFE models of 701 trabecular bone cubes with a side length of 5.3 mm were analyzed using kinematic boundary conditions. Based on the FE results, four morphology–elasticity models using bone volume fraction as well as full, limited or no fabric information were calibrated for each anatomical region. The five-parameter Zysset–Curnier model using full fabric information showed excellent predictive power, with adjusted coefficients of determination (r²_adj) of 0.98, 0.95 and 0.94 for the femur, radius and vertebra data, respectively, and mean total norm errors between 14 and 20%. A constant orthotropy model and a constant transverse isotropy model, in which the elastic anisotropy is defined by the model parameters, yielded coefficients of determination between 0.90 and 0.98 with total norm errors between 16 and 25%. Neglecting fabric information and using an isotropic model led to r²_adj between 0.73 and 0.92 with total norm errors between 38 and 49%. A comparison of the model regressions revealed minor but significant (p < 0.01) differences between the fabric–elasticity model parameters calibrated for the different anatomical regions. The proposed models and identified parameters can be used in future studies to compute the apparent elastic properties of human cancellous bone for homogenized FE models.
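For reference, the power-law structure of such fabric–elasticity relationships is commonly written in terms of the bone volume fraction ρ and the fabric eigenvalues m_i. The form below is the widely quoted engineering-constant version of the five-parameter model (parameters E_0, ν_0, G_0, k, l) and is meant as orientation only; index conventions and the exact parameterization may differ from those calibrated in the study.

E_i = E_0 \, \rho^{k} \, m_i^{2l}, \qquad
G_{ij} = G_0 \, \rho^{k} \, (m_i m_j)^{l}, \qquad
\nu_{ij} = \nu_0 \left( \frac{m_i}{m_j} \right)^{l}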

Relevance:

30.00%

Publisher:

Abstract:

Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting upon the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and, through contracts, captured spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.

Relevance:

30.00%

Publisher:

Abstract:

The maintenance of genetic variation in a spatially heterogeneous environment has been one of the main research themes in theoretical population genetics. Despite considerable progress in understanding the consequences of spatially structured environments for genetic variation, many problems remain unsolved. One of them concerns the relationship between the number of demes, the degree of dominance, and the maximum number of alleles that can be maintained by selection in a subdivided population. In this work, we study the potential for maintaining genetic variation in a two-deme model with a deme-independent degree of intermediate dominance, which includes the absence of G × E interaction as a special case. We present a thorough numerical analysis of a two-deme three-allele model, which allows us to identify dominance and selection patterns that harbor the potential for stable triallelic equilibria. The information gained by this approach is then used to construct an example in which the existence and asymptotic stability of a fully polymorphic equilibrium can be proved analytically. Notably, in this example the parameter range in which three alleles can coexist is maximized for intermediate migration rates. Our results can be interpreted in a specialist-generalist context and (among others) show when two specialists can coexist with a generalist in two demes if the degree of dominance is deme independent and intermediate. The dominance relation between the generalist allele and the specialist alleles plays a decisive role. We also discuss linear selection on a quantitative trait and show that G × E interaction is not necessary for the maintenance of more than two alleles in two demes.
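For readers unfamiliar with the setup, the generic recursion behind such two-deme migration–selection models (selection within demes followed by migration) can be written as below. This is the standard textbook form, not the specific dominance parameterization analyzed in the paper. Let p_{i,k} denote the frequency of allele A_i in deme k, w_{ij,k} the fitness of genotype A_iA_j in deme k, and m_k the immigration rate into deme k.

p^{*}_{i,k} = \frac{p_{i,k} \sum_{j} w_{ij,k}\, p_{j,k}}{\sum_{j,l} w_{jl,k}\, p_{j,k}\, p_{l,k}},
\qquad
p'_{i,1} = (1 - m_1)\, p^{*}_{i,1} + m_1\, p^{*}_{i,2},
\qquad
p'_{i,2} = (1 - m_2)\, p^{*}_{i,2} + m_2\, p^{*}_{i,1}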

Relevance:

30.00%

Publisher:

Abstract:

Satellite remote sensing provides a powerful instrument for mapping and monitoring traces of historical settlements and infrastructure, and not only in distant areas and crisis regions. It helps archaeologists to embed their findings from field surveys into the broader context of the landscape. With the start of the TanDEM-X mission, spatially explicit 3D information is available to researchers worldwide at an unprecedented resolution. We examined different experimental TanDEM-X digital elevation models (DEMs) that were processed from two different imaging modes (Stripmap/High Resolution Spotlight) using the operational alternating bistatic acquisition mode. The quality and accuracy of the experimental DEM products were compared to other available DEM products and to a high-precision archaeological field survey. The results indicate the potential of TanDEM-X Stripmap (SM) data for mapping surface elements at the regional scale. For the alluvial plain of Cilicia, a suspected palaeochannel could be reconstructed. At the local scale, DEM products from the TanDEM-X High Resolution Spotlight (HS) mode were processed at 2 m spatial resolution using a merge of two monostatic/bistatic interferograms. The absolute and relative vertical accuracies of the outcome meet the specifications of the high-resolution elevation data (HRE) standards of the National System for Geospatial Intelligence (NSG) at the HRE20 level.
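A minimal sketch of how the vertical accuracy of a DEM is typically checked against surveyed reference points (bias, RMSE and the empirical linear error at 90% confidence, LE90, the quantity used in HRE-type specifications). The file name and column names are assumptions, not part of the TanDEM-X processing chain.

import numpy as np
import pandas as pd

# Hypothetical table of check points: surveyed elevation vs. DEM elevation.
pts = pd.read_csv("check_points.csv")  # assumed columns: z_survey, z_dem

dz = pts["z_dem"].to_numpy() - pts["z_survey"].to_numpy()

bias = np.mean(dz)                    # systematic vertical offset
rmse = np.sqrt(np.mean(dz**2))        # absolute vertical accuracy (RMSE)
le90 = np.percentile(np.abs(dz), 90)  # empirical linear error at 90% confidence

print(f"bias = {bias:.2f} m, RMSE = {rmse:.2f} m, LE90 = {le90:.2f} m")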

Relevance:

30.00%

Publisher:

Abstract:

We present an application- and sample-independent method for the automatic discrimination of noise and signal in optical coherence tomography B-scans. The proposed algorithm models the observed noise probabilistically and allows for a dynamic determination of image noise parameters and the choice of appropriate image rendering parameters. This overcomes observer variability and the need for a priori information about the content of sample images, both of which are challenging to handle systematically with current systems. As such, our approach has the advantage of automatically determining crucial parameters for evaluating rendered image quality in a systematic and task-independent way. We tested our algorithm on data from four different biological and non-biological samples (index finger, lemon slices, sticky tape, and detector cards) acquired with three different experimental spectral-domain optical coherence tomography (OCT) measurement systems, including a swept-source OCT system. The results are compared to parameters determined manually by four experienced OCT users. Overall, our algorithm works reliably regardless of which system and sample are used, and in all cases it estimates noise parameters within the confidence interval of those found by the observers.
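As a simplified illustration of estimating noise parameters from the data themselves (not the authors' actual algorithm), one can fit a distribution to an assumed signal-free region of a B-scan and derive a display floor from it; the region choice and the Gaussian assumption below are simplifications.

import numpy as np

def estimate_noise_floor(bscan_db, noise_rows=slice(0, 20), k=3.0):
    """Estimate noise statistics from an (assumed) signal-free region of a
    log-scaled OCT B-scan and return a display floor at mean + k * sigma."""
    noise = bscan_db[noise_rows, :].ravel()
    mu, sigma = noise.mean(), noise.std()
    return mu, sigma, mu + k * sigma

# Usage with placeholder data standing in for a 512 x 1024 B-scan in dB.
rng = np.random.default_rng(0)
bscan_db = rng.normal(55.0, 3.0, size=(512, 1024))
mu, sigma, floor = estimate_noise_floor(bscan_db)
print(f"noise mean = {mu:.1f} dB, sigma = {sigma:.1f} dB, display floor = {floor:.1f} dB")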

Relevance:

30.00%

Publisher:

Abstract:

This book attempts to synthesize research that contributes to a better understanding of how to reach sustainable business value through information systems (IS) outsourcing. Important topics in this realm are how IS outsourcing can contribute to innovation, how it can be dynamically governed, how to cope with its increasing complexity through multi-vendor arrangements, how service quality standards can be met, how corporate social responsibility can be upheld, and how to cope with increasing demands of internationalization and new sourcing models, such as crowdsourcing and platform-based cooperation. These issues are viewed from either the client or vendor perspective, or both. The book should be of interest to all academics and students in the fields of Information Systems, Management, and Organization as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.

Relevance:

30.00%

Publisher:

Abstract:

This book attempts to synthesize research that contributes to a better understanding of how to reach sustainable business value through information systems (IS) outsourcing. Important topics in this realm are how IS outsourcing can contribute to innovation, how it can be dynamically governed, how to cope with its increasing complexity through multi-vendor arrangements, how service quality standards can be met, how corporate social responsibility can be upheld and how to cope with increasing demands of internationalization and new sourcing models, such as crowdsourcing and platform-based cooperation. These issues are viewed from either the client or vendor perspective, or both. The book should be of interest to all academics and students in the fields of Information Systems, Management and Organization as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.

Relevance:

30.00%

Publisher:

Abstract:

We present studies of 9 modern (up to 400-yr-old) peat sections from Slovenia, Switzerland, Austria, Italy, and Finland. Precise radiocarbon dating of modern samples is possible owing to the large bomb peak of atmospheric 14C concentration in 1963 and the subsequent rapid decline in the 14C level. All the analyzed 14C profiles appeared concordant with the shape of the bomb peak of atmospheric 14C concentration, integrated over some time interval with a length specific to the peat section. In the peat layers covered by the bomb peak, calendar ages of individual peat samples could be determined almost immediately, with an accuracy of 2–3 yr. In the pre-bomb sections, the calendar ages of individual dated samples are determined in the form of multi-modal probability distributions about 300 yr wide (about AD 1650–1950). However, simultaneous use of the post-bomb and pre-bomb 14C dates, together with lithological information, enabled the rejection of most modes of the probability distributions in the pre-bomb sections. In effect, precise age–depth models of the post-bomb sections have been extended back in time, into the wiggly part of the 14C calibration curve.
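The principle of bomb-peak dating, matching a measured 14C concentration against the atmospheric record, can be sketched as follows. The atmospheric curve values below are placeholders only; real work would use a published post-bomb compilation together with the stratigraphic (depth) order of the samples.

import numpy as np

# Placeholder post-bomb atmospheric 14C record (calendar year, F14C);
# a published atmospheric compilation would be used in practice.
years = np.array([1955, 1960, 1963, 1965, 1970, 1980, 1990, 2000, 2007])
f14c_atm = np.array([1.00, 1.25, 1.95, 1.80, 1.55, 1.28, 1.15, 1.08, 1.05])

def candidate_years(f14c_sample, tol=0.01):
    """Return calendar years whose interpolated atmospheric F14C matches the
    measured value within a tolerance. Because the bomb curve rises and falls,
    a single measurement can match both limbs; the depth order of the peat
    samples is then used to select the consistent solution."""
    grid = np.arange(years.min(), years.max() + 0.25, 0.25)
    curve = np.interp(grid, years, f14c_atm)
    return grid[np.abs(curve - f14c_sample) < tol]

print(candidate_years(1.30))  # typically one cluster on the rising and one on the falling limb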

Relevance:

30.00%

Publisher:

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare. Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it will be shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be a pure time cost from delaying agreement or a cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time costs and show that communication can play a similar role. The simple fact that a worker claims to be of high quality may be informative. In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g., the format war between HD DVD and Blu-ray). It will be shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions. I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good.
Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high-quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, in line with the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions. Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better-informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information. These findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information. In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.

Relevance:

30.00%

Publisher:

Abstract:

The goal of the present thesis was to investigate the production of code-switched utterances in bilinguals’ speech production. This study investigates the availability of grammatical-category information during bilingual language processing. The specific aim is to examine the processes involved in the production of Persian-English bilingual compound verbs (BCVs). A bilingual compound verb is formed when the nominal constituent of a compound verb is replaced by an item from the other language. In the present cases of BCVs the nominal constituents are replaced by a verb from the other language. The main question addressed is how a lexical element corresponding to a verb node can be placed in a slot that corresponds to a noun lemma. This study also investigates how the production of BCVs might be captured within a model of BCVs and how such a model may be integrated within incremental network models of speech production. In the present study, both naturalistic and experimental data were used to investigate the processes involved in the production of BCVs. In the first part of the present study, I collected 2298 minutes of a popular Iranian TV program and found 962 code-switched utterances. In 83 (8%) of the switched cases, insertions occurred within the Persian compound verb structure, hence, resulting in BCVs. As to the second part of my work, a picture-word interference experiment was conducted. This study addressed whether in the case of the production of Persian-English BCVs, English verbs compete with the corresponding Persian compound verbs as a whole, or whether English verbs compete with the nominal constituents of Persian compound verbs only. Persian-English bilinguals named pictures depicting actions in 4 conditions in Persian (L1). In condition 1, participants named pictures of action using the whole Persian compound verb in the context of its English equivalent distractor verb. In condition 2, only the nominal constituent was produced in the presence of the light verb of the target Persian compound verb and in the context of a semantically closely related English distractor verb. In condition 3, the whole Persian compound verb was produced in the context of a semantically unrelated English distractor verb. In condition 4, only the nominal constituent was produced in the presence of the light verb of the target Persian compound verb and in the context of a semantically unrelated English distractor verb. The main effect of linguistic unit was significant by participants and items. Naming latencies were longer in the nominal linguistic unit compared to the compound verb (CV) linguistic unit. That is, participants were slower to produce the nominal constituent of compound verbs in the context of a semantically closely related English distractor verb compared to producing the whole compound verbs in the context of a semantically closely related English distractor verb. The three-way interaction between version of the experiment (CV and nominal versions), linguistic unit (nominal and CV linguistic units), and relation (semantically related and unrelated distractor words) was significant by participants. In both versions, naming latencies were longer in the semantically related nominal linguistic unit compared to the response latencies in the semantically related CV linguistic unit. In both versions, naming latencies were longer in the semantically related nominal linguistic unit compared to response latencies in the semantically unrelated nominal linguistic unit. 
Both the analysis of the naturalistic data and the results of the experiment revealed that in the case of the production of the nominal constituent of BCVs, a verb from the other language may compete with a noun from the base language, suggesting that grammatical category does not necessarily provide a constraint on lexical access during the production of the nominal constituent of BCVs. There was a minimal context in condition 2 (the nominal linguistic unit) in which the nominal constituent was produced in the presence of its corresponding light verb. The results suggest that generating words within a context may not guarantee that the effect of grammatical class becomes available. A model is proposed in order to characterize the processes involved in the production of BCVs. Implications for models of bilingual language production are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Analogue and finite element numerical models with frictional and viscous properties are used to model thrust wedge development. Comparison between model types yields valuable information about analogue model evolution, scaling laws and the relative strengths and limitations of the techniques. Both model types show a marked contrast in structural style between ‘frictional-viscous domains’ underlain by a thin viscous layer and purely ‘frictional domains’. Closely spaced thrusts form a narrow and highly asymmetric fold-and-thrust belt in the frictional domain, characterized by in-sequence propagation of forward thrusts. In contrast, the frictional-viscous domain shows a wide and low taper wedge and a thrust belt with a more symmetrical vergence, with both forward and back thrusts. The frictional-viscous domain numerical models show that the viscous layer initially simple shears as deformation propagates along it, while localized deformation resulting in the formation of a pop-up structure occurs in the overlying frictional layers. In both domains, thrust shear zones in the numerical model are generally steeper than the equivalent faults in the analogue model, because the finite element code uses a non-associated plasticity flow law. Nevertheless, the qualitative agreement between analogue and numerical models is encouraging. It shows that the continuum approximation used in numerical models can be used to model frictional materials, such as sand, provided caution is taken to properly scale the experiments, and some of the limitations are taken into account.