927 results for Endogenous Information Structure


Relevance:

30.00%

Publisher:

Abstract:

This research was designed to answer the question of which direction the restructuring of financial regulators should take: consolidation or fragmentation. It began by examining the need for financial regulation and its related costs. It then described the types of regulatory structures that exist in the world, surveying the regulatory structures in 15 jurisdictions, comparing them, and discussing their strengths and weaknesses. The possible regulatory structures were analyzed using three methodological tools: game theory, institutional design, and network effects. The incentives for regulatory action were examined in Chapter Four using game-theoretic concepts. This chapter predicted how two regulators with overlapping supervisory mandates would behave in two different states of the world (one in which they stand to benefit from regulating and one in which they stand to lose). The insights derived from the games described in this chapter were then used to analyze the different supervisory models that exist in the world. The problem of information flow was discussed in Chapter Five using tools from institutional design. The underlying idea is that the right kind of information must reach the decision maker in the shortest time possible in order to predict, mitigate or stop a financial crisis. Network effects and congestion in the context of financial regulation were discussed in Chapter Six, which applied the general literature on network effects to ask whether consolidating financial regulatory standards at a global level might also yield other positive network effects. Returning to the main research question, this research concluded that the fragmented model should generally be preferred to the consolidated model, as it allows for greater diversity and information flow. However, in cases in which close cooperation between two authorities is essential, the consolidated model should be used.
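The two-regulator, two-state setting described in Chapter Four can be illustrated with a toy payoff matrix. The sketch below finds pure-strategy Nash equilibria; the payoff numbers are invented for illustration and are not taken from the thesis:

```python
from itertools import product

def pure_nash(payoffs):
    """Return pure-strategy Nash equilibria of a 2-player game.

    payoffs maps (action_A, action_B) -> (payoff_A, payoff_B).
    """
    actions = ["regulate", "abstain"]
    equilibria = []
    for a, b in product(actions, actions):
        pa, pb = payoffs[(a, b)]
        # Neither regulator can profit by unilaterally deviating.
        a_ok = all(payoffs[(a2, b)][0] <= pa for a2 in actions)
        b_ok = all(payoffs[(a, b2)][1] <= pb for b2 in actions)
        if a_ok and b_ok:
            equilibria.append((a, b))
    return equilibria

# State 1: both regulators stand to gain credit from regulating.
gain_state = {
    ("regulate", "regulate"): (1, 1),
    ("regulate", "abstain"):  (2, 0),
    ("abstain",  "regulate"): (0, 2),
    ("abstain",  "abstain"):  (0, 0),
}
# State 2: regulating is costly, so each prefers that the other act,
# while joint inaction (a crisis) is worst for both.
loss_state = {
    ("regulate", "regulate"): (-1, -1),
    ("regulate", "abstain"):  (-2, 1),
    ("abstain",  "regulate"): (1, -2),
    ("abstain",  "abstain"):  (-3, -3),
}

print(pure_nash(gain_state))  # both regulate when regulation pays
print(pure_nash(loss_state))  # each would rather the other bear the cost
```

Under these hypothetical payoffs, the "gain" state yields joint regulation, while the "loss" state yields two asymmetric equilibria in which exactly one authority acts, a buck-passing pattern consistent with the overlap problem the abstract describes.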

Relevance:

30.00%

Publisher:

Abstract:

Polymer brushes have unique properties with a large variety of possible applications, ranging from responsive coatings and drug delivery to lubrication and sensing. Further development requires a detailed understanding of these properties. Established characterization methods, however, only supply information about the surface; experimental data about the inner "bulk" structure of polymer brushes are still missing. Scattering methods under grazing incidence supply structural information about surfaces as well as the structures beneath them. Nanomechanical cantilevers supply stress data, giving information about the forces acting inside the polymer brush film. In this thesis these two techniques are further developed and used to deepen the understanding of polymer brushes. The experimental work is divided into four chapters. Chapter 2 deals with the preparation of polymer brushes on top of nanomechanical cantilever sensors as well as on large-area samples using a "grafting-to" technique. The further development of nanomechanical cantilever readout is the subject of Chapter 3. In order to simplify cantilever sensing, a method is investigated which allows one to perform multiple bending experiments on top of a single cantilever. To do so, a way to correlate different curvatures is introduced, as well as a way to conveniently locate differently coated segments. In Chapter 4 the change in structure upon solvent treatment of mixed polymer brushes is investigated using, among other techniques, scattering methods and nanomechanical cantilevers. This allows one to explain the domain memory effect typically found in such systems. Chapter 5 describes the integration of a phase-shifting interferometer, used for the readout of nanomechanical cantilevers, into the µ-focused scattering beamline BW4, allowing simultaneous measurement of stress and structural information. The last experimental chapter, Chapter 6, deals with roughness correlation in polymer brushes and its dependence on the tethering density of the chains. In summary, the thesis deals with the utilization of new experimental techniques for the investigation of polymer brushes and with the further development of the techniques themselves.

Relevance:

30.00%

Publisher:

Abstract:

The thesis investigates the nucleon structure probed by the electromagnetic interaction. Among the most basic observables reflecting the electromagnetic structure of the nucleon are the form factors, which have been studied by means of elastic electron-proton scattering with ever increasing precision for several decades. In the timelike region, corresponding to proton-antiproton annihilation into an electron-positron pair, the present experimental information is much less accurate. However, high-precision form factor measurements are planned for the near future. About 50 years after the first pioneering measurements of the electromagnetic form factors, polarization experiments stirred up the field, since their results were found to be in striking contradiction to the findings of previous form factor investigations based on unpolarized measurements. Triggered by the conflicting results, a whole new field emerged studying the influence of two-photon exchange corrections to elastic electron-proton scattering, which appeared as the most likely explanation of the discrepancy. The main part of this thesis deals with theoretical studies of two-photon exchange, investigated particularly with regard to form factor measurements in the spacelike as well as in the timelike region. An extraction of the two-photon amplitudes in the spacelike region through a combined analysis of unpolarized cross section measurements and polarization experiments is presented. Furthermore, predictions of the two-photon exchange effects on the e+p/e-p cross section ratio are given for several new experiments which are currently ongoing. The two-photon exchange corrections are also investigated in the timelike region, in the process p̄p → e+e-, by means of two factorization approaches. These corrections are found to be smaller than those obtained for the spacelike scattering process. The influence of the two-photon exchange corrections on cross section measurements, as well as on asymmetries that allow direct access to the two-photon exchange contribution, is discussed. Furthermore, one of the factorization approaches is applied to investigate two-boson exchange effects in parity-violating electron-proton scattering. In the last part of this work, the process p̄p → π0 e+e- is analyzed with the aim of determining the form factors in the so-called unphysical timelike region below the two-nucleon production threshold. For this purpose, a phenomenological model is used which provides a good description of the available data on the real photoproduction process p̄p → π0 γ.

Relevance:

30.00%

Publisher:

Abstract:

In the early 20th century, Gouy, Chapman, and Stern developed a theory to describe the capacitance and the spatial ion distribution of dilute electrolytes near an electrode. After a century of research, considerable progress has been made in the understanding of the electrolyte/electrode interface. However, its molecular-scale structure and its variation with an applied potential are still under debate. In particular for room-temperature ionic liquids, a new class of solventless electrolytes, the classical theories for the electrical double layer are not applicable. Recently, molecular dynamics simulations and phenomenological theories have attempted to explain the capacitance of the ionic liquid/electrode interface through the molecular-scale structure and dynamics of the ionic liquid near the electrode. However, experimental evidence is very limited.

In the presented study, the ion distribution of an ionic liquid near an electrode and its response to applied potentials were examined with sub-molecular resolution. For this purpose, a new sample chamber was constructed, allowing in situ high-energy X-ray reflectivity experiments under potential control as well as impedance spectroscopy measurements. The combination of structural information and electrochemical data provided a comprehensive picture of the electric double layer in ionic liquids. Oscillatory charge density profiles were found, consisting of alternating anion- and cation-enriched layers at both cathodic and anodic potentials. This structure was shown to arise from the same ion-ion correlations that dominate the liquid bulk structure and that were observed as a distinct X-ray diffraction peak. Existing physically motivated models were therefore refined and verified by comparison with independent measurements.

The relaxation dynamics of the interfacial structure upon potential variation were studied by time-resolved X-ray reflectivity experiments with sub-millisecond resolution. The observed relaxation times during charging/discharging are consistent with the impedance spectroscopy data, revealing three processes of vastly different characteristic time scales. Initially, ion transport normal to the interface happens on a millisecond scale. Another process, on a 100-millisecond scale, is associated with molecular reorientation of electrode-adsorbed cations. Furthermore, a minute-scale relaxation was observed, which is tentatively assigned to lateral ordering within the first layer.

Relevance:

30.00%

Publisher:

Abstract:

Abdominal pain can be induced by stimulation of visceral nociceptors. Activation of nociceptors usually requires previous sensitization by pathological events such as inflammation, ischemia or acidosis. Although abdominal pain can obviously be caused by pathology of a visceral structure, clinicians frequently observe that such pathology explains only part of the pain complaints. Occasionally, objective signs of visceral lesions are lacking altogether. There is clear evidence that pain states are associated with profound changes in the central processing of sensory input. The main consequences of such alterations for patients are twofold: 1) central sensitization, i.e. an increased excitability of the central nervous system; 2) an alteration of endogenous pain modulation, which under normal conditions inhibits the processing of nociceptive signals in the central nervous system. Both phenomena lead to a spread of pain to other body regions and an amplification of pain perception. The interactions between visceral pathology and alterations of central pain processing offer an at least partial explanation for the discrepancy between objective signs of peripheral lesions and the severity of symptoms. Today, both central hypersensitivity and alterations in endogenous pain modulation can be measured in clinical practice. This information can be used to provide patients with an explanatory model for their pain. Furthermore, first data suggest that alterations in central pain processing may represent negative prognostic factors. A better understanding of the individual pathophysiology may in the future allow the development of individualized therapeutic strategies.

Relevance:

30.00%

Publisher:

Abstract:

High-resolution microscopy techniques provide a plethora of information on biological structures from the cellular level down to the molecular level. In this review, we present the unique capabilities of transmission electron and atomic force microscopy to assess the structure, oligomeric state, function and dynamics of channel and transport proteins in their native environment, the lipid bilayer. Most importantly, membrane proteins can be visualized in the frozen-hydrated state and in buffer solution by cryo-transmission electron and atomic force microscopy, respectively. We also illustrate the potential of the scintillation proximity assay to study substrate binding of detergent-solubilized transporters prior to crystallization and structural characterization.

Relevance:

30.00%

Publisher:

Abstract:

The group analysed some syntactic and phonological phenomena that presuppose the existence of interrelated components within the lexicon, which motivate the assumption that there are sublexicons within the global lexicon of a speaker. This result is confirmed by experimental findings in neurolinguistics. Hungarian-speaking agrammatic aphasics were tested in several ways, the results showing that the sublexicon of closed-class lexical items provides a highly automated complex device for processing surface sentence structure. Analysing Hungarian ellipsis data from a semantic-syntactic perspective, the group established that the lexicon is best conceived of as being split into at least two main sublexicons: the store of semantic-syntactic feature bundles and a separate store of sound forms. On this basis they proposed a format for representing open-class lexical items whose meanings are connected via certain semantic relations. They also proposed a new classification of verbs to account for the contribution of the aspectual reading of the sentence depending on the referential type of the argument, and a new account of the syntactic and semantic behaviour of aspectual prefixes. The partitioned sets of lexical items are sublexicons on phonological grounds. These sublexicons differ in terms of phonotactic grammaticality. The degrees of phonotactic grammaticality are tied up with the problem of psychological reality, i.e. how many such degrees native speakers are sensitive to. The group developed a hierarchical construction network as an extension of the original General Inheritance Network formalism, and this framework was then used as a platform for the implementation of the grammar fragments.

Relevance:

30.00%

Publisher:

Abstract:

The project drew on an extensive firm-level sample of employees to describe in detail the recent evolution of the structure of wages in the Czech Republic between 1995 and 1998. The results of the analysis were then compared with information from EU countries. Regression analysis was used to study a number of specific questions, with particular emphasis on proper weighting of the sample. Jurajda first quantified the effects on male and female hourly wages in the Czech Republic of worker age and education, firm size, region, industry and ownership type. He then examined whether these effects have been changing over time and how they differ by gender, identified those industrial sectors that carry the largest wage premiums not accounted for by worker or firm characteristics, and measured the effect of unemployment on wages. He found a substantial increase in returns to human capital: the earnings differentials for education increased substantially between 1995 and 1998, with these gains largely comparable to those in western countries. Overall, the Czech structure of wages is now very responsive to market forces and is converging rapidly on EU-type flexibility in almost every dimension. It is likely, however, that due in particular to the constrained supply of tertiary-educated workers, the returns to education may keep rising, surpassing levels typical of western economies and potentially reaching the high levels observed in developing countries.

Relevance:

30.00%

Publisher:

Abstract:

The first outcome of this project was a synchronic description of the most widely spoken Romani dialect in the Czech and Slovak Republics, aimed at teachers and lecturers of the Romani language. This is intended to serve as a methodological guide for the demonstration of various grammatical phenomena, but may also assist people who want a basic knowledge of the linguistic structure of this neo-Indian language. The grammatical material is divided into 23 chapters, in a sequence which may be followed in teaching or studying. The book includes examples of the grammatical elements, but not exercises or articles. The second work produced was a textbook of Slovak Romani, the most detailed in the Czech or Slovak Republics to date. It is aimed at all those interested in active use of the Romani language: high school and university students, people working with the Roma, and Roma who speak little or nothing of the language of their forebears. The book includes 34 lessons, each containing relevant Romani texts (articles and dialogues), a short vocabulary list, grammatical explanations, exercises and examples of Romani written or oral expression. The textbook also contains a considerable amount of ethno-cultural information and notes on the life and traditions of the Roma, as well as pointing out some differences between dialects. A brief Romani-Czech phrase book is included as an appendix.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The study is part of a nationwide evaluation of complementary and alternative medicine (CAM) in primary care in Switzerland. The goal was to evaluate the extent and structure of basic health insurance expenditures for complementary and alternative medicine in Swiss primary care. METHODS: The study was designed as a cross-sectional evaluation of Swiss primary care providers and included 262 certified CAM physicians, 151 noncertified CAM physicians and 172 conventional physicians. The study was based on data from a mailed questionnaire and on reimbursement information obtained from health insurers. It was therefore purely observational, without interference in the diagnostic and therapeutic procedures applied or prescribed by physicians. Main outcome measures included average reimbursed costs per patient, structured into consultation-related costs, medication-related costs, and referred costs. RESULTS: Total average reimbursed cost per patient did not differ between CAM physicians and conventional practitioners, but considerable differences were observed in cost structure. The proportions of reimbursed costs for consultation time were 56% for certified CAM physicians, 41% for noncertified CAM physicians and 40% for conventional physicians; medication costs (including expenditures for prescriptions and directly dispensed drugs) accounted for 35%, 18%, and 51% of costs, respectively. CONCLUSION: The results indicate no significant difference in overall treatment cost per patient between CAM and conventional primary care in Switzerland. However, CAM physicians treat lower numbers of patients and a more cost-favourable patient population than conventional physicians. Differences in cost structure reflect the more patient-centred and individualized treatment modalities of CAM physicians.

Relevance:

30.00%

Publisher:

Abstract:

The use of information technology (IT) in dentistry is far ranging. In order to produce a working document for the dental educator, this paper focuses on those methods where IT can assist in the education and competence development of dental students and dentists (e.g. e-learning, distance learning, simulations and computer-based assessment). Web pages and other information-gathering devices have become an essential part of our daily life, as they provide extensive information on all aspects of our society. This is mirrored in dental education, where many different tools are available, as listed in this report. IT offers added value to traditional teaching methods, and examples are provided. In spite of the continuing debate on the learning effectiveness of e-learning applications, students request such approaches as an adjunct to the traditional delivery of learning materials. Faculty require support to enable them to use the technology effectively to the benefit of their students. This support should be provided by the institution, and it is suggested that, where possible, institutions appoint an e-learning champion with good interpersonal skills to support and encourage faculty change. From a global perspective, all students and faculty should have access to e-learning tools. This report encourages open access to e-learning material, platforms and programs. Such learning materials must have well-defined learning objectives and undergo peer review to ensure content validity, accuracy, currency, the use of evidence-based data and the use of best practices. To ensure that the developers' intellectual rights are protected, the original content needs to be secure from unauthorized changes. Strategies and recommendations on how to improve the quality of e-learning are outlined. In the area of assessment, traditional examination schemes can be enriched by IT, whilst the Internet can provide many innovative approaches. Future trends in IT will evolve around improved uptake and access facilitated by the technology (hardware and software). The use of Web 2.0 shows considerable promise, and this may have implications on a global level. For example, the one-laptop-per-child project is the best example of what Web 2.0 can do: minimal use of hardware to maximize use of the Internet structure. In essence, simple technology can overcome many of the barriers to learning. IT will always remain exciting, as it is always changing, and its users, whether dental students, educators or patients, are like chameleons adapting to the ever-changing landscape.

Relevance:

30.00%

Publisher:

Abstract:

In traffic accidents involving pedestrians, cyclists or motorcyclists, patterned impact injuries, as well as marks on clothes, can be matched to the injury-causing vehicle structure in order to reconstruct the accident and identify the vehicle which hit the person. For this purpose, differentiating the primary impact injuries from other injuries is of great importance. Impact injuries can be identified from the external injuries of the skin, the injured subcutaneous and fat tissue, and the fractured bones. Another sign of impact is a bone bruise. A bone bruise, or occult bone lesion, denotes bleeding in the subcortical bone marrow, presumed to result from micro-fractures of the medullary trabeculae. The aim of this study was to show that bleeding in the subcortical bone marrow of the deceased can be detected using noninvasive postmortem magnetic resonance imaging. This is demonstrated in five accident cases, four involving pedestrians and one a cyclist, in which bone bruises were detected in different bones as a sign of impact occurring in the same locations as the external and soft tissue impact injuries.

Relevance:

30.00%

Publisher:

Abstract:

Cellular uptake of di- and tripeptides has been characterized in numerous organisms, and various transporters have been identified. In contrast, structural information on peptide transporters is very sparse. Here, we have cloned, overexpressed, purified, and biochemically characterized DtpD (YbgH) from Escherichia coli, a prokaryotic member of the peptide transporter family. Its homologues in mammals, PEPT1 (SLC15A1) and PEPT2 (SLC15A2), not only transport peptides but are also relevant for drug uptake, as they accept a large spectrum of peptidomimetics, such as beta-lactam antibiotics, antivirals, peptidase inhibitors, and others, as substrates. Uptake experiments indicated that DtpD functions as a canonical peptide transporter and is, therefore, a valid model for structural studies of this family of proteins. Blue native polyacrylamide gel electrophoresis, gel filtration, and transmission electron microscopy of single DtpD particles suggest that the transporter exists in a monomeric form when solubilized in detergent. Two-dimensional crystallization of DtpD yielded first tubular crystals that allowed the determination of a projection structure at better than 19 Å resolution. This structure of DtpD represents the first structural view of a member of the peptide transporter family.

Relevance:

30.00%

Publisher:

Abstract:

This article describes the structure and utilization of a computerized databank system for WHO mortality data. This system makes available at one's fingertips data which were previously published by WHO in its blue volumes, and the data can be handled much more flexibly. At the moment the system provides information on age-standardized rates (direct standardization), total numbers of cases, as well as coverage per age group and year for about a hundred countries. The time period covered is 1950-1985, with exceptions for data which are not available to WHO.
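Direct standardization, mentioned above, weights a country's age-specific death rates by a fixed standard population so that rates are comparable across countries and years. A minimal sketch of the computation follows; all figures are invented for illustration:

```python
def age_standardized_rate(deaths, population, standard_pop):
    """Directly age-standardized death rate per 100,000.

    deaths[i], population[i]: observed counts in age group i.
    standard_pop[i]: the standard population's size in age group i.
    """
    total_std = sum(standard_pop)
    # Weight each age-specific rate (deaths / population) by the
    # standard population's share of that age group.
    rate = sum(
        (d / p) * w
        for d, p, w in zip(deaths, population, standard_pop)
    ) / total_std
    return rate * 100_000

# Three hypothetical age groups (young, middle, old):
deaths       = [10, 50, 400]
population   = [100_000, 80_000, 40_000]
standard_pop = [40_000, 35_000, 25_000]

print(round(age_standardized_rate(deaths, population, standard_pop), 1))
```

Because every country's rates are projected onto the same standard age structure, differences in the resulting figure reflect mortality differences rather than differences in population age composition.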

Relevance:

30.00%

Publisher:

Abstract:

Code queries focus mainly on the static structure of a system. To comprehend the dynamic behavior of a system, however, a software engineer needs to be able to reason about the dynamics of that system, for instance by querying a database of dynamic information. Such a querying mechanism should be directly available in the IDE where the developer implements, navigates and reasons about the software system. We propose (i) concepts to gather dynamic information, (ii) the means to query this information, and (iii) tools and techniques to integrate querying of dynamic information in the IDE, including the presentation of results generated by queries.
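The idea of gathering dynamic information and querying it afterwards can be sketched in a few lines: record run-time call events into a small in-memory "database", then answer queries such as "who called this method?". This is only an illustration of the concept, not the authors' actual tool; all names below are hypothetical:

```python
import functools
import inspect

TRACE = []  # the "database" of dynamic information: (caller, callee) events

def traced(fn):
    """Decorator that records who called the decorated function at run time."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        caller = inspect.stack()[1].function  # name of the calling function
        TRACE.append((caller, fn.__name__))
        return fn(*args, **kwargs)
    return wrapper

@traced
def parse(text):
    return text.split()

@traced
def analyze(text):
    return len(parse(text))

def callers_of(name):
    """A query over the dynamic information: which functions invoked `name`?"""
    return sorted({caller for caller, callee in TRACE if callee == name})

analyze("a b c")
print(callers_of("parse"))
```

A static code query could only report that `analyze` *may* call `parse`; the trace records that it actually did in this run, which is the kind of question a database of dynamic information, surfaced in the IDE, lets the developer ask directly.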