36 results for Semi-Empirical Methods

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

90.00%

Publisher:

Abstract:

In French, a causal relation is often conveyed by the connectives car, parce que or puisque. Since the seminal work of the Lambda-l Group (1975), it has generally been assumed that parce que, used to relate semantic content, contrasts with car and puisque, both used to connect either speech act or epistemic content. However, this analysis leaves a number of questions unanswered. In this paper, I present a reanalysis of this trio, using empirical methods such as corpus analysis and constrained elicitation. Results indicate that car and parce que are interchangeable in many contexts, even if they are still prototypically used in their respective domain in writing. As for puisque, its distribution does not overlap with car, despite their similar domains of use. I argue that the specificity of puisque with respect to the other two connectives is to introduce a cause with an echoic meaning.

Relevance:

80.00%

Publisher:

Abstract:

The accuracy of Global Positioning System (GPS) time series is degraded by the presence of offsets. To assess the effectiveness of methods that detect and remove these offsets, we designed and managed the Detection of Offsets in GPS Experiment. We simulated time series that mimicked realistic GPS data, consisting of a velocity component, offsets, and white and flicker noise (1/f spectrum noise) combined in an additive model. The data set was made available to the GPS analysis community without revealing the offsets, and several groups conducted blind tests with a range of detection approaches. The results show that, at present, manual methods (where offsets are hand-picked) almost always give better results than automated or semi-automated methods, although two automated methods give velocity biases quite similar to the best manual solutions. For instance, the 5th–95th percentile range in velocity bias for automated approaches is 4.2 mm/yr (most commonly within ±0.4 mm/yr of the truth), whereas it is 1.8 mm/yr for the manual solutions (most commonly within 0.2 mm/yr of the truth). The magnitude of offsets detectable by manual solutions is smaller than for automated solutions, with the smallest detectable offset for the best manual and automatic solutions equal to 5 mm and 8 mm, respectively. Assuming the simulated noise levels are representative of real GPS time series, geophysical interpretation of individual site velocities lower than 0.2–0.4 mm/yr is therefore certainly not robust, and a limit nearer 1 mm/yr would be a more conservative choice. Further work to improve offset detection in GPS coordinate time series is required before we can routinely interpret sub-mm/yr velocities for single GPS stations.
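The additive simulation model described in the abstract (a linear velocity term, step offsets, and white plus flicker noise) can be sketched as follows. This is a minimal illustration, not the experiment's actual generator: the daily sampling, noise amplitudes, and offset epochs are assumed values, and the flicker noise is approximated by frequency-domain shaping of white noise.

```python
import numpy as np

def flicker_noise(n, amplitude, rng):
    """Approximate 1/f (flicker) noise by shaping white noise in the
    frequency domain with a 1/sqrt(f) amplitude spectrum."""
    white = rng.standard_normal(n)
    spec = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n, d=1.0)
    freqs[0] = freqs[1]                 # avoid division by zero at DC
    spec /= np.sqrt(freqs)
    series = np.fft.irfft(spec, n)
    return amplitude * series / series.std()

def simulate_gps(n_days=3650, velocity=2.0, offsets=None,
                 sigma_white=1.0, sigma_flicker=3.0, seed=0):
    """Daily position series in mm: velocity (mm/yr) + step offsets + noise."""
    rng = np.random.default_rng(seed)
    t_yr = np.arange(n_days) / 365.25
    series = velocity * t_yr
    for epoch, size_mm in (offsets or {}).items():
        series[epoch:] += size_mm       # step function at each offset epoch
    series += sigma_white * rng.standard_normal(n_days)
    series += flicker_noise(n_days, sigma_flicker, rng)
    return t_yr, series

t, y = simulate_gps(offsets={1200: 8.0, 2500: -5.0})
```

Detection methods can then be benchmarked blind against such series, since the true offset epochs and sizes are known to the generator.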

Relevance:

80.00%

Publisher:

Abstract:

This tutorial review article is intended to provide general guidance to readers interested in learning about the methodologies for obtaining accurate electron density maps of molecules and crystalline solids, from theory or from experiment, and in carrying out a sensible interpretation of the results for chemical, biochemical or materials science applications. The review mainly focuses on X-ray diffraction techniques and the refinement of experimental models, in particular multipolar models. Neutron diffraction, which was widely used in the past to fix accurate positions of atoms, is now used for more specific purposes. The review illustrates three principal analyses of the experimental or theoretical electron density, based on quantum chemical, semi-empirical or empirical interpretation schemes: the quantum theory of atoms in molecules, the semi-classical evaluation of interaction energies, and the Hirshfeld analysis. In particular, it is shown that a simple topological analysis based on a partition of the electron density cannot alone reveal the whole nature of chemical bonding; more information based on the pair density is necessary. A connection between quantum mechanics and observable quantities is given in order to provide the physical grounds to explain the observations and to justify the interpretations.

Relevance:

80.00%

Publisher:

Abstract:

Anyone surveying the literature in the canon of gender theory, rooted in the humanities and social sciences, gets the impression that psychology plays no major role within this field of research. One possible reason for the lack of integration of psychological research appears to be its reliance on quantitative empirical methods, an approach that is central to psychological research oriented towards the natural sciences. In this article we make the case for a quantitative experimental psychology informed by gender theory. Drawing on our own research area, the psychology of language, we illustrate where newer behavioural and neuroscientific methods can contribute and how they complement insights from qualitative gender research. The first part deals with recent studies which use, among other techniques, reaction-time measurements and evoked potentials to show how strongly gender stereotypes are anchored in semantics. The second part discusses recent neuroimaging findings that call into question sex differences in the lateralization of language processing. Finally, we outline recent research approaches and argue for a transdisciplinary combination of qualitative and quantitative methods.

Relevance:

80.00%

Publisher:

Abstract:

The Business and Information Technologies (BIT) project strives to reveal new insights into how modern IT impacts organizational structures and business practices, using empirical methods. Due to its international scope, it allows for inter-country comparison of empirical results. Germany — represented by the European School of Management and Technology (ESMT) and the Institute of Information Systems at Humboldt-Universität zu Berlin — joined the BIT project in 2006. This report presents the results of the first survey conducted in Germany during November–December 2006. The key results are as follows:
• The most widely adopted technologies and systems in Germany are websites, wireless hardware and software, groupware/productivity tools, and enterprise resource planning (ERP) systems. The biggest potential for growth exists for collaboration and portal tools, content management systems, business process modelling, and business intelligence applications. A number of technological solutions have not yet been adopted by many organizations but also bear some potential, in particular identity management solutions, Radio Frequency Identification (RFID), biometrics, and third-party authentication and verification.
• IT security remains at the top of the agenda for most enterprises: budgets have been increasing over the last three years.
• The workplace and work requirements are changing. IT is used to monitor employees' performance in Germany, but less heavily than in the United States (Karmarkar and Mangal, 2007). The demand for IT skills is increasing at all corporate levels. Executives are asking for more and better-structured information, and this, in turn, triggers the appearance of new decision-making tools and online technologies on the market.
• The internal reorganization of companies in Germany is underway: organizations are becoming flatter, even though the trend is not as pronounced as in the United States (Karmarkar and Mangal, 2007), and the geographical scope of their operations is increasing. Modern IT plays an important role in enabling this development; for example, telecommuting, teleconferencing, and other web-based collaboration formats are becoming increasingly popular in the corporate context.
• The degree to which outsourcing is being pursued is quite limited, with little change expected. IT services, payroll, and market research are the most widely outsourced business functions. This corresponds to the results from other countries.
• Up to now, the adoption of e-business technologies has had a rather limited effect on marketing functions. Companies tend to extract synergies from traditional printed media and online advertising.
• The adoption of e-business has not yet had a major impact on marketing capabilities and strategy. Traditional methods of customer segmentation still dominate. The corporate identity of most organizations does not change significantly when going online.
• Online sales channels are mainly viewed as a complement to traditional distribution means.
• Technology adoption has caused production and organizational costs to decrease. However, the costs of technology acquisition and maintenance, as well as consultancy and internal communication costs, have increased.

Relevance:

80.00%

Publisher:

Abstract:

XMapTools is a MATLAB®-based graphical user interface program for electron microprobe X-ray image processing, which can be used to estimate the pressure–temperature conditions of crystallization of minerals in metamorphic rocks. This program (available online at http://www.xmaptools.com) provides a method to standardize raw electron microprobe data and includes functions to calculate the oxide weight percent compositions for various minerals. A set of external functions is provided to calculate structural formulae from the standardized analyses, as well as to estimate pressure–temperature conditions of crystallization using empirical and semi-empirical thermobarometers from the literature. Two graphical user interface modules, Chem2D and Triplot3D, are used to plot mineral compositions in binary and ternary diagrams. As an example, the software is used to study a high-pressure Himalayan eclogite sample from the Stak massif in Pakistan. The high-pressure paragenesis, consisting of omphacite and garnet, has been retrogressed to a symplectitic assemblage of amphibole, plagioclase and clinopyroxene. Mineral compositions corresponding to ~165,000 analyses yield estimates for the eclogitic pressure–temperature retrograde path from 25 kbar to 9 kbar. Corresponding pressure–temperature maps were plotted and used to interpret the link between the equilibrium conditions of crystallization and the symplectitic microstructures. This example illustrates the usefulness of XMapTools for studying variations in the chemical composition of minerals and for retrieving information on metamorphic conditions at the microscale, towards the computation of continuous pressure–temperature–relative-time paths in zoned metamorphic minerals not affected by post-crystallization diffusion.
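The standardization step described above, converting raw X-ray count maps to oxide wt% by calibrating against a few spot analyses of known composition, can be sketched as a per-element linear calibration. The function and array names are illustrative, as is the zero-intercept assumption; this is not XMapTools' actual implementation.

```python
import numpy as np

def standardize_map(raw_counts, spot_counts, spot_wt_percent):
    """Convert a raw X-ray count map to oxide wt% using internal standards.

    raw_counts:      2-D array of counts for one element.
    spot_counts:     counts at pixels where spot analyses exist.
    spot_wt_percent: measured oxide wt% at those spots.
    Assumes a linear counts-to-concentration relation through the origin.
    """
    spot_counts = np.asarray(spot_counts, float)
    spot_wt = np.asarray(spot_wt_percent, float)
    # least-squares slope k for the model  wt% = k * counts
    k = (spot_counts @ spot_wt) / (spot_counts @ spot_counts)
    return k * raw_counts

raw = np.array([[100., 200.], [150., 50.]])
calibrated = standardize_map(raw, [100., 200.], [10., 20.])
# with these spots the fitted slope is 0.1, so 200 counts map to 20 wt%
```

Applying one such calibration per element turns the stack of count maps into quantitative composition maps, from which structural formulae can then be computed pixel by pixel.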

Relevance:

80.00%

Publisher:

Abstract:

We present a new thermodynamic activity–composition model for di-trioctahedral chlorite in the system FeO–MgO–Al2O3–SiO2–H2O, based on the Holland–Powell internally consistent thermodynamic data set. The model is formulated in terms of four linearly independent end-members: amesite, clinochlore, daphnite and sudoite. These account for the most important crystal-chemical substitutions in chlorite: the Fe–Mg, Tschermak and di-trioctahedral substitutions. The ideal part of the end-member activities is modeled with a mixing-on-site formalism, and non-ideality is described by a macroscopic symmetric (regular) formalism. The symmetric interaction parameters were calibrated using a set of 271 published chlorite analyses for which robust independent temperature estimates are available. In addition, adjustment of the standard-state thermodynamic properties of sudoite was required to accurately reproduce experimental brackets involving sudoite. The new model was tested by calculating representative P–T sections for metasediments at low temperatures (<400 °C), in particular sudoite- and chlorite-bearing metapelites from Crete. Comparison between the calculated mineral assemblages and field data shows that the new model is able to predict the coexistence of chlorite and sudoite at low metamorphic temperatures. The predicted lower limit of the chloritoid stability field is also in better agreement with petrological observations. For practical applications to metamorphic and hydrothermal environments, two new semi-empirical chlorite geothermometers, named Chl(1) and Chl(2), were calibrated based on the chlorite + quartz + water equilibrium (2 clinochlore + 3 sudoite = 4 amesite + 4 H2O + 7 quartz). The Chl(1) thermometer requires knowledge of the Fe3+/ΣFe ratio in chlorite and predicts correct temperatures over a range of redox conditions. The Chl(2) geothermometer, which assumes that all iron in chlorite is ferrous, has been applied to partially recrystallized detrital chlorite from the Zone houillère in the French Western Alps.
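The two ingredients of the activity model named above, ideal mixing-on-site activities and a symmetric (regular) non-ideal term, can be sketched generically. The site labels and multiplicities in the example are hypothetical placeholders, not the actual chlorite site assignment, and the interaction parameter W is an illustrative number, not a calibrated value from the model.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def ideal_activity(site_fractions, multiplicities):
    """Mixing-on-site ideal activity: the product, over each site, of the
    fraction of the required cation raised to the site multiplicity."""
    a = 1.0
    for site, x in site_fractions.items():
        a *= x ** multiplicities[site]
    return a

def regular_gamma(x, w_j_per_mol, temp_k):
    """Activity coefficient of one end-member in a symmetric (regular)
    binary solution:  RT ln(gamma) = W (1 - x)^2."""
    return math.exp(w_j_per_mol * (1.0 - x) ** 2 / (R * temp_k))

# hypothetical example: cation fractions on two sites 'M' and 'T'
a_ideal = ideal_activity({'M': 0.5, 'T': 0.25}, {'M': 4, 'T': 2})
gamma = regular_gamma(0.6, 10000.0, 600.0)   # illustrative W = 10 kJ/mol
a_total = gamma * a_ideal
```

A semi-empirical thermometer of the kind described then relates the equilibrium constant of the calibrating reaction, built from such activities, to temperature.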

Relevance:

80.00%

Publisher:

Abstract:

This article presents an empirical interdisciplinary study of an extensive participatory process that was carried out in 2004 in the recently established World Natural Heritage Site “Jungfrau–Aletsch–Bietschhorn” in the Swiss Alps. The study used qualitative and quantitative empirical methods of social science to address the question of success factors in establishing and concretizing a World Heritage Site. Current international scientific and policy debates agree that the most important success factors in defining pathways for nature conservation and protection are: linking development and conservation, involving multiple stakeholders, and applying participatory approaches. The results of the study indicate that linking development and conservation implies the need to extend the reach of negotiations beyond the area of conservation, and to develop both a regional perspective and a focus on sustainable regional development. In the process, regional and local stakeholders are less concerned with defining sustainability goals than elaborating strategies of sustainability, in particular defining the respective roles of the core sectors of society and economy. However, the study results also show that conflicting visions and perceptions of nature and landscape are important underlying currents in such negotiations. They differ significantly between various stakeholder categories and are an important cause of conflicts occurring at various stages of the participatory process.

Relevance:

80.00%

Publisher:

Abstract:

The general goal of this thesis is to correlate observable properties of organic and metal-organic materials with their ground-state electron density distribution. In the long term, we expect to develop empirical or semi-empirical approaches to predict materials properties from the electron density of their building blocks, thus allowing molecular materials to be rationally engineered from their constituent subunits, such as their functional groups. In particular, we have focused on the linear optical properties of naturally occurring amino acids and their organic and metal-organic derivatives, and on the magnetic properties of metal-organic frameworks. For analysing the optical properties and the magnetic behaviour of the molecular or sub-molecular building blocks in materials, we mostly used the traditional QTAIM partitioning scheme of the molecular or crystalline electron densities; however, we have also investigated a new approach, X-ray Constrained Extremely Localized Molecular Orbitals (XC-ELMO), that can be used in the future to extract the electron densities of crystal subunits. With the purpose of rationally engineering linear optical materials, we have calculated atomic and functional-group polarizabilities of amino acid molecules, their hydrogen-bonded aggregates and their metal-organic frameworks. This has enabled the identification of the most efficient functional groups, able to build up larger electric susceptibilities in crystals, as well as the quantification of the role played by intermolecular interactions and coordinative bonds in modifying the polarizability of the isolated building blocks. Furthermore, we analysed the dependence of the polarizabilities on the one-electron basis set and the many-electron Hamiltonian. This is useful for selecting the most efficient level of theory for estimating susceptibilities of molecular-based materials.
With the purpose of rationally designing molecular magnetic materials, we have investigated the electron density distributions and the magnetism of two copper(II) pyrazine nitrate metal-organic polymers. High-resolution X-ray diffraction and DFT calculations were used to characterize the magnetic exchange pathways and to establish relationships between the electron densities and the exchange-coupling constants. Moreover, molecular orbital and spin-density analyses were employed to understand the role of different magnetic exchange mechanisms in determining the bulk magnetic behaviour of these materials. Finally, we investigated a modified version of the X-ray constrained wavefunction technique, XC-ELMO, which is not only a useful tool for the determination and analysis of experimental electron densities but also enables one to derive transferable molecular orbitals strictly localized on atoms, bonds or functional groups. In the future, we expect to use XC-ELMOs to predict materials properties of large systems that are currently challenging to calculate from first principles, such as macromolecules or polymers. Here, we point out the advantages, requirements and pitfalls of the technique. This work fulfils, at least partially, the prerequisites for understanding the properties of organic and metal-organic materials from the perspective of the electron density distribution of their building blocks. Empirical or semi-empirical evaluation of optical or magnetic properties from a preconceived assembly of building blocks could be extremely important for rationally designing new materials, a field where accurate but expensive first-principles calculations are generally not used. This research could impact the community in the fields of crystal engineering, supramolecular chemistry and, of course, electron density analysis.

Relevance:

80.00%

Publisher:

Abstract:

The concept of theory of mind (ToM), a hot topic in cognitive psychology for the past twenty-five years, has gained increasing importance in the fields of linguistics and pragmatics. However, even though the relationship between ToM and verbal communication is now recognized, the extent, causality and full implications of this connection remain mostly to be explored. This book presents a comprehensive discussion of the interface between language, communication, and theory of mind, and puts forward an innovative proposal regarding the role of discourse connectives for this interface. The proposed analysis of connectives is tested from the perspective of their acquisition, using empirical methods such as corpus analysis and controlled experiments, thus placing the study of connectives within the emerging framework of experimental pragmatics.

Relevance:

80.00%

Publisher:

Abstract:

In this perspective article, we review some of the empirical and semi-empirical strategies for predicting how hydrogen bonding affects molecular and atomic polarizabilities in aggregates. We use p-nitroaniline and hydrated oxalic acid as working examples to illustrate the enhancement of donor and acceptor functional-group polarizabilities and their anisotropy. This is significant for the evaluation of electrical susceptibilities in crystals and of the properties derived from them, such as the refractive indices.

Relevance:

40.00%

Publisher:

Abstract:

OBJECTIVE: Meta-analysis of studies of the accuracy of diagnostic tests currently uses a variety of methods. Statistically rigorous hierarchical models require expertise and sophisticated software. We assessed whether any of the simpler methods can in practice give adequately accurate and reliable results. STUDY DESIGN AND SETTING: We reviewed six methods for meta-analysis of diagnostic accuracy: four simple commonly used methods (simple pooling, separate random-effects meta-analyses of sensitivity and specificity, separate meta-analyses of positive and negative likelihood ratios, and the Littenberg-Moses summary receiver operating characteristic [ROC] curve) and two more statistically rigorous approaches using hierarchical models (bivariate random-effects meta-analysis and hierarchical summary ROC curve analysis). We applied the methods to data from a sample of eight systematic reviews chosen to illustrate a variety of patterns of results. RESULTS: In each meta-analysis, there was substantial heterogeneity between the results of different studies. Simple pooling of results gave misleading summary estimates of sensitivity and specificity in some meta-analyses, and the Littenberg-Moses method produced summary ROC curves that diverged from those produced by more rigorous methods in some situations. CONCLUSION: The closely related hierarchical summary ROC curve or bivariate models should be used as the standard method for meta-analysis of diagnostic accuracy.
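Two of the simpler methods compared here fit in a few lines, unlike the hierarchical models, which require specialist software. The sketch below shows simple pooling of sensitivity and a separate DerSimonian–Laird random-effects summary of logit sensitivities; the 0.5 continuity correction and the estimator choice are common conventions assumed for illustration, not necessarily the review's exact procedure.

```python
import math

def pooled_sensitivity(tp, fn):
    """'Simple pooling': treat all studies as one big 2x2 table."""
    return sum(tp) / (sum(tp) + sum(fn))

def dl_random_effects_sensitivity(tp, fn):
    """DerSimonian-Laird random-effects mean of logit sensitivities,
    back-transformed to a summary sensitivity."""
    logits, variances = [], []
    for t, f in zip(tp, fn):
        t, f = t + 0.5, f + 0.5              # continuity correction
        logits.append(math.log(t / f))       # logit = log(TP/FN)
        variances.append(1 / t + 1 / f)
    w = [1 / v for v in variances]
    fixed = sum(wi * li for wi, li in zip(w, logits)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logits))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(logits) - 1)) / c)   # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    mean = sum(wi * li for wi, li in zip(w_star, logits)) / sum(w_star)
    return 1 / (1 + math.exp(-mean))         # inverse logit
```

On heterogeneous data the two summaries diverge, which is the kind of discrepancy the review documents: simple pooling ignores between-study variation entirely.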

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To examine the extent of multiplicity of data in trial reports and to assess the impact of multiplicity on meta-analysis results.
Design: Empirical study on a cohort of Cochrane systematic reviews.
Data sources: All Cochrane systematic reviews published from issue 3 in 2006 to issue 2 in 2007 that presented a result as a standardised mean difference (SMD). We retrieved trial reports contributing to the first SMD result in each review, and downloaded review protocols. We used these SMDs to identify a specific outcome for each meta-analysis from its protocol.
Review methods: Reviews were eligible if SMD results were based on two to ten randomised trials and if protocols described the outcome. We excluded reviews if they only presented results of subgroup analyses. Based on review protocols and index outcomes, two observers independently extracted the data necessary to calculate SMDs from the original trial reports for any intervention group, time point, or outcome measure compatible with the protocol. From the extracted data, we used Monte Carlo simulations to calculate all possible SMDs for every meta-analysis.
Results: We identified 19 eligible meta-analyses (including 83 trials). Published review protocols often lacked information about which data to choose. Twenty-four (29%) trials reported data for multiple intervention groups, 30 (36%) reported data for multiple time points, and 29 (35%) reported the index outcome measured on multiple scales. In 18 meta-analyses, we found multiplicity of data in at least one trial report; the median difference between the smallest and largest SMD results within a meta-analysis was 0.40 standard deviation units (range 0.04 to 0.91).
Conclusions: Multiplicity of data can affect the findings of systematic reviews and meta-analyses. To reduce the risk of bias, reviews and meta-analyses should comply with prespecified protocols that clearly identify time points, intervention groups, and scales of interest.
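The index quantities above can be sketched directly: the SMD for one trial, and the spread of meta-analytic results over every compatible data choice. For brevity this sketch combines trial SMDs with an unweighted mean and enumerates the choices exhaustively, whereas the study used Monte Carlo simulation over proper meta-analyses; function names and example numbers are illustrative.

```python
import math
from itertools import product

def smd(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardised mean difference (Cohen's d with pooled SD)."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def smd_range(trial_variants):
    """Given, per trial, the SMDs computable from every compatible
    group/time-point/scale choice, return the spread of (unweighted)
    mean SMDs over all possible selections of one variant per trial."""
    means = [sum(choice) / len(choice)
             for choice in product(*trial_variants)]
    return min(means), max(means)

# one trial with two compatible scales, one with a single variant:
lo, hi = smd_range([[0.2, 0.6], [0.3]])   # spread induced by multiplicity
```

The gap between `lo` and `hi` is the multiplicity-induced spread that the study summarises as a median of 0.40 standard deviation units across its 19 meta-analyses.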