960 results for First-principles calculations


Relevance:

30.00%

Publisher:

Abstract:

Time after time… and aspect and mood. Over the last twenty-five years, the study of time, aspect and - to a lesser extent - mood acquisition has enjoyed increasing popularity and a constant widening of its scope. In such a teeming field, what can be the contribution of this book? We believe that it is unique in several respects. First, this volume encompasses studies from different theoretical frameworks: functionalism vs generativism, or function-based vs form-based approaches. It also brings together various sub-fields (first and second language acquisition, child and adult acquisition, bilingualism) that tend to evolve in parallel rather than learn from each other. A further originality is that it focuses on a wide range of typologically different languages, and features less-studied languages such as Korean and Bulgarian. Finally, the book gathers some well-established scholars, young researchers, and even research students in a rich inter-generational exchange that ensures not only the survival but also the renewal of the discipline.

The book at a glance: the first part of the volume is devoted to the study of child language acquisition in monolingual, impaired and bilingual acquisition, while the second part focuses on adult learners. In this section, we provide an overview of each chapter. The first study, by Aviya Hacohen, explores the acquisition of compositional telicity in Hebrew L1. Her psycholinguistic approach contributes valuable data to refine theoretical accounts. Through an innovative methodology, she gathers information from adults and children on the influence of definiteness, number, and the mass vs countable distinction on the constitution of a telic interpretation of the verb phrase. She notices that the notion of definiteness is mastered by children as young as 10, while the mass/count distinction does not appear before 10;7. However, this does not entail an adult-like use of telicity.
She therefore concludes that beyond definiteness and noun type, pragmatics may play an important role in the derivation of Hebrew compositional telicity. For the second chapter, we move from a Semitic language to a Slavic one. Milena Kuehnast focuses on the acquisition of negative imperatives in Bulgarian, a form that presents the specificity of being grammatical only with the imperfective form of the verb. The study examines how 40 Bulgarian children distributed across two age groups (15 between 2;11-3;11, and 25 between 4;00 and 5;00) develop with respect to the acquisition of imperfective viewpoints and the use of imperfective morphology. It shows an evolution in the expression of force in the use of negative imperatives, as well as the influence of morphological complexity on the successful production of forms.

With Yi-An Lin's study, we turn to another type of informant and another framework. He studies the production of children suffering from Specific Language Impairment (SLI), a developmental language disorder whose causes exclude cognitive impairment, psycho-emotional disturbance, and motor-articulatory disorders. Using the Leonard corpus in CLAN, Lin aims to test two competing accounts of SLI (the Agreement and Tense Omission Model [ATOM] and his own Phonetic Form Deficit Model [PFDM]) that conflict on the role attributed to spellout in the impairment. Spellout is the point at which the Computational System for Human Language (CHL) passes the most recently derived part of the derivation over to the interface components, Phonetic Form (PF) and Logical Form (LF). ATOM claims that SLI sufferers have a deficit in their syntactic representation, while PFDM suggests that the problem only occurs at the spellout level. After studying the corpus from the point of view of tense/agreement marking, case marking, argument movement and auxiliary inversion, Lin finds further support for his model.
Olga Gupol, Susan Rohstein and Sharon Armon-Lotem's chapter offers a welcome bridge between child language acquisition and multilingualism. Their study explores the influence of intensive exposure to L2 Hebrew on the development of L1 Russian tense and aspect morphology through an elicited narrative. Their informants are 40 Russian-Hebrew sequential bilingual children distributed across two age groups (4;0-4;11 and 7;0-8;0). They come to the conclusion that bilingual children anchor their narratives in the perfective, like monolinguals. However, while aware of grammatical aspect, bilinguals lack the full form-function mapping and tend to overgeneralize the imperfective on the principles of simplicity (as imperfectives are the least morphologically marked forms), universality (as the imperfective covers more functions) and interference.

Rafael Salaberry opens the second section, on foreign language learners. In his contribution, he reflects on the difficulty L2 learners of Spanish encounter when it comes to distinguishing between iterativity (conveyed with the use of the preterite) and habituality (expressed through the imperfect). He examines in turn the theoretical views that see, on the one hand, habituality as part of grammatical knowledge and iterativity as pragmatic knowledge, and, on the other hand, both habituality and iterativity as grammatical knowledge. He comes to the conclusion that the use of the preterite as a default past tense marker may explain the impoverished system of aspectual distinctions, not only at beginner but also at advanced levels, which may indicate that the system is differentially represented among L1 and L2 speakers. Acquiring the vast array of functions conveyed by a form is therefore no mean feat, as confirmed by the next study. Based on prototype theory, Kathleen Bardovi-Harlig's chapter focuses on the development of the progressive in L2 English. It opens with an overview of the functions of the progressive in English.
Then, a review of acquisition research on the progressive in English and other languages is provided. The bulk of the chapter reports on a longitudinal study of 16 learners of L2 English and shows how their use of the progressive expands from the prototypical uses of process and continuousness to the less prototypical uses of repetition and future. The study concludes that the progressive spreads in interlanguage in accordance with prototype accounts. However, it suggests additional stages, not predicted by the Aspect Hypothesis, in the development from activities and accomplishments, at least for the meaning of repeatedness.

A similar theoretical framework is adopted in the following chapter, but it deals with a less-studied language. Hyun-Jin Kim revisits the claims of the Aspect Hypothesis in relation to the acquisition of L2 Korean by two L1 English learners. Inspired by studies on L2 Japanese, she focuses on the emergence and spread of the past/perfective marker -ess- and the progressive -ko iss- in the interlanguage of her informants throughout their third and fourth semesters of study. The data collected through six sessions of conversational interviews and picture description tasks seem to support the Aspect Hypothesis. Indeed, learners show a strong association between past tense and accomplishments/achievements at the start and a gradual extension to other types; a limited use of the past/perfective marker with states; and an affinity of the progressive with activities/accomplishments and, later, achievements. In addition, -ko iss- moves from progressive to resultative in the specific category of Korean verbs meaning wear/carry.

While the previous contributions focus on function, Evgeniya Sergeeva and Jean-Pierre Chevrot's contribution is interested in form. The authors explore the acquisition of verbal morphology in L2 French by 30 instructed native speakers of Russian distributed across low and high proficiency levels.
They use an elicitation task for verbs with different models of stem alternation and study how token frequency and base forms influence stem selection. The analysis shows that frequency affects correct production, especially among learners with high proficiency. As for substitution errors, it appears that forms with a simple structure are systematically more frequent than the target forms they replace. When a complex form serves as a substitute, it is more frequent only when it is replacing another complex form. As regards the use of base forms, the 3rd person singular of the present – and to some extent the infinitive – plays this role in the corpus. The authors therefore conclude that the processing of surface forms can be influenced positively or negatively by the frequency of the target forms and of other competing stems, and by the proximity of the target stem to a base form.

Finally, Martin Howard's contribution takes up the challenge of focusing on the poorer relations of the TAM system. On the basis of L2 French data obtained through sociolinguistic interviews, he studies the expression of futurity, the conditional and the subjunctive in three groups of university learners with classroom teaching only (two or three years of university teaching) or with a mixture of classroom teaching and naturalistic exposure (two years at university plus one year abroad). An analysis of relative frequencies leads him to suggest a continuum of use going from the futurate present to the conditional with past hypothetical conditional clauses in si, which needs to be confirmed by further studies.

Acknowledgements: The present volume was inspired by the conference Acquisition of Tense – Aspect – Mood in First and Second Language, held on 9th and 10th February 2008 at Aston University (Birmingham, UK), where over 40 delegates from four continents and over a dozen countries met for lively and enjoyable discussions.
This collection of papers was double peer-reviewed by an international scientific committee made up of Kathleen Bardovi-Harlig (Indiana University), Christine Bozier (Lund Universitet), Alex Housen (Vrije Universiteit Brussel), Martin Howard (University College Cork), Florence Myles (Newcastle University), Urszula Paprocka (Catholic University of Lublin), †Clive Perdue (Université Paris 8), Michel Pierrard (Vrije Universiteit Brussel), Rafael Salaberry (University of Texas at Austin), Suzanne Schlyter (Lund Universitet), Richard Towell (Salford University), and Daniel Véronique (Université d'Aix-en-Provence). We are very much indebted to the scientific committee for their insightful input at each step of the project. We are also thankful for the financial support of the Association for French Language Studies through its workshop grant, and to the Aston Modern Languages Research Foundation for funding the proofreading of the manuscript.

Relevance:

30.00%

Publisher:

Abstract:

For the first time, stochastic calculations for a model of a real-world forward-pumped fibre Raman amplifier with randomly varying birefringence have been carried out numerically, based on the Kloeden-Platen-Schurz algorithm. The results obtained for the averaged gain and gain fluctuations as a function of the polarization mode dispersion (PMD) parameter agree quantitatively with the results of a previously developed analytical model. At the same time, the direct numerical simulations demonstrate increased stochastisation (a maximum in the averaged gain variation) within the PMD-parameter region of 0.1-0.3 ps/km^(1/2). The results give an insight into the margins of applicability of the generic multi-scale technique widely used to derive the coupled Manakov equations, and allow the analytical model to be generalized to account for pump depletion, group-delay dispersion and Kerr nonlinearity, which is of great interest for the development of high-transmission-rate optical networks.
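The core numerical ingredient in such work is a stochastic integration scheme of the kind catalogued in the Kloeden-Platen-Schurz text. As a hedged sketch only (not the amplifier model itself), the simplest member of that family, Euler-Maruyama, can be demonstrated on an Ornstein-Uhlenbeck process standing in for a randomly varying quantity; the drift, diffusion, and all parameters below are illustrative:

```python
import numpy as np

def euler_maruyama(drift, diffusion, x0, t_max, dt, n_paths, seed=0):
    """Integrate dX = drift(X) dt + diffusion(X) dW over an ensemble of paths."""
    rng = np.random.default_rng(seed)
    n_steps = int(t_max / dt)
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Wiener increments
        x = x + drift(x) * dt + diffusion(x) * dw
    return x

# Ornstein-Uhlenbeck toy model: dX = -theta*X dt + sigma dW (illustrative values)
theta, sigma = 1.0, 0.3
final = euler_maruyama(lambda x: -theta * x,
                       lambda x: sigma * np.ones_like(x),
                       x0=1.0, t_max=5.0, dt=1e-3, n_paths=2000)
mean_proxy = final.mean()  # ensemble average, analogous to an averaged gain
var_proxy = final.var()    # ensemble variance, analogous to gain fluctuations
```

Averaging over the ensemble of stochastic realizations, as above, is how quantities such as the averaged gain and its fluctuations are extracted from direct numerical simulation.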

Relevance:

30.00%

Publisher:

Abstract:

In developed societies, healthcare service systems face a double challenge: society expects the service level to rise and the number of mistakes to drop, while at the same time, because of overloaded budgets, cutting costs is absolutely necessary. This challenge is comparable in magnitude to the one the US automotive industry faced from the 1970s onwards. In the case of the automotive industry, the solution was the comprehension and application of the principles and tools of lean management. This study aims to answer the question of whether it is possible to apply this solution in the case of the healthcare system as well. The article first introduces the problems of the healthcare system, then describes the formation of the lean management concept and how it became widely known. The second half of the study summarizes the experience available in the literature on the topic, and then draws conclusions.

Relevance:

30.00%

Publisher:

Abstract:

Lutein is a principal constituent of the human macular pigment. This study is composed of two projects. The first examines the conformational geometries of lutein and its potential adaptability in biological systems. The second is a study of the response of human subjects to lutein supplements. Using the semi-empirical parametric method 3 (PM3) and density functional theory with the B3LYP/6-31G* basis set, the relative energies of s-cis conformers of lutein were determined. All 512 s-cis conformers were calculated with PM3. A smaller, representative group was also studied using density functional theory. PM3 results were correlated systematically with B3LYP values, enabling the results to be calibrated. The relative energies of the conformers range from 1 to 30 kcal/mol, and many are dynamically accessible at normal temperatures. Four commercial formulations containing lutein were studied. The serum and macular pigment (MP) responses of human subjects to these lutein supplements with doses of 9 or 20 mg/day were measured, relative to a placebo, over a six-month period. In each instance, lutein levels in serum increased and correlated with MP increases. The results demonstrate that responses are significantly dependent upon formulation and that components other than lutein have an important influence on serum response.
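Whether a conformer is "dynamically accessible at normal temperatures" follows from Boltzmann statistics over the relative energies. A minimal sketch, using made-up energies rather than the actual PM3/B3LYP values from the study:

```python
import math

R = 1.987204e-3  # gas constant in kcal/(mol*K)

def boltzmann_populations(rel_energies_kcal, T=298.15):
    """Fractional populations of conformers given energies relative to the minimum."""
    weights = [math.exp(-e / (R * T)) for e in rel_energies_kcal]
    total = sum(weights)
    return [w / total for w in weights]

# Illustrative conformers at 0, 1, 5 and 30 kcal/mol above the global minimum
pops = boltzmann_populations([0.0, 1.0, 5.0, 30.0])
```

With RT of roughly 0.6 kcal/mol at room temperature, conformers within a few kcal/mol of the minimum retain appreciable population, while one 30 kcal/mol up is essentially unpopulated, which is the sense in which only part of the 1-30 kcal/mol range is thermally accessible.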

Relevance:

30.00%

Publisher:

Abstract:

Free energy calculations are a computational method for determining thermodynamic quantities, such as free energies of binding, via simulation.

Currently, due to computational and algorithmic limitations, free energy calculations are limited in scope.

In this work, we propose two methods for improving the efficiency of free energy calculations.

First, we expand the state space of alchemical intermediates, and show that this expansion enables us to calculate free energies along lower variance paths.

We use Q-learning, a reinforcement learning technique, to discover and optimize paths at low computational cost.

Second, we reduce the cost of sampling along a given path by using sequential Monte Carlo samplers.

We develop a new free energy estimator, pCrooks (pairwise Crooks), a variant on the Crooks fluctuation theorem (CFT), which enables decomposition of the variance of the free energy estimate for discrete paths, while retaining beneficial characteristics of CFT.

Combining these two advancements, we show that for some test models, optimal expanded-space paths have a nearly 80% reduction in variance relative to the standard path.

Additionally, our free energy estimator converges at a more consistent rate and on average 1.8 times faster when we enable path searching, even when the cost of path discovery and refinement is considered.
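The path-search idea above can be sketched with a toy model (this is an illustration of the technique, not the thesis's code): treat alchemical intermediates as nodes of a small 2D grid, one axis being the alchemical coupling and the other a hypothetical expanded-space parameter, assign each edge a made-up "variance" cost in which expanded-space moves are cheaper, and let tabular Q-learning discover a path from the initial to the final state with lower total cost than the standard straight-line path:

```python
import random

random.seed(0)
N = 5  # grid is N x N; row 0 is the standard (non-expanded) path

def edge_cost(a, b):
    # Illustrative cost model: edges touching only row 0 cost 1.0,
    # edges through the expanded rows cost 0.5 (lower-variance detour).
    (i1, j1), (i2, j2) = a, b
    return 1.0 if max(i1, i2) == 0 else 0.5

def neighbors(s):
    i, j = s
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in cand if 0 <= a < N and 0 <= b < N]

start, goal = (0, 0), (0, N - 1)
Q = {}                              # Q[(state, next_state)] = estimated cost-to-go
alpha, gamma, eps = 0.5, 1.0, 0.2   # learning rate, discount, exploration
for _ in range(4000):               # training episodes
    s = start
    for _ in range(8 * N):
        if s == goal:
            break
        nbrs = neighbors(s)
        s2 = random.choice(nbrs) if random.random() < eps else \
            min(nbrs, key=lambda n: Q.get((s, n), 0.0))
        cost = edge_cost(s, s2)
        future = 0.0 if s2 == goal else \
            min(Q.get((s2, n), 0.0) for n in neighbors(s2))
        Q[(s, s2)] = (1 - alpha) * Q.get((s, s2), 0.0) + alpha * (cost + gamma * future)
        s = s2

# Greedy rollout of the learned policy
path, s = [start], start
while s != goal and len(path) < 8 * N:
    s = min(neighbors(s), key=lambda n: Q.get((s, n), float("inf")))
    path.append(s)
path_cost = sum(edge_cost(a, b) for a, b in zip(path, path[1:]))
straight_cost = float(N - 1)  # staying on row 0 costs 1.0 per step
```

On this toy cost surface, the learned path dips into the expanded rows and beats the standard path, mirroring the reported variance reduction for optimal expanded-space paths; in the real setting the per-edge cost would be an estimated contribution to the free energy estimator's variance rather than a hand-set constant.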

Relevance:

30.00%

Publisher:

Abstract:

The observation chart is, for many health professionals (HPs), the primary source of objective information relating to the health of a patient. Information Systems (IS) research has demonstrated the positive impact of good interface design on decision making, and it follows that good observation chart design can positively impact healthcare decision making. Despite this potential, there is a paucity of observation chart design literature, with the existing work primarily leveraging Human-Computer Interaction (HCI) literature to design better charts. While this approach has been successful, it introduces a gap between understanding of the tasks performed by HPs when using charts and the design features implemented in the chart. Good IS allow for the collection and manipulation of data so that it can be presented in a timely manner that supports specific tasks. Good interface design should therefore consider the specific tasks being performed prior to designing the interface. This research adopts a Design Science Research (DSR) approach to formalise a framework of design principles that incorporates knowledge of the tasks performed by HPs when using observation charts together with knowledge pertaining to visual representations of data and the semiology of graphics. The research is presented in three phases. The initial two phases seek to discover and formalise design knowledge embedded in two situated observation charts: the paper-based NEWS chart developed by the Health Service Executive in Ireland, and the electronically generated eNEWS chart developed by the Health Information Systems Research Centre in University College Cork. A comparative evaluation of each chart is also presented in the respective phases.
Throughout each of these phases, tentative versions of a design framework for electronic vital sign observation charts are presented, with each subsequent iteration of the framework (versions Alpha, Beta, V0.1 and V1.0) representing a refinement of the design knowledge. The design framework is named the framework for the Retrospective Evaluation of Vital Sign Information from Early Warning Systems (REVIEWS). Phase 3 of the research presents the deductive process for designing and implementing V0.1 of the framework, with evaluation of the instantiation allowing for the final iteration, V1.0, of the framework. This study makes a number of contributions to academic research. First, the research demonstrates that the cognitive tasks performed by nurses during clinical reasoning can be supported through good observation chart design. Second, the research establishes the utility of electronic vital sign observation charts in terms of supporting the cognitive tasks performed by nurses during clinical reasoning. Third, the REVIEWS framework represents a comprehensive set of design principles which, if applied to chart design, will improve the usefulness of the chart in terms of supporting clinical reasoning. Fourth, the electronic observation chart that emerges from this research is demonstrated to be significantly more useful than previously designed charts and represents a significant contribution to practice. Finally, the research presents a research design that employs a combination of inductive and deductive design activities to iterate on the design of situated artefacts.

Relevance:

30.00%

Publisher:

Abstract:

The authors explored whether a testing effect occurs not only for retention of facts but also for application of principles and procedures. For that purpose, 38 high school students either repeatedly studied a text on probability calculations or studied the text, took a test on the content, restudied the text, and finally took the test a second time. Results show that testing not only leads to better retention of facts than restudying, but also to better application of acquired knowledge (i.e., principles and procedures) in high school statistics. In other words, testing seems not only to benefit fact retention, but also positively affects deeper learning.

Relevance:

30.00%

Publisher:

Abstract:

A two-stage approach to performing ab initio calculations on medium- and large-sized molecules is described. The first step is to perform SCF calculations on small molecules or molecular fragments using the OPIT program. This employs a small basis set of spherical and p-type Gaussian functions. The Gaussian functions can be identified very closely with atomic cores, bond pairs, lone pairs, etc. The position and exponent of any of the Gaussian functions can be varied by OPIT to produce a small but fully optimised basis set. The second stage is the molecular fragments method. As an example of this, Gaussian exponents and distances are taken from an OPIT calculation on ethylene and used unchanged in a single SCF calculation on benzene. Approximate ab initio calculations of this type give much useful information and are often preferable to semi-empirical approaches, since the nature of the approximations involved is much better defined.
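As a hedged illustration of the kind of quantity a basis-set program must evaluate when exponents and positions are varied (this is the standard textbook formula, not OPIT's actual code), the overlap of two normalized s-type Gaussians has a simple closed form:

```python
import math

def s_overlap(alpha, beta, R):
    """Overlap of normalized s-type Gaussians exp(-alpha*r_A^2), exp(-beta*r_B^2)
    whose centres are a distance R apart (atomic units); analytic result."""
    p = alpha + beta
    norm = (2.0 * math.sqrt(alpha * beta) / p) ** 1.5
    return norm * math.exp(-alpha * beta * R * R / p)

# Illustrative exponents, not OPIT's optimised values
identical = s_overlap(0.5, 0.5, 0.0)   # same exponent, same centre -> exactly 1
separated = s_overlap(0.5, 0.5, 2.0)   # overlap decays with separation
```

The smooth dependence of such integrals on exponent and position is what makes the full optimisation of a small basis set, as performed by OPIT, tractable.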

Relevance:

30.00%

Publisher:

Abstract:

This research investigates pro-poor tourism (PPT), which has previously been considered only in a third-world context, in a first-world country, determining whether PPT principles are being used to alleviate poverty in a developed location: Govan, Glasgow, in Scotland. The research develops and applies a new PPT principles tool to regeneration projects in the area and reveals a significant level of PPT application there. The findings suggest that PPT can be an over-complication of a common-sense development approach that any responsible government should promote. The results also question the validity of community-based tourism initiatives.

Relevance:

30.00%

Publisher:

Abstract:

In the last three decades, there has been broad academic and industrial interest in conjugated polymers as semiconducting materials for organic electronics. Their applications in polymer light-emitting diodes (PLEDs), polymer solar cells (PSCs), and organic field-effect transistors (OFETs) offer opportunities for the resolution of energy issues as well as the development of display and information technologies [1]. Conjugated polymers provide several advantages including low cost, light weight, good flexibility, and solubility, which make them readily processed and easily printed, removing the need for conventional photolithography in patterning [2]. A large library of polymer semiconductors has been synthesized and investigated with different building blocks, such as acenes or thiophene and its derivatives, which have been employed to design new materials according to individual demands for specific applications. To design ideal conjugated polymers for specific applications, some general principles should be taken into account, including (i) side chains, (ii) molecular weights, (iii) band gap and HOMO and LUMO energy levels, and (iv) a suitable morphology [3-6]. The aim of this study is to elucidate the impact that substitution exerts on the molecular and electronic structure of π-conjugated polymers with outstanding performances in organic electronic devices. Different configurations of the π-conjugated backbones are analyzed: (i) donor-acceptor configurations, (ii) 1D linear or 2D branched conjugated backbones, and (iii) encapsulated polymers (see Figure 1). Our combined vibrational spectroscopy and DFT study shows that small changes in the substitution pattern and in the molecular configuration have a strong impact on the electronic characteristics of these polymers. We hope this study can advance useful structure-property relationships of conjugated polymers and guide the design of new materials for organic electronic applications.

Relevance:

30.00%

Publisher:

Abstract:

One of the most exciting discoveries in astrophysics of the last decade is the sheer diversity of planetary systems. These include "hot Jupiters", giant planets so close to their host stars that they orbit once every few days; "Super-Earths", planets with sizes intermediate between those of Earth and Neptune, of which no analogs exist in our own solar system; multi-planet systems with planets ranging from smaller than Mars to larger than Jupiter; planets orbiting binary stars; free-floating planets flying through the emptiness of space without any star; and even planets orbiting pulsars. Despite these remarkable discoveries, the field is still young, and there are many areas about which precious little is known. In particular, we do not yet know of planets orbiting the Sun-like stars nearest to our own solar system, and we know very little about the compositions of extrasolar planets. This thesis provides developments in those directions, through two instrumentation projects.

The first chapter of this thesis concerns detecting planets in the Solar neighborhood using precision stellar radial velocities, also known as the Doppler technique. We present an analysis determining the most efficient way to detect planets considering factors such as spectral type, wavelengths of observation, spectrograph resolution, observing time, and instrumental sensitivity. We show that G and K dwarfs observed at 400-600 nm are the best targets for surveys complete down to a given planet mass and out to a specified orbital period. Overall we find that M dwarfs observed at 700-800 nm are the best targets for habitable-zone planets, particularly when including the effects of systematic noise floors caused by instrumental imperfections. Somewhat surprisingly, we demonstrate that a modestly sized observatory, with a dedicated observing program, is up to the task of discovering such planets.
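The size of the signal the Doppler technique must measure follows from the standard radial-velocity semi-amplitude relation; this is a generic textbook formula, not the chapter's survey-optimization analysis, with Jupiter around the Sun as a familiar check:

```python
import math

G = 6.674e-11      # gravitational constant, SI units
M_SUN = 1.989e30   # kg
M_JUP = 1.898e27   # kg

def rv_semi_amplitude(m_planet, m_star, period_s, inclination=math.pi / 2, ecc=0.0):
    """Stellar reflex semi-amplitude K in m/s for a planet on a Keplerian orbit."""
    return ((2.0 * math.pi * G / period_s) ** (1.0 / 3.0)
            * m_planet * math.sin(inclination)
            / ((m_star + m_planet) ** (2.0 / 3.0) * math.sqrt(1.0 - ecc ** 2)))

# Jupiter orbiting the Sun: roughly 12.5 m/s over an 11.86-year period
K_jup = rv_semi_amplitude(M_JUP, M_SUN, 11.86 * 365.25 * 86400.0)
```

An Earth-mass planet in a one-year orbit produces a semi-amplitude of order 0.1 m/s, which is why the instrumental systematic noise floors mentioned above dominate the discussion of habitable-zone planet detection.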

We present just such an observatory in the second chapter, called the "MINiature Exoplanet Radial Velocity Array," or MINERVA. We describe the design, which uses a novel multi-aperture approach to increase stability and performance through lower system etendue, while keeping costs and time to deployment down. We present calculations of the expected planet yield, and data showing the system performance from our testing and development at Caltech's campus. We also present the motivation, design, and performance of a fiber coupling system for the array, critical for efficiently and reliably bringing light from the telescopes to the spectrograph. We finish by presenting the current status of MINERVA, operational at Mt. Hopkins observatory in Arizona.

The second part of this thesis concerns a very different method of planet detection, direct imaging, which involves discovery and characterization of planets by collecting and analyzing their light. Directly analyzing planetary light is the most promising way to study their atmospheres, formation histories, and compositions. Direct imaging is extremely challenging, as it requires a high performance adaptive optics system to unblur the point-spread function of the parent star through the atmosphere, a coronagraph to suppress stellar diffraction, and image post-processing to remove non-common path "speckle" aberrations that can overwhelm any planetary companions.

To this end, we present the "Stellar Double Coronagraph," or SDC, a flexible coronagraphic platform for use with the 200" Hale telescope. It has two focal and pupil planes, allowing for a number of different observing modes, including multiple vortex phase masks in series for improved contrast and inner working angle behind the obscured aperture of the telescope. We present the motivation, design, performance, and data reduction pipeline of the instrument. In the following chapter, we present some early science results, including the first image of a companion to the star delta Andromeda, which had been previously hypothesized but never seen.

A further chapter presents a wavefront control code developed for the instrument, using the technique of "speckle nulling," which can remove optical aberrations from the system using the deformable mirror of the adaptive optics system. This code allows for improved contrast and inner working angles, and was written in a modular style so as to be portable to other high contrast imaging platforms. We present its performance on optical, near-infrared, and thermal infrared instruments on the Palomar and Keck telescopes, showing how it can improve contrasts by a factor of a few in less than ten iterations.

One of the large challenges in direct imaging is sensing and correcting the electric field in the focal plane to remove scattered light that can be much brighter than any planets. In the last chapter, we present a new method of focal-plane wavefront sensing, combining a coronagraph with a simple phase-shifting interferometer. We present its design and implementation on the Stellar Double Coronagraph, demonstrating its ability to create regions of high contrast by measuring and correcting for optical aberrations in the focal plane. Finally, we derive how it is possible to use the same hardware to distinguish companions from speckle errors using the principles of optical coherence. We present results observing the brown dwarf HD 49197b, demonstrating the ability to detect it despite it being buried in the speckle noise floor. We believe this is the first detection of a substellar companion using the coherence properties of light.

Relevance:

30.00%

Publisher:

Abstract:

A rapid and efficient method to identify the weak points of the complex chemical structure of low band gap (LBG) polymers, designed for efficient solar cells, when subjected to light exposure is reported. This tool combines Electron Paramagnetic Resonance (EPR) using the 'spin trapping method' coupled with density functional theory (DFT) modelling. First, the nature of the short-lifetime radicals formed during the early stages of photo-degradation processes is determined by a spin-trapping technique. Two kinds of short-lifetime radicals (R and R′O) are formed after 'short-duration' illumination in an inert atmosphere and in ambient air, respectively. Second, simulation allows the identification of the chemical structures of these radicals, revealing the most probable photochemical process, namely homolytic scission between the Si atom of the conjugated skeleton and its pendant side-chains. Finally, DFT calculations confirm the homolytic cleavage observed by EPR, as well as the presence of a group that is highly susceptible to photooxidative attack. The synergetic coupling of a spin-trapping method with DFT calculations is therefore shown to be a rapid and efficient method for providing unprecedented information on photochemical mechanisms. This approach will allow the design of LBG polymers without the need to trial the material within actual solar cell devices, an often long and costly screening procedure.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents the results of a domestic Chinese undergraduate engineering course taught by international Australasian teaching staff. The project is part of a teaching collaboration between Deakin University and Wuhan University of Science and Technology. The cohort comprised freshman undergraduate mechanical engineering students from Wuhan; the particular subject was a freshman engineering-materials course taught in English, covering an introduction to material-science principles and practices. A survey was used to evaluate student perceptions. This study aims to help academics from Deakin University better understand student experiences and to identify the current challenges and barriers in student learning. Analysis of the survey showed that 90% of students agreed that they were motivated to learn and achieve the learning goals through this collaborative program. Around 90% of students found that group-based practical activities were helpful in achieving learning goals. Overall, 90% of students strongly agreed that they were satisfied with the method of teaching.