976 results for optimum currency area theory


Relevance:

30.00%

Publisher:

Abstract:

Distance teaching (DT) in technical education has a number of distinctive features: complex informative content; the need to develop simulation models and trainers for practical and laboratory classes; knowledge diagnostics based on mathematically grounded algorithms; and the organization of collective applied projects. To support teaching of the fundamental control-systems discipline Theory of Automatic Control (TAC), a combined approach was chosen: an optimum mix of existing software tools for DT support and our own developments. The DT TAC system includes a distance course (DC) in TAC, a virtual laboratory site for practical work in LAB.TAC, and the remote student knowledge diagnostic system d-tester.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is (1) to highlight some recent and heretofore unpublished results in the theory of multiplier sequences and (2) to survey some open problems in this area of research. For the sake of clarity of exposition, we have grouped the problems into three subsections, although several of the problems are interrelated. For the reader’s convenience, we have included the pertinent definitions, cited references and related results, and, in several instances, elucidated the problems by examples.
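
For orientation, the classical definition underlying the survey can be stated compactly (a standard formulation from the Pólya–Schur tradition, not quoted from this paper):

    A sequence of real numbers $\{\gamma_k\}_{k=0}^{\infty}$ is a multiplier sequence
    (of the first kind) if, for every polynomial $p(x) = \sum_{k=0}^{n} a_k x^k$
    with only real zeros, the polynomial
    \[
        \Gamma[p](x) \;=\; \sum_{k=0}^{n} \gamma_k\, a_k\, x^k
    \]
    also has only real zeros.

For example, $\gamma_k = k$ is a multiplier sequence: here $\Gamma[p](x) = x\,p'(x)$, and real-rootedness is preserved by differentiation (Rolle's theorem) and by the extra zero at the origin.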

Relevance:

30.00%

Publisher:

Abstract:

The classical qualitative results of main-line (turnpike) theory are generalized to a dynamic input-output balance optimization model of an open economy, for the case in which exports and imports are tied to the output of basic production and the criterion functional represents the final state of the economy.
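
As a rough illustration of the class of models involved (a hedged sketch of a dynamic Leontief-type balance; the paper's exact formulation may differ), trade tied to output and a terminal-state objective can be written as:

    \max_{x_0, \dots, x_T} \; \langle c, x_T \rangle
    \quad \text{subject to} \quad
    x_t \;\ge\; A x_t + B\,(x_{t+1} - x_t) + (E - I)\,x_t, \qquad x_t \ge 0,

where $x_t$ is the output vector, $A$ the matrix of direct input coefficients, $B$ the capital-coefficient matrix, and $E$ and $I$ link exports and imports to the output of basic production. Main-line (turnpike) results describe how optimal trajectories $x_t$ stay close to a balanced-growth ray for most of the planning horizon.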

Relevance:

30.00%

Publisher:

Abstract:

In the specific area of software engineering (SE) for self-adaptive systems (SASs) there is growing research awareness of the synergy between SE and artificial intelligence (AI). However, only a few significant results have been published so far. In this paper, we propose a novel and formal Bayesian definition of surprise as the basis for quantitative analysis to measure degrees of uncertainty and deviations of self-adaptive systems from normal behavior. Surprise measures how observed data affect the models or assumptions of the world at runtime. The key idea is that a "surprising" event can be defined as one that causes a large divergence between the belief distributions prior and posterior to the event occurring. In such a case, the system may decide either to adapt accordingly or to flag that an abnormal situation is happening. We discuss possible applications of the Bayesian theory of surprise to self-adaptive systems using Bayesian dynamic decision networks. Copyright © 2014 ACM.
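
A minimal sketch of the core quantity (our own illustration, assuming simple Beta-Bernoulli beliefs; the paper itself works with Bayesian dynamic decision networks): surprise is the divergence between the belief distribution before and after an observation, here the KL divergence from posterior to prior.

    from scipy.special import digamma, betaln

    def kl_beta(a1, b1, a2, b2):
        """KL divergence KL(Beta(a1, b1) || Beta(a2, b2))."""
        return (betaln(a2, b2) - betaln(a1, b1)
                + (a1 - a2) * digamma(a1)
                + (b1 - b2) * digamma(b1)
                + (a2 - a1 + b2 - b1) * digamma(a1 + b1))

    def bayesian_surprise(prior_a, prior_b, successes, failures):
        """Surprise of an observation batch: KL(posterior || prior)."""
        post_a, post_b = prior_a + successes, prior_b + failures
        return kl_beta(post_a, post_b, prior_a, prior_b)

    # A run of failures in a component believed reliable is highly surprising.
    print(bayesian_surprise(prior_a=9, prior_b=1, successes=0, failures=5))

A system would adapt, or flag an abnormal situation, whenever this value exceeds a calibrated threshold.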

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To evaluate the effect of reducing the number of visual acuity measurements made in a defocus curve on the quality of data quantified. Setting: Midland Eye, Solihull, United Kingdom. Design: Evaluation of a technique. Methods: Defocus curves were constructed by measuring visual acuity on a distance logMAR letter chart, randomizing the test letters between lens presentations. The lens powers evaluated ranged between +1.50 diopters (D) and -5.00 D in 0.50 D steps, which were also presented in a randomized order. Defocus curves were measured binocularly with the Tecnis diffractive, Rezoom refractive, Lentis rotationally asymmetric segmented (+3.00 D addition [add]), and Finevision trifocal multifocal intraocular lenses (IOLs) implanted bilaterally, and also for the diffractive IOL and refractive or rotationally asymmetric segmented (+3.00 D and +1.50 D adds) multifocal IOLs implanted contralaterally. Relative and absolute range of clear-focus metrics and area metrics were calculated for curves fitted using 0.50 D, 1.00 D, and 1.50 D steps and a near add-specific profile (ie, distance, half the near add, and the full near-add powers). Results: A significant difference in simulated results was found in at least 1 of the relative or absolute range of clear-focus or area metrics for each of the multifocal designs examined when the defocus-curve step size was increased (P<.05). Conclusion: Faster methods of capturing defocus curves from multifocal IOL designs appear to distort the metric results and are therefore not valid. Financial Disclosure: No author has a financial or proprietary interest in any material or method mentioned. © 2013 ASCRS and ESCRS.
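
To make the metrics concrete, here is a hedged sketch (our own illustration, not the authors' code) of two defocus-curve summaries of the kind described: an area metric above a criterion acuity, and the range of clear focus at that criterion, computed from acuity sampled at each defocus step.

    import numpy as np

    def trapezoid(y, x):
        """Trapezoid-rule integral of y over x."""
        y, x = np.asarray(y, dtype=float), np.asarray(x, dtype=float)
        return float(np.sum((y[1:] + y[:-1]) / 2.0 * np.diff(x)))

    def defocus_metrics(defocus_d, acuity_logmar, criterion=0.3):
        """Area and range-of-clear-focus metrics from one defocus curve.

        defocus_d     -- defocus lens powers in diopters, ascending order
        acuity_logmar -- visual acuity at each power (lower is better)
        criterion     -- acuity cutoff defining 'clear' focus (logMAR)
        """
        d = np.asarray(defocus_d, dtype=float)
        va = np.asarray(acuity_logmar, dtype=float)
        # Area metric: integrate acuity better than the criterion.
        area = trapezoid(np.clip(criterion - va, 0.0, None), d)
        # Range of clear focus: total defocus span where acuity beats the
        # criterion (approximated on the sampled grid).
        clear_range = trapezoid((va <= criterion).astype(float), d)
        return area, clear_range

    # 0.50 D sampling from +1.50 D to -5.00 D, as in the study design:
    powers = np.arange(-5.0, 1.51, 0.5)
    acuity = np.interp(powers, [-5.0, -2.5, 0.0, 1.5], [0.6, 0.35, 0.0, 0.4])
    print(defocus_metrics(powers, acuity))

Re-running the same computation on curves subsampled at 1.00 D or 1.50 D steps is then a direct way to test whether coarser protocols distort the metrics.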

Relevance:

30.00%

Publisher:

Abstract:

Dry eye disease is a common clinical condition whose aetiology and management challenge clinicians and researchers alike. Practitioners have a number of dry eye tests available to assess dry eye disease clinically, in order to treat their patients effectively and successfully. This thesis set out to determine the most relevant and successful key tests for dry eye disease diagnosis and management. There has been very little research on determining the most effective treatment options for these patients; therefore, a randomised controlled study was conducted to see how different artificial tear treatments perform compared to each other, whether the preferred treatment could have been predicted from the ocular clinical assessment, and whether the preferred treatment subjectively related to the greatest improvement in ocular physiology and tear film stability. This research has found:
1. Of the plethora of tear tests available in clinical practice, the tear stability tests, non-invasive tear break-up time (NITBUT) and invasive tear break-up time (NaFL TBUT), are strongly correlated. The tear volume tests, phenol red thread (PRT) and tear meniscus height (TMH), are also related. Lid-parallel conjunctival folds (LIPCOF) and conjunctival staining are significantly correlated with one another. Symptomology and osmolarity were also found to be important tests in assessing dry eye.
2. Artificial tear supplements improve ocular comfort as well as the ocular surface, as observed by conjunctival staining and the reduction in LIPCOF. There is no strong evidence that one type of artificial tear supplement is more effective than another, and the data suggest that these improvements owe more to time than to the specific drops.
3. When trying to predict patient preference for artificial tears from baseline measurements, each category of artificial tear supplement produced an improvement in at least one tear metric. The patients' preferred artificial tear supplements were nevertheless rated much higher than the other three drops used in the study, and the subjective responses were more statistically significant than the signs.
4. Patients are also willing to pay £17 for a community dry eye service in their area.
In conclusion, the dry eye tests conducted in the study correlate with one another and with the symptoms reported by the patient. Artificial tears make a difference objectively as well as subjectively. There is no optimum artificial treatment for dry eye; however, regular, consistent use of artificial eye drops will improve the ocular surface.

Relevance:

30.00%

Publisher:

Abstract:

Presbyopia is a consequence of ageing and is therefore increasing in prevalence due to the growth of the ageing population. Of the many methods available to manage presbyopia, contact lenses are a tried and tested reversible option for those wishing to be spectacle-free. Contact lens options to correct presbyopia include multifocal contact lenses and monovision. Several options have been available for many years, with guides to help choose multifocal contact lenses, but there is no comprehensive way to help the practitioner select the best option for an individual. An examination of the simplest way of predicting the most suitable multifocal lens for a patient will only enhance and add to the current evidence available. The purpose of the study was to determine the current use of presbyopic correction modalities in an optometric practice population in the UK, and to evaluate and compare the optical performance of four silicone hydrogel soft multifocal contact lenses and compare multifocal performance with contact lens monovision. The principal forms of refractive correction in the presbyopic practice cohort were distance spectacles (with near and intermediate vision provided by a variety of other forms of correction), varifocal spectacles, and unaided distance with reading spectacles, with few patients wearing contact lenses as their primary correction modality. The results of the multifocal contact lens randomised controlled trial showed only minor differences in corneal physiology between the lens options. Visual acuity differences were observed for distance targets, but only for low-contrast letters and under mesopic lighting conditions. At closer distances, between 20 cm and 67 cm, the defocus curves demonstrated significant differences in acuity between lens designs (p < 0.001) and an interaction between lens design and level of defocus (p < 0.001). None of the lenses showed a clear near addition, perhaps due to their more aspheric rather than zoned design. As expected, stereoacuity was reduced with monovision compared with the multifocal contact lens designs, although there were some differences between the multifocal lens designs (p < 0.05). Reading speed did not differ between lens designs (F = 1.082, p = 0.368), whereas there was a significant difference in critical print size (F = 7.543, p < 0.001). Glare was quantified with a novel halometer, and halo size was found to differ significantly between lenses (F = 4.101, p = 0.004). The rating of iPhone image clarity was significantly different between presbyopic corrections (p = 0.002), as was the Near Acuity Visual Questionnaire (NAVQ) rating of near performance (F = 3.730, p = 0.007). Pupil size did not alter with contact lens design (F = 1.614, p = 0.175), but was larger in the dominant eye (F = 5.489, p = 0.025). Pupil decentration relative to the optical axis did not alter with contact lens design (F = 0.777, p = 0.542), but was also greater in the dominant eye (F = 9.917, p = 0.003). Interestingly, there was no difference in induced spherical aberration between the contact lens designs (p > 0.05), with eye dominance (p > 0.05) or with optical component (ocular, corneal or internal: p > 0.05). In terms of subjective patient lens preference, 10 patients preferred monovision, 12 the Biofinity multifocal lens, 7 the Purevision 2 for Presbyopia, 4 the AirOptix multifocal and 2 the Oasys multifocal contact lens.
However, there were no differences in demographic factors relating to lifestyle or personality, or in physiological characteristics such as pupil size or ocular aberrations measured at baseline, which would allow a practitioner to identify which lens modality a patient would prefer. In terms of patients' performance with their preferred lens, those preferring the Biofinity multifocal lens had better high-contrast acuity under photopic conditions, maintained their reading speed at smaller print sizes, and subjectively rated iPhone clarity as better with this lens than with the other lens designs trialled. Patients who preferred monovision had lower acuity across a range of distances and a larger area of glare than patients preferring other lens designs, which was unexplained by the clinical metrics measured; it seems that a complex interaction of aberrations may drive lens preference. New clinical tests, or more diverse lens designs, which may allow practitioners to prescribe first time the presbyopic contact lens option that will work best for each patient, remain a hope for the future.

Relevance:

30.00%

Publisher:

Abstract:

Measurements of area summation for luminance-modulated stimuli are typically confounded by variations in sensitivity across the retina. Recently, we conducted a detailed analysis of sensitivity across the visual field (Baldwin et al., 2012) and found it to be well described by a bilinear “witch’s hat” function: sensitivity declines rapidly over the first 8 cycles or so, and more gently thereafter. Here we multiplied luminance-modulated stimuli (4 c/deg gratings and “Swiss cheeses”) by the inverse of the witch’s hat function to compensate for the inhomogeneity. This revealed summation functions that were straight lines (on double log axes) with a slope of -1/4 extending to ≥33 cycles, demonstrating fourth-root summation of contrast over a wider area than has previously been reported for the central retina. Fourth-root summation is typically attributed to probability summation, but recent studies have rejected that interpretation in favour of a noisy energy model that performs local square-law transduction of the signal, adds noise at each location of the target, and then sums over signal area. Modelling shows our results to be consistent with a wide-field application of such a contrast integrator. We reject a probability summation model, a quadratic model and a matched template model of our results under the assumptions of signal detection theory. We also reject the high threshold theory of contrast detection under the assumption of probability summation over area.
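
A hedged sketch of the compensation idea (our own illustration; the slope and knee parameters are placeholders, not the values fitted by Baldwin et al., 2012): attenuate each stimulus location by the inverse of a bilinear "witch's hat" sensitivity profile so that every location sits at the same multiple of its own threshold, then look for a summation slope of -1/4 on double log axes.

    import numpy as np

    def witchs_hat(ecc_cycles, knee=8.0, steep=0.08, shallow=0.01):
        """Bilinear decline of log sensitivity with eccentricity (in carrier
        cycles): a steep fall out to the knee, a shallow fall beyond it.
        Returns sensitivity relative to fixation (1.0 at zero eccentricity)."""
        ecc = np.asarray(ecc_cycles, dtype=float)
        drop = np.where(ecc <= knee,
                        steep * ecc,
                        steep * knee + shallow * (ecc - knee))
        return 10.0 ** (-drop)

    # Compensate the stimulus: divide local contrast by local sensitivity.
    ecc = np.linspace(0.0, 33.0, 100)
    compensated_contrast = 0.1 / witchs_hat(ecc)

    # Fourth-root summation predicts log threshold falling with slope -1/4
    # as log stimulus area grows:
    areas = np.logspace(0, 2, 20)
    predicted_threshold = areas ** -0.25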

Relevance:

30.00%

Publisher:

Abstract:

Leadership is one of the most examined factors in relation to understanding employee wellbeing and performance. While there are disparate approaches to studying leadership, they share a common assumption that perceptions of a leader's behavior determine reactions to the leader. The concept of leadership perception is poorly understood in most theoretical approaches. To address this, we propose that there are many benefits to examining leadership perceptions as an attitude towards the leader. In this review, we show how research examining a number of aspects of attitudes (content, structure and function) can advance understanding of leadership perceptions and how these affect work-related outcomes. Such a perspective provides a more multi-faceted understanding of leadership perceptions than previously envisaged, and in turn a more detailed account of how such perceptions affect outcomes. In addition, we examine some of the main theoretical and methodological implications of viewing leadership perceptions as attitudes for the wider leadership area. The cross-fertilization of research from the attitudes literature to understanding leadership perceptions provides new insights into leadership processes and potential avenues for further research. © 2015 Elsevier Inc. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing WIP and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied with success to services. Bottlenecks in services must be defined as those processes for which the process rates and the amount of work remaining are such that completing the process on time will not be possible without an increase in the process rate. The bottleneck ratio is used to determine the degree to which a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective (reducing costs related to specific flights) rather than the local-optimum operational approach (turning all aircraft quickly) results in significant savings to the company. Simulated savings to the airline's annual operating costs equaled 30% of current possible expenses for misconnecting passengers, with a modest increase in worker utilization achieved through a more efficient heuristic of deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach to managing service factory operations similar to airline turnaround operations using the management philosophy of the Theory of Constraints.
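
A hedged sketch of the constraint-identification idea described above (our own reading; the dissertation's exact bottleneck-ratio formula may differ): a task is a constraint when its remaining work, at its current process rate, cannot finish in the time remaining, and the ratio of required time to available time grades how severe the constraint is.

    from dataclasses import dataclass

    @dataclass
    class TurnTask:
        name: str
        work_remaining: float  # units of work left (e.g., bags to load)
        process_rate: float    # units of work per minute at current staffing

    def bottleneck_ratio(task: TurnTask, minutes_to_departure: float) -> float:
        """Ratio > 1 means the task cannot finish on time at its current rate."""
        time_needed = task.work_remaining / task.process_rate
        return time_needed / minutes_to_departure

    # Deploy scarce ramp workers to the most constrained tasks first:
    tasks = [TurnTask("baggage", 120, 4.0), TurnTask("fueling", 30, 1.5),
             TurnTask("catering", 20, 2.0)]
    for t in sorted(tasks, key=lambda t: bottleneck_ratio(t, 25), reverse=True):
        print(t.name, round(bottleneck_ratio(t, 25), 2))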

Relevance:

30.00%

Publisher:

Abstract:

While the effect of education and migration on development is neither clear-cut nor obvious, regimes such as Jamaica's have traditionally placed great emphasis on development through education at all levels. The process of human resource development and the accumulation of human capital is intended to unlock the door to modernization. Nevertheless, our findings indicate a considerable loss of professional and skilled personnel -- the same group that embodies a disproportionate amount of educational expenditure relative to the population. Insofar as planning is concerned, this migration represents a negative factor. The developing country of Jamaica is unintentionally supplying the developed world with an "annual gift" of human capital which its economy cannot afford. The major issue becomes: to what extent can any government "protect" its investments by restricting movements of capital and people? The general assumption of this paper is that the question of human rights cannot be ignored, especially in democracies (which Jamaica decidedly is), where movement is seen as an ingrained human right. During the 1970s and 1980s, Jamaica and the Caribbean as a whole lost much through the migration of intellectual capital. Yet brains may also die in their own environment if deprived of the ability to create their own criteria and goals. Forcing people to stay with their money and know-how may only serve to produce an economic environment overgrown with weeds of lethargy, indolence and mediocrity.

Relevance:

30.00%

Publisher:

Abstract:

Maternity nursing practice is changing across Canada with the movement toward becoming “baby friendly.” The World Health Organization (WHO) recommends the Baby-Friendly Hospital Initiative (BFHI) as a standard of care in hospitals worldwide. Very little research has been conducted with nurses to explore the impact of the initiative on nursing practice. The purpose of this study, therefore, was to examine the process of implementing the BFHI for nurses. The study was carried out using Corbin and Strauss’s method of grounded theory. Theoretical sampling was employed, which resulted in recruiting and interviewing 13 registered nurses whose area of employment included neonatal intensive care, postpartum, and labour and delivery. The data analysis revealed a central category of resisting the BFHI. All of the nurses disagreed with some of the 10 steps to becoming a baby-friendly hospital as outlined by the WHO. Participants questioned the science and safety of aspects of the BFHI. Also, participants indicated that the implementation of this program did not substantially change their nursing practice. They empathized with new mothers and anticipated being collectively reprimanded by management should they not follow the initiative. Five conditions influenced their responses to the initiative, which were (a) an awareness of a pro-breastfeeding culture, (b) imposition of the BFHI, (c) knowledge of the health benefits of breastfeeding, (d) experiential knowledge of infant feeding, and (e) the belief in the autonomy of mothers to decide about infant feeding. The identified outcomes were moral distress and division between nurses. The study findings could guide decision making concerning the implementation of the BFHI.

Relevance:

30.00%

Publisher:

Abstract:

As the world population grows past seven billion people and global challenges persist, including resource availability, biodiversity loss, climate change and human well-being, a new science is required that can address the integrated nature of these challenges and the multiple scales on which they are manifest. Sustainability science has emerged to fill this role. In the fifteen years since it was first called for in the pages of Science, it has rapidly matured; however, its place in the history of science and the way it is practiced today must be continually evaluated. In Part I, two chapters address this theoretical and practical grounding. Part II transitions to the applied practice of sustainability science in addressing the urban heat island (UHI) challenge, wherein the climate of urban areas is warmer than that of their surrounding rural environs. The UHI has become increasingly important within the study of earth sciences, given the increased focus on climate change and because the majority of humans now live in urban areas.

In Chapter 2, a novel contribution to the historical context of sustainability is argued. Sustainability as a concept characterizing the relationship between humans and nature emerged in the mid to late 20th century as a response to the same findings used to characterize the Anthropocene. Evidence is provided that sustainability emerged from the human-nature relationships that came before it, was enabled by technology and a reorientation of world-view, and is unique in its global boundary, its systematic approach, and its ambition for both well-being and the continued availability of resources and Earth-system function. Sustainability is, further, an ambition with wide appeal, making it one of the first normative concepts of the Anthropocene.

Despite its widespread emergence and adoption, sustainability science continues to suffer from definitional ambiguity within the academe. In Chapter 3, a review of efforts to provide direction and structure to the science reveals a continuum of approaches anchored at either end by differing visions of how the science interfaces with practice (solutions). At one end, basic science of societally defined problems informs decisions about possible solutions and their application. At the other end, applied research directly affects the options available to decision makers. While this dichotomy is clear in the literature, survey data suggest that it is less apparent in the minds of practitioners.

In Chapter 4, the UHI is first addressed at the synoptic mesoscale. Urban climate is the most immediate manifestation of the warming global climate for the majority of people on earth. Nearly half of those people live in small to medium-sized cities, an understudied scale in urban climate research. Widespread characterization would be useful to decision makers in planning and design. Using a multi-method approach, the mesoscale UHI in the study region is characterized and its secular trend over the last sixty years evaluated. Under isolated ideal conditions, the findings indicate a UHI of 5.3 ± 0.97 °C to be present in the study area, the magnitude of which is growing over time.

Although urban heat islands (UHI) are well studied, there remain no panaceas for local-scale mitigation and adaptation methods; continued attention to characterizing the phenomenon in urban centers of different scales around the globe is therefore required. In Chapter 5, a local-scale analysis of the canopy-layer and surface UHI in a medium-sized city in North Carolina, USA is conducted using multiple methods, including stationary urban sensors, mobile transects and remote sensing. Focusing on the ideal conditions for UHI development during an anticyclonic summer heat event, the study observes a range of UHI intensity depending on the method of observation: 8.7 °C from the stationary urban sensors; 6.9 °C from mobile transects; and 2.2 °C from remote sensing. Additional attention is paid to the diurnal dynamics of the UHI and its correlation with vegetation indices, dewpoint and albedo. Evapotranspiration is shown to drive dynamics in the study region.
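
For concreteness, a minimal sketch of the standard canopy-layer UHI intensity calculation (our own illustration; sensor values are hypothetical): the UHI intensity at each time step is the difference between urban and rural air temperature, and the reported magnitude is typically its maximum over the event.

    import numpy as np

    def uhi_intensity(urban_temps, rural_temps):
        """Canopy-layer UHI intensity: urban minus rural air temperature (degC)."""
        return np.asarray(urban_temps) - np.asarray(rural_temps)

    # Hourly temperatures across one summer night (hypothetical values):
    urban = np.array([31.2, 30.1, 29.4, 28.8, 28.5])
    rural = np.array([29.0, 26.9, 25.2, 24.1, 23.6])
    dT = uhi_intensity(urban, rural)
    print(dT.max())  # peak UHI intensity for the event, here 4.9 degC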

Finally, recognizing that a bridge must be established between the physical science community studying the urban heat island (UHI) effect and the planning community and decision makers implementing urban form and development policies, Chapter 6 evaluates multiple urban form characterization methods. Methods evaluated include local climate zones (LCZ), National Land Cover Database (NLCD) classes and urban cluster analysis (UCA), to determine their utility in describing the distribution of the UHI based on three standard observation types: 1) fixed urban temperature sensors, 2) mobile transects and 3) remote sensing. Bivariate, regression and ANOVA tests are used to conduct the analyses. Findings indicate that the NLCD classes are best correlated to the UHI intensity and distribution in the study area. Further, while the UCA method is not useful directly, the variables included in the method are predictive based on regression analysis, so the potential for better model design exists. Land cover variables including albedo, impervious surface fraction and pervious surface fraction are found to dominate the distribution of the UHI in the study area regardless of observation method.
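
A hedged sketch of the kind of land-cover regression described (illustrative only; site values are hypothetical, not the dissertation's data): ordinary least squares of observed UHI intensity on albedo and impervious surface fraction. Pervious fraction is omitted here because it is one minus the impervious fraction and would be collinear with the intercept.

    import numpy as np

    # Hypothetical per-site predictors: [albedo, impervious surface fraction]
    X = np.array([[0.15, 0.80],
                  [0.20, 0.55],
                  [0.25, 0.30],
                  [0.30, 0.10]])
    uhi = np.array([4.8, 3.5, 2.1, 0.9])  # observed UHI intensity (degC)

    # Ordinary least squares with an intercept column.
    A = np.column_stack([np.ones(len(X)), X])
    coefs, *_ = np.linalg.lstsq(A, uhi, rcond=None)
    print(coefs)  # intercept, albedo effect, impervious-fraction effect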

Chapter 7 provides a summary of findings and offers a brief analysis of their implications, both for the scientific discourse generally and for the study area specifically. In general, the work undertaken does not achieve the full ambition of sustainability science; additional work is required to translate findings into practice and to more fully evaluate adoption. The implications for planning and development in the local region are addressed in the context of a major light-rail infrastructure project, including several systems-level considerations such as human health and development. Finally, several avenues for future work are outlined. Within the theoretical development of sustainability science, these pathways include more robust evaluations of both the theory and its actual practice. Within the UHI context, they include development of an integrated urban form characterization model, application of the study methodology in other geographic areas and at different scales, and use of novel experimental methods, including distributed sensor networks and citizen science.

Relevance:

30.00%

Publisher:

Abstract:

Spectral albedo has been measured at Dome C since December 2012 in the visible and near infrared (400-1050 nm) at sub-hourly resolution using a home-made spectral radiometer. Superficial specific surface area (SSA) has been estimated by fitting the observed albedo spectra to the analytical Asymptotic Approximation Radiative Transfer theory (AART). The dataset includes fully calibrated albedo and SSA that pass several quality checks, as described in the companion article. Only data for solar zenith angles less than 75° have been included, which theoretically spans the period October-March. In addition, to correct for residual errors still affecting the data after calibration, especially at solar zenith angles higher than 60°, we produced a higher-quality albedo time series as follows. In the SSA estimation process described in the companion paper, a scaling coefficient A between the observed albedo and the theoretical model predictions was introduced to cope with these errors; this coefficient thus provides a first-order estimate of the residual error. By dividing the albedo by this coefficient, we produced the "scaled fully-calibrated albedo". We strongly recommend using the latter for most applications because it generally remains in the physical range 0-1. The former albedo is provided for reference to the companion paper and because it does not depend on the SSA estimation process and its underlying assumptions.
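
A minimal sketch of the estimation and rescaling step (our own illustration; the simplified spectral form below is a stand-in, not the exact AART expression used in the companion paper): fit a scaling coefficient A and SSA to each observed spectrum, then divide the albedo by A.

    import numpy as np
    from scipy.optimize import curve_fit

    def albedo_model(gamma, ssa, A):
        """Simplified AART-like spectral albedo: A * exp(-c * sqrt(gamma / ssa)).
        gamma: ice absorption at each wavelength (known); ssa: specific surface
        area (estimated); A: scaling coefficient absorbing residual calibration
        error. The prefactor c is a placeholder, not the theory's exact constant."""
        c = 4.0
        return A * np.exp(-c * np.sqrt(gamma / ssa))

    # Hypothetical absorption spectrum and one observed albedo spectrum.
    rng = np.random.default_rng(0)
    gamma = np.linspace(0.1, 10.0, 50)
    observed = albedo_model(gamma, ssa=30.0, A=1.02) + rng.normal(0.0, 0.002, 50)

    (ssa_fit, A_fit), _ = curve_fit(albedo_model, gamma, observed, p0=[20.0, 1.0])
    scaled_albedo = observed / A_fit  # the "scaled fully-calibrated albedo"
    print(ssa_fit, A_fit)

Dividing by the fitted A is what keeps the scaled product within the physical range 0-1.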

Relevance:

30.00%

Publisher:

Abstract:

Copyright history has long been a subject of intense and contested enquiry. Historical narratives about the early development of copyright were first prominently mobilised in eighteenth-century British legal discourse, during the so-called Battle of the Booksellers between Scottish and London publishers. The two landmark copyright decisions of that time – Millar v. Taylor (1769) and Donaldson v. Becket (1774) – continue to provoke debate today. The orthodox reading of Millar and Donaldson presents copyright as a natural proprietary right at common law inherent in authors. Revisionist accounts dispute that traditional analysis. These conflicting perspectives have once again become the subject of critical scrutiny with the publication in 2014 of Copyright at Common Law in 1774 by Prof Tomas Gomez-Arostegui, in the Connecticut Law Review ((2014) 47 Conn. L. Rev. 1) and as a CREATe Working Paper (No. 2014/16, 3 November 2014).

Taking Prof Gomez-Arostegui’s extraordinary work in this area as a point of departure, Dr Elena Cooper and Professor Ronan Deazley (then both academics at CREATe) organised an event, held at the University of Glasgow on 26th and 27th March 2015, to consider the interplay between copyright history and contemporary copyright policy. Is Donaldson still relevant, and, if so, why? What justificatory goals are served by historical investigation, and what might be learned from the history of the history of copyright? Does the study of copyright history still have any currency within an evidence-based policy context that is increasingly preoccupied with economic impact analysis?

This paper provides a lasting record of these discussions, including an editorial introduction, written comments by each of the panelists and Prof. Gomez-Arostegui, and an edited transcript of the Symposium debate.