455 results for Bang-bang PLL


Relevance:

20.00%

Publisher:

Abstract:

Researching with older participants presents many unique methodological challenges. One reason for this is the greater variability in abilities among older people than among younger people. Thus, the standard practice in user research of assuming homogeneity within a certain demographic group may not work with older adults. Designing experiments for users with diverse capabilities is challenging and calls for a re-examination of existing experimental design methods. In this paper we share our experience of researching with people with diverse capabilities and present the implications and possible ways to address them.

Relevance:

20.00%

Publisher:

Abstract:

Globalisation gives people the chance to move to different places, to dine in different contexts and to experience different lifestyles. This paper evaluates designs that offer a dining experience elsewhere, in a changed context. A narrative review of the literature was conducted to clarify the patterns that restaurant practitioners, designers and social science researchers have used to develop dining experiences elsewhere. The paper identifies a two-hourglass balance pattern, mediated by food, between diners and dining experience providers, as well as a set of interactive strategies in dining experience design; the former can be regarded as an example of the latter. The findings indicate that an empathetic setting-design framework is needed in future research. This is the first paper to examine the dining experience in light of the atmosphere created by people's physical and psychological mobility in modern society. The findings provide a basis for establishing a dining experience design framework in future research, that is, for meeting diners' needs at various levels in dining setting design by distributing multisensory effects to activate diners' involvement in the dining experience.

Relevance:

20.00%

Publisher:

Abstract:

In the context of an international economic shift from manufacturing to services and the constant expansion of industries towards online services (Sheth and Sharma, 2008), this study is concerned with the design of self-service technologies (SSTs) for online environments. An industry heavily adopting SSTs across a variety of different services is health and wellness, where figures show an ever-growing number of health and wellness apps being developed, downloaded and abandoned (Kelley, 2014). Little is known about how to enhance people's engagement with online wellness SSTs to support self-health management and self-efficacy. This literature review argues that the service design of wellness SSTs in online contexts can be improved by developing an enhanced understanding from a people perspective and a customer experience point of view. Customer value, quality of service, usability and self-efficacy all play an important role in understanding how to design SSTs for wellness and keep users engaged. Further study is needed on how people interact and engage with online services in the context of wellness in order to design engaging wellness services.

Relevance:

20.00%

Publisher:

Abstract:

Beth Woods has been hailed as the Queen of Rice by the Australian Centre for International Agricultural Research (ACIAR). Beth's decade-long relationship with the International Rice Research Institute, driving research innovations that make large and measurable changes for rice farmers, has received due recognition in a recent article published in ACIAR's Partners Magazine. Her particular expertise relates to structures and strategies that help get 'the most bang' from the money invested in research.

Relevance:

20.00%

Publisher:

Abstract:

Optimal bang-coast maintenance policies for a machine subject to failure are considered. The approach utilizes a semi-Markov model of the system. A simplified model for modifying the probability of machine failure with maintenance is employed. A numerical example is presented to illustrate the procedure and results.

Relevance:

20.00%

Publisher:

Abstract:

In 1956 Whitham gave a nonlinear theory for computing the intensity of an acoustic pulse of arbitrary shape. The theory has been used very successfully in computing the intensity of the sonic bang produced by a supersonic plane. Gubkin [4] derived an approximate quasi-linear equation for the propagation of a short wave in a compressible medium. These two methods are essentially nonlinear approximations of the perturbation equations of the system of gas-dynamic equations in the neighborhood of a bicharacteristic curve (or ray) for weak unsteady disturbances superimposed on a given steady solution. In this paper we derive an approximate quasi-linear equation which is an approximation of the perturbation equations in the neighborhood of a bicharacteristic curve for a weak pulse governed by a general system of first order quasi-linear partial differential equations in m + 1 independent variables (t, x1,…, xm), and derive Gubkin's result as a particular case when the system of equations consists of the equations of unsteady motion of a compressible gas. We also discuss the form of the approximate equation describing waves propagating upstream in an arbitrary multidimensional transonic flow.
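As a hedged illustration (not the paper's actual derivation): short-wave and ray approximations of this kind typically reduce the perturbation equations to a single quasi-linear transport equation for the leading-order amplitude along a ray, schematically

```latex
% Schematic short-wave / ray approximation: u is the leading-order
% perturbation amplitude, \sigma the distance along the ray, \eta a fast
% phase variable; \alpha encodes the genuine nonlinearity and \beta the
% geometric (ray-tube divergence) attenuation. Symbols are illustrative.
\frac{\partial u}{\partial \sigma}
  + \alpha(\sigma)\, u\, \frac{\partial u}{\partial \eta}
  + \beta(\sigma)\, u = 0
```

The nonlinear term is what distinguishes these theories from linear geometric acoustics and allows them to capture shock formation in the pulse profile.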

Relevance:

20.00%

Publisher:

Abstract:

Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important as the effect of the radiation on the tissue is correlated with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international recommendations for radiotherapy dosimetry are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for investigation of complementary methods for comparison and improved accuracy. In this thesis the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has been generally reproducible within the recommended tolerance value of 2%. An established toolkit for epithermal neutron beams for the determination of the dose components is presented and applied in an international dosimetric intercomparison. The quantities (neutron flux, fast neutron dose and photon dose) measured by the groups in the intercomparison were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3% to 30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centers. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. The presented results exclude severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, the comparison of results with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution normalised to the dose maximum measured by the MAGIC polymer gel was found to agree well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.

Relevance:

20.00%

Publisher:

Abstract:

We currently live in an era characterized by the completion and first runs of the LHC accelerator at CERN, which is hoped to provide the first experimental hints of what lies beyond the Standard Model of particle physics. In addition, the last decade has witnessed a new dawn of cosmology, in which it has truly emerged as a precision science. Largely due to the WMAP measurements of the cosmic microwave background, we now believe we have quantitative control of much of the history of our universe. These two experimental windows offer us not only an unprecedented view of the smallest and largest structures of the universe, but also a glimpse of the very first moments in its history. At the same time, they require theorists to focus on the fundamental challenges awaiting at the boundary of high energy particle physics and cosmology. What were the contents and properties of matter in the early universe? How is one to describe its interactions? What implications do the various models of physics beyond the Standard Model have for the subsequent evolution of the universe? In this thesis, we explore the connection between supersymmetric theories in particular and the evolution of the early universe. First, we provide the reader with a general introduction to modern day particle cosmology from two angles: on one hand by reviewing our current knowledge of the history of the early universe, and on the other by introducing the basics of supersymmetry and its derivatives. Subsequently, with the help of the developed tools, we direct the attention to the specific questions addressed in the three original articles that form the main scientific contents of the thesis. Each of these papers concerns a distinct cosmological problem, ranging from the generation of the matter-antimatter asymmetry to inflation, and finally to the origin or very earliest stage of the universe. They nevertheless share a common factor in their use of the machinery of supersymmetric theories to address open questions in the corresponding cosmological models.

Relevance:

20.00%

Publisher:

Abstract:

In this thesis we examine multi-field inflationary models of the early Universe. Since non-Gaussianities may allow for the possibility of discriminating between models of inflation, we compute deviations from a Gaussian spectrum of primordial perturbations by extending the delta-N formalism. We use N-flation as a concrete model; our findings show that these models are generically indistinguishable as long as the slow roll approximation is still valid. Besides computing non-Gaussianities, we also investigate preheating after multi-field inflation. Within the framework of N-flation, we find that preheating via parametric resonance is suppressed, an indication that it is the old theory of preheating that is applicable. In addition to studying non-Gaussianities and preheating in multi-field inflationary models, we study magnetogenesis in the early universe. To this aim, we propose a mechanism to generate primordial magnetic fields via rotating cosmic string loops. Magnetic fields in the micro-Gauss range have been observed in galaxies and clusters, but their origin has remained elusive. We consider a network of strings and find that rotating cosmic string loops, which are continuously produced in such networks, are viable candidates for magnetogenesis with the relevant strengths and length scales, provided we use a high string tension and an efficient dynamo.
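For reference, a sketch of the standard (unextended) delta-N expressions such a computation builds on; sign and normalisation conventions vary between papers:

```latex
% Separate-universe picture: the curvature perturbation \zeta equals the
% perturbation of the e-fold number N(\phi) with respect to the field
% values \phi^I on an initial flat slice,
\zeta = \delta N
      = N_{,I}\,\delta\phi^{I}
      + \tfrac{1}{2}\, N_{,IJ}\,\delta\phi^{I}\delta\phi^{J} + \cdots ,
% from which the local non-linearity parameter follows as
\frac{6}{5}\, f_{\mathrm{NL}}
      = \frac{N_{,I}\, N_{,J}\, N^{,IJ}}{\left(N_{,K}\, N^{,K}\right)^{2}} .
```

An observably large f_NL thus requires the second derivatives of N to be significant, which is why slow-roll multi-field models such as N-flation tend to predict nearly Gaussian spectra.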

Relevance:

20.00%

Publisher:

Abstract:

The first quarter of the 20th century witnessed a rebirth of cosmology, the study of our Universe, as a field of scientific research with testable theoretical predictions. The amount of available cosmological data grew slowly from a few galaxy redshift measurements, rotation curves and local light element abundances into the first detection of the cosmic microwave background (CMB) in 1965. By the turn of the century the amount of data had exploded, incorporating new, exciting cosmological observables such as lensing, Lyman alpha forests, type Ia supernovae, baryon acoustic oscillations and Sunyaev-Zeldovich regions, to name a few.

The CMB, the ubiquitous afterglow of the Big Bang, carries with it a wealth of cosmological information. Unfortunately, that information, delicate intensity variations, turned out to be hard to extract from the overall temperature. After the first detection, it took nearly 30 years before the first evidence of fluctuations in the microwave background was presented. At present, high precision cosmology is solidly based on precise measurements of the CMB anisotropy, making it possible to pinpoint cosmological parameters to one-in-a-hundred precision. This progress has made it possible to build and test models of the Universe that differ in how the cosmos evolved within some fraction of the first second after the Big Bang.

This thesis is concerned with high precision CMB observations. It presents three selected topics along a CMB experiment analysis pipeline. Map-making and residual noise estimation are studied using an approach called destriping. The studied approximate methods are invaluable for the large datasets of any modern CMB experiment and will undoubtedly become even more so when the next generation of experiments reaches the operational stage.

We begin with a brief overview of cosmological observations and describe the general relativistic perturbation theory. Next we discuss the map-making problem of a CMB experiment and the characterization of the residual noise present in the maps. Finally, the use of modern cosmological data is presented in the study of an extended cosmological model with correlated isocurvature fluctuations. Currently available data are shown to indicate that future experiments are needed to provide more information on these extra degrees of freedom. Any solid evidence of isocurvature modes would have a considerable impact due to their power in model selection.
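As a hedged illustration of the map-making step mentioned above: a minimal binned least-squares estimator assuming white noise and one pixel per time sample (the thesis uses destriping, which additionally models correlated noise; all numbers here are synthetic).

```python
import numpy as np

# Maximum-likelihood binned map-making for white noise,
#   m_hat = (P^T P)^{-1} P^T d,
# where P is the pointing matrix mapping sky pixels to time samples.
rng = np.random.default_rng(0)

npix, nsamp = 8, 1000
true_map = rng.standard_normal(npix)           # synthetic sky signal
pointing = rng.integers(0, npix, size=nsamp)   # pixel hit by each sample

# Time-ordered data (TOD): signal plus white noise
tod = true_map[pointing] + 0.05 * rng.standard_normal(nsamp)

# P^T d : sum of TOD samples falling in each pixel
rhs = np.bincount(pointing, weights=tod, minlength=npix)
# P^T P : diagonal hit-count matrix for one-pixel-per-sample pointing
hits = np.bincount(pointing, minlength=npix)

map_estimate = rhs / hits  # per-pixel average recovers the sky

print(np.max(np.abs(map_estimate - true_map)))
```

Because P^T P is diagonal here, the estimator reduces to a per-pixel average; destriping generalises this by also fitting low-frequency noise offsets in the TOD.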

Relevance:

20.00%

Publisher:

Abstract:

The acceleration of the universe has been established but not explained. During the past few years precise cosmological experiments have confirmed the standard big bang scenario of a flat universe undergoing an inflationary expansion in its earliest stages, during which the perturbations are generated that eventually form galaxies and other structure in matter, most of which is non-baryonic dark matter. Curiously, the universe has presently entered another period of acceleration. Such a result is inferred from observations of extra-galactic supernovae and is independently supported by the cosmic microwave background radiation and large scale structure data. It seems there is a positive cosmological constant speeding up the universal expansion of space. The vacuum energy density the constant describes should then be about a dozen times the present energy density in visible matter, but particle physics scales are enormously larger than that. This is the cosmological constant problem, perhaps the greatest mystery of contemporary cosmology.

In this thesis we explore alternative agents of the acceleration, generically called dark energy. If some symmetry turns off the vacuum energy, its value is not a problem, but one then needs some dark energy. Such could be a scalar field dynamically evolving in its potential, or some other exotic constituent exhibiting negative pressure. Another option is to assume that gravity at cosmological scales is not well described by general relativity. In a modified theory of gravity one might find the expansion rate increasing in a universe filled by just dark matter and baryons. Such possibilities are taken here under investigation. The main goal is to uncover the observational consequences of different models of dark energy, the emphasis being on their implications for the formation of the large-scale structure of the universe. Possible properties of dark energy are investigated using phenomenological parameterizations, but several specific models are also considered in detail. Difficulties in unifying dark matter and dark energy into a single concept are pointed out. Considerable attention is given to modifications of gravity resulting in second order field equations. It is shown that in a general class of such models the viable ones effectively represent the cosmological constant, while in another class one might find interesting modifications of the standard cosmological scenario still allowed by observations. The thesis consists of seven research papers preceded by an introductory discussion.
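As a hedged illustration of a phenomenological dark-energy parameterization of the kind mentioned above (the widely used Chevallier-Polarski-Linder form w(z) = w0 + wa·z/(1+z); parameter values here are illustrative, not from the thesis):

```python
import numpy as np

# For w(z) = w0 + wa*z/(1+z) the dark-energy density evolves in closed
# form, so the Hubble rate of a flat universe can be written directly.
def hubble_rate(z, h0=70.0, omega_m=0.3, w0=-1.0, wa=0.0):
    """H(z) in km/s/Mpc for a flat universe with CPL dark energy."""
    omega_de = 1.0 - omega_m
    # rho_DE(z) / rho_DE(0) for the CPL equation of state
    de_density = (1 + z) ** (3 * (1 + w0 + wa)) * np.exp(-3 * wa * z / (1 + z))
    return h0 * np.sqrt(omega_m * (1 + z) ** 3 + omega_de * de_density)

# w0 = -1, wa = 0 reduces to a cosmological constant (LambdaCDM):
print(hubble_rate(0.0), hubble_rate(1.0))
```

Fitting (w0, wa) against supernova, CMB and large-scale structure data is one standard way to quantify how far observations allow dark energy to deviate from a pure cosmological constant.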

Relevance:

20.00%

Publisher:

Abstract:

A plethora of indices have been proposed and used to construct dominance hierarchies in a variety of vertebrate and invertebrate societies, although the rationale for choosing a particular index for a particular species is seldom explained. In this study, we analysed and compared three such indices, viz Clutton-Brock et al.'s index (CBI), originally developed for red deer, Cervus elaphus, David's score (DS) originally proposed by the statistician H. A. David and the frequency-based index of dominance (FDI) developed and routinely used by our group for the primitively eusocial wasps Ropalidia marginata and Ropalidia cyathiformis. Dominance ranks attributed by all three indices were strongly and positively correlated for both natural data sets from the wasp colonies and for artificial data sets generated for the purpose. However, the indices differed in their ability to yield unique (untied) ranks in the natural data sets. This appears to be caused by the presence of noninteracting individuals and reversals in the direction of dominance in some of the pairs in the natural data sets. This was confirmed by creating additional artificial data sets with noninteracting individuals and with reversals. Based on the criterion of yielding the largest proportion of unique ranks, we found that FDI is best suited for societies such as the wasps belonging to Ropalidia, DS is best suited for societies with reversals and CBI remains a suitable index for societies such as red deer in which multiple interactions are uncommon. (C) 2009 The Association for the Study of Animal Behaviour. Published by Elsevier Ltd. All rights reserved.
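For reference, a minimal sketch of one of the three indices compared, David's score (DS); the 3-individual win-loss matrix below is hypothetical, not from the wasp data sets:

```python
import numpy as np

# David's score: DS_i = w_i + w2_i - l_i - l2_i, where w_i sums i's
# dyadic win proportions, l_i its loss proportions, and w2/l2 weight
# those by the opponents' own summed proportions.
def davids_score(wins):
    """wins[i, j] = number of interactions that i won against j."""
    wins = np.asarray(wins, dtype=float)
    total = wins + wins.T                      # interactions per dyad
    with np.errstate(invalid="ignore", divide="ignore"):
        p = np.where(total > 0, wins / total, 0.0)  # win proportions P_ij
    w = p.sum(axis=1)       # summed win proportions
    loss = p.sum(axis=0)    # summed loss proportions
    w2 = p @ w              # wins weighted by opponents' w
    l2 = p.T @ loss         # losses weighted by opponents' loss
    return w + w2 - loss - l2

# Hypothetical colony: A mostly beats B and C, B mostly beats C.
scores = davids_score([[0, 5, 4],
                       [1, 0, 6],
                       [0, 2, 0]])
print(scores)  # higher score = higher dominance rank
```

Note that noninteracting dyads simply contribute zero proportions here, which is one reason DS can yield tied ranks in sparse interaction matrices, as the abstract discusses.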

Relevance:

20.00%

Publisher:

Abstract:

Grid connected PWM-VSIs are being increasingly used for applications such as distributed generation (DG), power quality and UPS. Appropriate control strategies for grid synchronisation and line current regulation are required to establish such a grid interconnection and power transfer. Control of three phase VSIs is widely reported in the literature. Conventionally, dq control in the Synchronous Reference Frame (SRF) is employed for both the PLL and line current control, where PI controllers are used to track the DC references. Single phase systems do not have the defined direct (d) and quadrature (q) axis components required for the SRF transformation. The references are thus AC in nature, and hence PI controllers cannot yield zero steady state error. Resonant controllers have the ability to track AC references accurately. In this work, a resonant controller based single phase PLL and a current control technique are employed for tracking the grid frequency and the AC current reference respectively. A single phase full bridge converter is operated as a STATCOM for performance evaluation of the control scheme.
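A hedged sketch of why a resonant controller can track an AC reference with zero steady-state error (gain values here are illustrative, not from the paper): a damped proportional-resonant (PR) term G(s) = Kp + 2·Kr·wc·s / (s² + 2·wc·s + w0²) has a very large gain at the resonant frequency w0 and a small gain elsewhere, the AC analogue of a PI controller's infinite DC gain.

```python
import numpy as np

# Evaluate the magnitude of the PR controller's frequency response.
def pr_gain(freq_hz, kp=1.0, kr=100.0, f0=50.0, fc=0.5):
    """|G(j*2*pi*freq_hz)| for a damped proportional-resonant term."""
    w0 = 2 * np.pi * f0   # resonant (grid) frequency, rad/s
    wc = 2 * np.pi * fc   # damping bandwidth, rad/s
    s = 1j * 2 * np.pi * freq_hz
    return abs(kp + 2 * kr * wc * s / (s**2 + 2 * wc * s + w0**2))

print(pr_gain(50.0))  # very large gain at the 50 Hz grid frequency
print(pr_gain(40.0))  # far smaller gain off-resonance
```

In a closed loop, this near-infinite gain at w0 drives the error at the grid frequency towards zero, just as a PI controller's integrator does for DC references in the SRF.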

Relevance:

20.00%

Publisher:

Abstract:

The present work deals with the ultrasonic wave propagation characteristics of monolayer graphene on a silicon (Si) substrate. An atomistic model of a hybrid lattice, involving the hexagonal lattice of graphene and the surface atoms of the diamond lattice of Si, is developed to identify the carbon-silicon bond stiffness. The properties of this hybrid lattice model are then mapped into a nonlocal continuum framework. The equivalent force constant due to the Si substrate is obtained by minimizing the total potential energy of the system. For this equilibrium configuration, the nonlocal governing equations are derived to analyze the ultrasonic wave dispersion based on spectral analysis. From the present analysis we show that the silicon substrate affects only the flexural wave mode. The frequency band gap of the flexural mode is also significantly affected by the substrate. The results also show that the silicon substrate adds a cushioning effect to the graphene and makes it more stable. The analysis further shows that the frequency band gap relations of the in-plane (longitudinal and lateral) and out-of-plane (flexural) wave modes depend not only on the y-direction wavenumber but also on the nonlocal scaling parameter. In the nonlocal analysis, at higher values of the y-directional wavenumber, a decrease in the frequency band gap is observed for all three fundamental wave modes in the graphene-silicon system. The movement of the atoms in the graphene due to the wave propagation is also captured for all three fundamental wave modes. The results presented in this work are qualitatively different from those obtained from the local analysis and are thus important for the development of graphene based nanodevices, such as strain sensors, mass and pressure sensors, atomic dust detectors and enhancers of surface image resolution, that make use of the ultrasonic wave dispersion properties of graphene. (C) 2011 Elsevier Ltd. All rights reserved.
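As a hedged illustration of how a nonlocal scaling parameter softens dispersion (a generic Eringen-type nonlocal flexural relation with illustrative parameters, not the paper's hybrid-lattice model):

```python
import numpy as np

# In Eringen's nonlocal elasticity, the flexural dispersion of a
# beam-like sheet is divided by sqrt(1 + (e0*a*k)^2), so frequencies at
# high wavenumber k fall below the local (e0*a = 0) prediction.
def flexural_omega(k, bending_rigidity=1.0, mass_per_length=1.0, e0a=0.0):
    """omega(k) = sqrt(EI/(rho*A)) * k^2 / sqrt(1 + (e0*a*k)^2)."""
    return (np.sqrt(bending_rigidity / mass_per_length) * k**2
            / np.sqrt(1.0 + (e0a * k) ** 2))

k = np.linspace(0.1, 5.0, 50)
local = flexural_omega(k)                # classical continuum result
nonlocal_ = flexural_omega(k, e0a=0.5)   # nonlocal, softened at high k

print(local[-1], nonlocal_[-1])
```

The widening gap between the two curves at large k mirrors the paper's observation that nonlocal results differ qualitatively from the local analysis precisely in the ultrasonic (high-wavenumber) regime.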

Relevance:

20.00%

Publisher:

Abstract:

Optimal preventive maintenance policies, for a machine subject to deterioration with age and intermittent breakdowns and repairs, are derived using optimal control theory. The optimal policies are shown to be of a bang-bang nature. The extension to the case in which there are a large number of identical machines and several repairmen in the system is considered next. This model takes into account the waiting line formed at the repair facility and establishes a link between this problem and the classical ``repairmen problem.''
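A minimal sketch of what "bang-bang" means here (generic optimal control reasoning, not the paper's derivation): when the Hamiltonian is linear in a bounded control, the minimizing control jumps between its extreme values according to the sign of the switching function, i.e. the coefficient of the control in the Hamiltonian. Names below are hypothetical.

```python
# Minimize H = (terms independent of u) + switching_value * u
# over an admissible control u in [u_min, u_max].
def bang_bang_control(switching_value, u_min=0.0, u_max=1.0):
    """Optimal control for a Hamiltonian linear in u."""
    if switching_value > 0:
        return u_min   # positive coefficient: push u to its lower bound
    if switching_value < 0:
        return u_max   # negative coefficient: push u to its upper bound
    return u_min       # singular point: any admissible u is optimal

# E.g. maximal preventive maintenance while the switching function is
# negative, none once it changes sign:
print([bang_bang_control(s) for s in (-2.0, -0.5, 0.3, 1.0)])
# -> [1.0, 1.0, 0.0, 0.0]
```

The interior (singular) case is degenerate; outside it the policy never takes intermediate values, which is exactly the structure the abstract describes.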