149 results for "decode and forward"


Relevance: 30.00%

Publisher:

Abstract:

We discuss the modelling of dielectric responses of amorphous biological samples. Such samples are commonly encountered in impedance spectroscopy studies as well as in UV, IR, optical and THz transient spectroscopy experiments and in pump-probe studies. On many occasions, the samples may display quenched absorption bands. A systems identification framework may be developed to provide parsimonious representations of such responses. To achieve this, it is appropriate to augment the standard models found in the identification literature to incorporate fractional order dynamics. Extensions of models using the forward shift operator, state-space models, as well as their non-linear Hammerstein-Wiener counterparts, are highlighted. We also discuss the need to extend the theory of electromagnetically excited networks so that it can account for fractional order behaviour in the non-linear regime, by incorporating non-linear elements that capture the observed non-linearities. The proposed approach leads to the development of a range of new chemometrics tools for biomedical data analysis and classification.
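
As a minimal illustration of the fractional-order dielectric dynamics referred to above (a sketch of the standard Cole-Cole relaxation model, not the authors' identification framework, and with purely hypothetical parameter values):

```python
import numpy as np

def cole_cole(omega, eps_inf=3.0, delta_eps=70.0, tau=8.3e-12, alpha=0.1):
    """Cole-Cole relaxation: a fractional-order model of dielectric response.

    eps*(omega) = eps_inf + delta_eps / (1 + (i*omega*tau)**(1 - alpha));
    alpha = 0 recovers ordinary (integer-order) Debye relaxation.
    All parameter values here are illustrative, not fitted to any sample.
    """
    return eps_inf + delta_eps / (1.0 + (1j * omega * tau) ** (1.0 - alpha))

# complex permittivity over a broad range (impedance-spectroscopy to THz regimes)
freq = np.logspace(6, 13, 400)                # Hz
eps = cole_cole(2.0 * np.pi * freq)
eps_storage = eps.real                        # dielectric storage component, eps'
eps_loss = -eps.imag                          # dielectric loss component, eps'' (positive by convention)
```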

Relevance: 30.00%

Publisher:

Abstract:

This study investigates the logics or values that shape the social and environmental reporting (SER) and SER assurance (SERA) process. The influence of logics is observed through a study of the conceptualisation and operationalisation of the materiality concept by accounting and non-accounting assurors and their assurance statements. We gathered qualitative data from interviews with both accounting and non-accounting assurors. We analysed the interplay between old and new logics that are shaping materiality as a reporting concept in SER. SER is a rich field in which to study the dynamics of change because it is a voluntary, unregulated, qualitative reporting arena. It has a broad stakeholder audience, in which accounting and non-accounting organisations are in competition. There are three key findings. First, the introduction of a new, stakeholder logic has significantly changed the meaning and role of materiality. Second, a more versatile, performative, social understanding of materiality was portrayed by assurors, with a forward-looking rather than a historic focus. Third, competing logics have encouraged different beliefs about materiality, and different practices, to develop. This influenced the way assurors theorised the concept and interpreted outcomes. A patchwork of localised understandings of materiality is developing. Policy implications for both SERA and financial audit are explored.

Relevance: 30.00%

Publisher:

Abstract:

This article proposes a systematic approach to determining the most suitable analogue redesign method for forward-type converters under digital voltage mode control. The focus of the method is to achieve the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at crossover frequencies that are high relative to the switching frequency, controllers designed using backward integration have the largest phase margin, whereas at crossover frequencies that are low relative to the switching frequency, controllers designed using bilinear integration with pre-warping have the largest phase margin. An algorithm has been developed to determine the frequency of the crossing point at which the recommended discretisation method changes. An accurate model of the power stage is used for simulation, and experimental results are collected from a buck converter. The performance of the digital controllers is compared to that of the equivalent analogue controller in both simulation and experiment. Excellent agreement between the simulation and experimental results is demonstrated. This work provides a concrete example that allows academics and engineers to systematically choose a discretisation method.
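
To make the comparison concrete, the sketch below evaluates the phase of an ideal integrator 1/s after discretisation by backward (Euler) integration and by bilinear (Tustin) integration with pre-warping, at an assumed switching and crossover frequency. The frequencies, and the restriction to a bare integrator rather than the article's converter and controller, are illustrative assumptions.

```python
import numpy as np

f_sw = 100e3               # assumed switching frequency (Hz)
f_c = 20e3                 # assumed crossover frequency (Hz)
T = 1.0 / f_sw             # sampling at the switching frequency
w_c = 2.0 * np.pi * f_c
z = np.exp(1j * w_c * T)   # evaluate on the unit circle at the crossover frequency

# discrete equivalents of the integrator 1/s
h_backward = T * z / (z - 1.0)                  # backward (Euler) integration
k_pw = w_c / np.tan(w_c * T / 2.0)              # pre-warping gain so that w_c maps exactly
h_tustin_pw = (z + 1.0) / (k_pw * (z - 1.0))    # bilinear (Tustin) with pre-warping

for name, h in [("backward", h_backward), ("tustin + pre-warp", h_tustin_pw)]:
    phase = np.degrees(np.angle(h))
    lag = -90.0 - phase      # excess lag relative to the ideal -90 deg (negative = phase lead)
    print(f"{name:18s} phase {phase:7.2f} deg, excess lag {lag:6.2f} deg")
```

At this relatively high crossover-to-switching ratio the backward-integration equivalent shows a phase lead over the ideal integrator, while the pre-warped bilinear equivalent reproduces the ideal phase exactly, which is the kind of trade-off the article quantifies.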

Relevance: 30.00%

Publisher:

Abstract:

An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each of the RBF kernels has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each of which is associated with a kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since the same LOOMSE is adopted for model selection as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the proposed new OFR algorithm is also capable of producing a very sparse RBF model with excellent generalization performance. Unlike our previous LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as an additional procedure to optimize the kernel width, the proposed new OFR algorithm optimizes both the kernel widths and regularization parameters within the single OFR procedure, and consequently the required computational complexity is dramatically reduced. Nonlinear system identification examples are included to demonstrate the effectiveness of this new approach in comparison to the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
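
The following is a naive sketch of the general idea of forward selection of RBF terms driven by the leave-one-out error, computed here through the ridge-regression hat-matrix (PRESS) identity. It is not the authors' efficient orthogonal-decomposition implementation, and the candidate width and regularization grids are assumptions.

```python
import numpy as np

def rbf_column(X, centre, width):
    """Single Gaussian RBF regressor evaluated at all inputs."""
    return np.exp(-np.sum((X - centre) ** 2, axis=1) / (2.0 * width ** 2))

def loo_mse(Phi, y, lam):
    """Leave-one-out MSE of ridge regression via the PRESS/hat-matrix identity."""
    A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
    H = Phi @ np.linalg.solve(A, Phi.T)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

def forward_select(X, y, widths=(0.2, 0.5, 1.0), lams=(1e-4, 1e-2), max_terms=20):
    """Greedily add (centre, width, lambda) triples while the LOO criterion keeps improving."""
    chosen, cols = [], []
    best = np.mean((y - y.mean()) ** 2)              # variance of y as baseline error
    for _ in range(max_terms):
        trial_best = None
        for i in range(len(X)):                      # candidate centres = training points
            for w in widths:
                phi = rbf_column(X, X[i], w)
                Phi = np.column_stack(cols + [phi])
                for lam in lams:
                    err = loo_mse(Phi, y, lam)
                    if trial_best is None or err < trial_best[0]:
                        trial_best = (err, i, w, lam, phi)
        if trial_best is None or trial_best[0] >= best:
            break                                    # no candidate improves the LOO error
        best = trial_best[0]
        chosen.append(trial_best[1:4])               # (centre index, width, lambda)
        cols.append(trial_best[4])
    return chosen, best
```

This brute-force version recomputes the hat matrix for every candidate, which is exactly the cost the orthogonal decomposition in the paper is designed to avoid.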

Relevance: 30.00%

Publisher:

Abstract:

This article critically reflects on the widely held view of a causal chain in which trust in public authorities impacts technology acceptance via perceived risk. It first puts forward a conceptual reason against this view, as the presence of risk is a precondition for trust to play a role in decision making. Second, results from consumer surveys in Italy and Germany are presented that support the associationist model as a counter-hypothesis. In that view, trust and risk judgments are driven by, and are thus simply indicators of, higher-order attitudes toward a given technology, which determine acceptance instead. The implications of these findings are discussed.

Relevance: 30.00%

Publisher:

Abstract:

A method is proposed for merging different nadir-sounding climate data records using measurements from high-resolution limb sounders to provide a transfer function between the different nadir measurements. The two nadir-sounding records need not be overlapping so long as the limb-sounding record bridges between them. The method is applied to global-mean stratospheric temperatures from the NOAA Climate Data Records based on the Stratospheric Sounding Unit (SSU) and the Advanced Microwave Sounding Unit-A (AMSU), extending the SSU record forward in time to yield a continuous data set from 1979 to present, and providing a simple framework for extending the SSU record into the future using AMSU. SSU and AMSU are bridged using temperature measurements from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), which is of high enough vertical resolution to accurately represent the weighting functions of both SSU and AMSU. For this application, a purely statistical approach is not viable since the different nadir channels are not sufficiently linearly independent, statistically speaking. The near-global-mean linear temperature trends for extended SSU for 1980–2012 are −0.63 ± 0.13, −0.71 ± 0.15 and −0.80 ± 0.17 K decade⁻¹ (95 % confidence) for channels 1, 2 and 3, respectively. The extended SSU temperature changes are in good agreement with those from the Microwave Limb Sounder (MLS) on the Aura satellite, with both exhibiting a cooling trend of ~0.6 ± 0.3 K decade⁻¹ in the upper stratosphere from 2004 to 2012. The extended SSU record is found to be in agreement with high-top coupled atmosphere–ocean models over the 1980–2012 period, including the continued cooling over the first decade of the 21st century.
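
One possible reading of the bridging step, sketched below with entirely placeholder data and weighting functions: high-resolution profiles are converted into synthetic SSU- and AMSU-equivalent channel temperatures via each instrument's weighting function, and a simple linear transfer between the two is then fitted. This is an illustrative assumption about the procedure, not the processing actually used for the NOAA records.

```python
import numpy as np

rng = np.random.default_rng(1)

# placeholder data: MIPAS-like temperature profiles on 40 levels (hypothetical values)
n_levels, n_overlap = 40, 500
mipas_profiles = 230.0 + 10.0 * rng.standard_normal((n_overlap, n_levels))

# assumed weighting functions of one SSU and one AMSU channel on the same levels
levels = np.arange(n_levels)
w_ssu = np.exp(-0.5 * ((levels - 22) / 6.0) ** 2)
w_amsu = np.exp(-0.5 * ((levels - 25) / 5.0) ** 2)

def synthetic_channel(profiles, weights):
    """Weighting-function-weighted mean temperature as a proxy channel brightness temperature."""
    w = weights / weights.sum()
    return profiles @ w

# synthetic SSU- and AMSU-equivalent temperatures from the same high-resolution profiles
t_ssu_syn = synthetic_channel(mipas_profiles, w_ssu)
t_amsu_syn = synthetic_channel(mipas_profiles, w_amsu)

# simple linear transfer function between the two channel records, fitted over the overlap
a, b = np.polyfit(t_amsu_syn, t_ssu_syn, 1)

# the transfer can then be applied to an AMSU time series to produce SSU-equivalent values
amsu_record = 230.0 + rng.standard_normal(120)      # placeholder AMSU record
ssu_equivalent = a * amsu_record + b
```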

Relevance: 30.00%

Publisher:

Abstract:

This article is concerned with the liability of search engines for algorithmically produced search suggestions, such as through Google’s ‘autocomplete’ function. Liability in this context may arise when automatically generated associations have an offensive or defamatory meaning, or may even induce infringement of intellectual property rights. The increasing number of cases that have been brought before courts all over the world puts forward questions on the conflict between the fundamental freedoms of speech and access to information, on the one hand, and the personality rights of individuals, under a broader right of informational self-determination, on the other. In the light of the recent judgment of the Court of Justice of the European Union (EU) in Google Spain v AEPD, this article concludes that many requests for removal of suggestions including private individuals’ information will be successful on the basis of EU data protection law, even absent prejudice to the person concerned.

Relevance: 30.00%

Publisher:

Abstract:

This article discusses some of the issues relating to the promotion of Core Maths to students, parents, schools and colleges and their senior leadership teams, and also to employers and higher education (HE). Some challenges are highlighted, and addressed, with suggestions for ways forward to secure the future of Core Maths and widespread adoption by all stakeholders. A summary of the background reports that led to the introduction of Core Maths, and the related educational landscape prior to its introduction, is included.

Relevance: 30.00%

Publisher:

Abstract:

Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies of conservation and of the exploitation of fisheries, and for assessing the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles together with the laws of conservation of matter and energy. To complete construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; the maximum rates of ingestion, growth and reproduction; and life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure. A common solution is to manually compare model outputs with observations from real landscapes and so to obtain parameters which produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
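
To make the ABC idea concrete, here is a minimal rejection-sampling sketch: parameters are drawn from the prior, a stand-in model is simulated, and draws are kept only when summary statistics fall close enough to the observed ones. The toy "IBM" (a stochastic logistic population model), the summary statistics, the prior ranges and the tolerance are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_population(growth_rate, capacity, n0=10, years=30):
    """Toy stand-in for an IBM: stochastic logistic growth of a population."""
    n = np.empty(years)
    n[0] = n0
    for t in range(1, years):
        expected = n[t - 1] + growth_rate * n[t - 1] * (1.0 - n[t - 1] / capacity)
        n[t] = rng.poisson(max(expected, 0.0))
    return n

def summarise(series):
    """Summary statistics compared between simulated and observed data."""
    return np.array([series.mean(), series.std(), series[-1]])

def abc_rejection(observed, n_draws=20000, tol=15.0):
    """Keep prior draws whose simulated summaries lie within tol of the observed ones."""
    s_obs = summarise(observed)
    accepted = []
    for _ in range(n_draws):
        theta = (rng.uniform(0.1, 1.0), rng.uniform(50.0, 500.0))   # prior on (r, K)
        s_sim = summarise(simulate_population(*theta))
        if np.linalg.norm(s_sim - s_obs) < tol:
            accepted.append(theta)
    return np.array(accepted)       # samples from the approximate posterior

# usage with synthetic "observed" data generated from known parameters (r = 0.4, K = 200)
observed = simulate_population(0.4, 200.0)
posterior = abc_rejection(observed)
if len(posterior):
    print(len(posterior), "accepted draws; posterior means (r, K):", posterior.mean(axis=0))
```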

Relevance: 30.00%

Publisher:

Abstract:

In order to increase overall transparency on key operational information, power transmission system operators publish an increasing amount of fundamental data, including forecasts of electricity demand and available capacity. We employ a fundamental model for electricity prices which lends itself well to integrating such forecasts, while retaining ease of implementation and tractability to allow for analytic derivatives pricing formulae. In an extensive futures pricing study, the pricing performance of our model is shown to improve further with the inclusion of electricity demand and capacity forecasts, thus confirming the general importance of forward-looking information for electricity derivatives pricing. However, we also find that the usefulness of integrating forecast data into the pricing approach is primarily limited to those periods during which electricity prices are highly sensitive to demand or available capacity, whereas the impact is less visible when fuel prices are the primary underlying driver of prices.
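
As a stylised illustration of how demand and capacity forecasts can enter a fundamental price model, the sketch below drives the spot price with a fuel cost scaled by an exponential scarcity term; the functional form, parameter values and variable names are assumptions, not the model used in the study.

```python
import numpy as np

def fundamental_price(fuel_price, demand_forecast, capacity_forecast, a=-1.0, b=4.0):
    """Stylised fundamental spot price: fuel cost times an exponential scarcity premium.

    Scarcity = forecast demand / forecast available capacity. When the system is tight
    (scarcity close to 1) the premium dominates; otherwise the fuel cost dominates.
    All parameters are illustrative.
    """
    scarcity = np.asarray(demand_forecast) / np.asarray(capacity_forecast)
    return np.asarray(fuel_price) * np.exp(a + b * scarcity)

# example: the price reacts strongly to the capacity forecast only when the margin is small
print(fundamental_price(30.0, demand_forecast=60.0, capacity_forecast=90.0))   # comfortable margin
print(fundamental_price(30.0, demand_forecast=85.0, capacity_forecast=90.0))   # tight system
```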

Relevance: 30.00%

Publisher:

Abstract:

More than two decades have passed since the fall of the Berlin Wall and the transfer of the Cold War file from a daily preoccupation of policy makers to a more detached assessment by historians. Scholars of U.S.-Latin American relations are beginning to take advantage both of the distance in time and of newly opened archives to reflect on the four decades that, from the 1940s to the 1980s, divided the Americas, as they did much of the world. Others are seeking to understand U.S. policy and inter-American relations in the post-Cold War era, a period that not only lacks a clear definition but also still has no name. Still others have turned their gaze forward to offer policies in regard to the region for the new Obama administration. Numerous books and review essays have addressed these three subjects—the Cold War, the post-Cold War era, and current and future issues on the inter-American agenda. Few of these studies attempt, however, to connect the three subjects or to offer new and comprehensive theories to explain the course of U.S. policies from the beginning of the twentieth century until the present. Indeed, some works and policy makers continue to use the mind-sets of the Cold War as though that conflict were still being fought. With the benefit of newly opened archives, some scholars have nevertheless drawn insights from the depths of the Cold War that improve our understanding of U.S. policies and inter-American relations, but they do not address the question as to whether the United States has escaped the longer cycle of intervention followed by neglect that has characterized its relations with Latin America. Another question is whether U.S. policies differ markedly before, during, and after the Cold War. In what follows, we ask whether the books reviewed here provide any insights in this regard and whether they offer a compass for the future of inter-American relations. We also offer our own thoughts as to how their various perspectives could be synthesized to address these questions more comprehensively.

Relevance: 30.00%

Publisher:

Abstract:

Since the first reported case of HIV infection in Hong Kong in 1985, only two HIV-positive individuals in the territory have voluntarily made public their seropositivity: a British dentist named Mike Sinclair, who disclosed his condition to the media in 1992 and died in 1995, and J.J. Chan, a local Chinese disc jockey, who came forward in 1995 and died just a few months later. When they made their revelations, both became instant media personalities and were invited by the Hong Kong Government to act as spokespeople for AIDS awareness and prevention. Mike Sinclair worked as an education officer for the Hong Kong AIDS Foundation, and J.J. Chan appeared in Government television commercials about AIDS. This article explores how the public identities of these two figures were constructed in the cultural context of Hong Kong, where both Eastern and Western values exist side by side and interact. It argues that the construction of 'AIDS celebrities' is a kind of 'identity project' negotiated among the players involved: the media, the Government, the public, and the person with AIDS (PWA) himself, each bringing to the construction their own 'theories' regarding the self and communication. When the players in the construction hold shared assumptions about the nature of the self and the role of communication in enacting it, harmonious discourses arise, but when cultural models among the players differ, contradictory or ambiguous constructions result. The effect of culture on the way 'AIDS celebrities' are constructed has implications for the way societies view the issue of AIDS and treat those who have it. It also helps reveal possible sites of difficulty when individuals of different cultures communicate about the issue.

Relevance: 30.00%

Publisher:

Abstract:

Although they seek to understand processes in the same spatial domain, the catchment hydrology and water quality scientific communities are relatively disconnected, and so are their respective models. This is emphasized by an inadequate representation of transport processes in both catchment-scale hydrological and water quality models. While many hydrological models at the catchment scale only account for pressure propagation and not for mass transfer, catchment-scale water quality models are typically limited by overly simplistic representations of flow processes. With the objective of raising awareness of this issue and outlining potential ways forward, we provide a non-technical overview of (1) the importance of hydrology-controlled transport through catchment systems as the link between hydrology and water quality; (2) the limitations of the current generation of catchment-scale hydrological and water quality models; (3) the concept of transit times as a tool to quantify transport; and (4) the benefits of transit-time-based formulations of solute transport for catchment-scale hydrological and water quality models. There is emerging evidence that an explicit formulation of transport processes based on the concept of transit times has the potential to improve the understanding of the integrated system dynamics of catchments and to provide a stronger link between catchment-scale hydrological and water quality models.
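
The transit-time formulation mentioned above can be written as a convolution of the input concentration signal with a transit time distribution. A minimal sketch, using an assumed exponential transit time distribution and a hypothetical seasonal input signal:

```python
import numpy as np

dt = 1.0                                    # time step (days)
t = np.arange(0.0, 5 * 365.0, dt)

mean_transit_time = 120.0                   # assumed mean transit time (days)
ttd = np.exp(-t / mean_transit_time) / mean_transit_time   # exponential transit time distribution

# hypothetical input concentration: seasonal signal of a conservative tracer entering the catchment
c_in = 1.0 + 0.5 * np.sin(2.0 * np.pi * t / 365.0)

# stream (output) concentration as the convolution of the input with the transit time distribution
c_out = np.convolve(c_in, ttd)[: len(t)] * dt

# the output signal is damped and lagged relative to the input, reflecting storage and mixing
print(c_in.std(), c_out[len(t) // 2:].std())
```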

Relevance: 30.00%

Publisher:

Abstract:

Spectroscopic catalogues, such as GEISA and HITRAN, do not yet include information on the water vapour continuum that pervades the visible, infrared and microwave spectral regions. This is partly because, in some spectral regions, there are rather few laboratory measurements in conditions close to those in the Earth’s atmosphere; hence understanding of the characteristics of the continuum absorption is still emerging. This is particularly so in the near-infrared and visible, where there has been renewed interest and activity in recent years. In this paper we present a critical review focusing on recent laboratory measurements in two near-infrared window regions (centred on 4700 and 6300 cm⁻¹) and include reference to the window centred on 2600 cm⁻¹, where more measurements have been reported. The rather few available measurements have used Fourier transform spectroscopy (FTS), cavity ring-down spectroscopy, optical-feedback cavity-enhanced laser spectroscopy and, in very narrow regions, calorimetric interferometry. These systems have different advantages and disadvantages. Fourier transform spectroscopy can measure the continuum across both these and neighbouring windows; by contrast, the cavity laser techniques are limited to fewer wavenumbers, but have a much higher inherent sensitivity. The available results present a diverse view of the characteristics of continuum absorption, with differences in continuum strength exceeding a factor of 10 in the cores of these windows. In individual windows, the temperature dependence of the water vapour self-continuum differs significantly in the few sets of measurements that allow an analysis. The available data also indicate that the temperature dependence differs significantly between different near-infrared windows. These pioneering measurements provide an impetus for further measurements. Improvements and/or extensions of existing techniques would aid progress towards a full characterisation of the continuum; as an example, we report pilot measurements of the water vapour self-continuum using a supercontinuum laser source coupled to an FTS. Such improvements, as well as additional measurements and analyses in other laboratories, would enable the inclusion of the water vapour continuum in future spectroscopic databases, and therefore allow for more reliable forward modelling of the radiative properties of the atmosphere. It would also allow a more confident assessment of different theoretical descriptions of the underlying cause or causes of continuum absorption.