Abstract:
The main objective of food safety policy is to safeguard consumer health through specific safety rules and protocols. To meet the requirements of food safety and quality standardisation, in 2002 the European Parliament and the Council of the EU (Regulation (EC) 178/2002 (EC, 2002)) sought to harmonise concepts, principles, and procedures so as to provide a common basis for the regulation of food and feed originating in the Member States at Community level. The formalisation of rules and standardisation protocols should, however, proceed through a more detailed and accurate understanding and harmonisation of the global (macroscopic), pseudo-local (mesoscopic), and possibly local (microscopic) properties of food products. The main objective of this doctoral thesis is to illustrate how computational techniques can provide valuable support for such analysis, through (i) the application of established protocols and (ii) the improvement of widely applied techniques. A direct demonstration of the potential already offered by computational approaches is given in the first study, in which docking-based virtual screening was applied to assess the preliminary xeno-androgenicity of certain food contaminants. The second and third studies concern the development and validation of new physico-chemical descriptors in a 3D-QSAR context. Named HyPhar (Hydrophobic Pharmacophore), the new methodology thus developed was used to explore the issue of selectivity between structurally related molecular targets, and thereby demonstrated the applicability and adaptability required in a food context. Overall, the results allow us to be confident in the potential impact that in silico techniques may have on the identification and clarification of molecular events involved in the toxicological and nutritional aspects of food.
Abstract:
Over a number of years, as the Higher Education Funding Council for England (HEFCE)'s funding models became more transparent, Aston University was able to discover how its funding for teaching and research was calculated. This enabled the funds earned by each school in the University to be calculated, and Aston Business School (ABS) in turn developed models to calculate the funds earned by its programmes and academic groups. These models were a 'load' model and a 'contribution' model. The 'load' model records the weighting of activities undertaken by individual members of staff; the 'contribution' model is the means by which funds are allocated to academic units. The 'contribution' model is informed by the 'load' model in determining the volume of activity for which each academic unit is to be funded.
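As a rough illustration of how such a pair of models can fit together, the sketch below aggregates per-staff activity weightings (the 'load') and allocates a funding pot in proportion to each unit's share (the 'contribution'). The record layout, weightings, and figures are invented for illustration; they are not the actual ABS models.

```python
# 'Load' model: hypothetical (staff member, academic unit, activity, weighting)
# records; 'contribution' model: split a pot by each unit's share of total load.
from collections import defaultdict

load_records = [
    ("staff_a", "unit_1", "teaching", 0.6),
    ("staff_a", "unit_1", "research", 0.4),
    ("staff_b", "unit_2", "teaching", 1.0),
    ("staff_c", "unit_2", "research", 0.8),
]

def allocate(pot: float, records) -> dict:
    """Allocate a funding pot in proportion to each unit's recorded load."""
    unit_load = defaultdict(float)
    for _, unit, _, weight in records:
        unit_load[unit] += weight
    total = sum(unit_load.values())
    return {unit: pot * load / total for unit, load in unit_load.items()}

print(allocate(1_000_000, load_records))
# {'unit_1': 357142.85..., 'unit_2': 642857.14...}
```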
Abstract:
The MISLEM Project comprised representatives from Higher and Vocational Education in four partner countries: Austria, Romania, Slovenia and the UK. In addition, representatives from a major UK graduate employment agency and the Austria Quality Assurance Agency were involved. At the inaugural meeting of the Project, the partner teams discussed and agreed upon appropriate methodological processes with which to carry the Project forward.
Abstract:
Linear models reach their limitations in applications with nonlinearities in the data. In this paper new empirical evidence is provided on the relative Euro inflation forecasting performance of linear and non-linear models. The well-established and widely used univariate ARIMA and multivariate VAR models are used as linear forecasting models, whereas neural networks (NN) are used as non-linear forecasting models. The level of subjectivity in the NN building process is kept to a minimum in an attempt to exploit the full potential of the NN. It is also investigated whether the historically poor performance of the theoretically superior measure of the monetary services flow, Divisia, relative to the traditional Simple Sum measure could be attributed to a certain extent to the evaluation of these indices within a linear framework. The results suggest that non-linear models provide better within-sample and out-of-sample forecasts, and that linear models are simply a subset of them. The Divisia index also outperforms the Simple Sum index when evaluated in a non-linear framework. © 2005 Taylor & Francis Group Ltd.
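To make the linear/non-linear contrast concrete, here is a minimal sketch comparing a univariate ARIMA benchmark with a small feed-forward neural network on a simulated nonlinear series. The paper's Euro inflation and Divisia/Simple Sum data, and its model-building protocol, are not reproduced; everything below is an illustrative assumption.

```python
# A sketch only: simulated data, default hyperparameters, and a simple
# train/test split stand in for the paper's evaluation design.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(300)
# Toy "inflation" series with a nonlinearity a linear model cannot capture.
y = 2 + 0.5 * np.sin(t / 12) ** 2 + 0.1 * rng.standard_normal(300)
train, test = y[:250], y[250:]

# Linear benchmark: univariate ARIMA(1,0,1), multi-step forecast.
arima_fc = ARIMA(train, order=(1, 0, 1)).fit().forecast(steps=len(test))

# Non-linear model: small feed-forward NN on p lagged values.
# (Note: the NN predicts one step ahead from observed lags, so this
# illustrates the two model classes rather than a fair horse race.)
p = 12
X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
target = y[p:]
nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
nn_fc = nn.fit(X[:250 - p], target[:250 - p]).predict(X[250 - p:])

for name, fc in [("ARIMA", arima_fc), ("NN", nn_fc)]:
    print(name, "out-of-sample RMSE:", np.sqrt(np.mean((test - fc) ** 2)))
```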
Abstract:
This paper complements the preceding one by Clarke et al., which looked at the long-term impact of retail restructuring on consumer choice at the local level. Whereas the previous paper was based on quantitative evidence from survey research, this paper draws on the qualitative phases of the same three-year study, and in it we aim to understand how the changing forms of retail provision are experienced at the neighbourhood and household level. The empirical material is drawn from focus groups, accompanied shopping trips, diaries, interviews, and kitchen visits with eight households in two contrasting neighbourhoods in the Portsmouth area. The data demonstrate that consumer choice involves judgments of taste, quality, and value as well as more ‘objective’ questions of convenience, price, and accessibility. These judgments are related to households’ differential levels of cultural capital and involve ethical and moral considerations as well as more mundane considerations of practical utility. Our evidence suggests that many of the terms that are conventionally advanced as explanations of consumer choice (such as ‘convenience’, ‘value’, and ‘habit’) have very different meanings according to different household circumstances. To understand these meanings requires us to relate consumers’ at-store behaviour to the domestic context in which their consumption choices are embedded. Bringing theories of practice to bear on the nature of consumer choice, our research demonstrates that consumer choice between stores can be understood in terms of accessibility and convenience, whereas choice within stores involves notions of value, price, and quality. We also demonstrate that choice between and within stores is strongly mediated by consumers’ household contexts, reflecting the extent to which shopping practices are embedded within consumers’ domestic routines and complex everyday lives. The paper concludes with a summary of the overall findings of the project, and with a discussion of the practical and theoretical implications of the study.
Abstract:
This paper examines the extent to which foreign investment in the UK generates wage spillovers in the domestic sector of the economy using a simultaneous dynamic panel data model and focusing on the electronics sector, possibly the most ‘globalized’ sector of UK manufacturing. It finds evidence that the higher wages paid by foreign firms cause wages in the domestic sector to be bid up. This phenomenon is, however, largely confined to the region where foreign direct investment takes place.
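The following is a deliberately simplified stand-in for the paper's estimator, on simulated data: a fixed-effects panel regression of domestic wages on lagged wages and a regional foreign-presence variable. The paper itself uses a simultaneous dynamic panel model; all variable names and coefficients below are hypothetical.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
firms, years = 50, 10
df = pd.DataFrame({
    "firm": np.repeat(np.arange(firms), years),
    "year": np.tile(np.arange(years), firms),
})
df["foreign_share"] = rng.uniform(0, 0.5, len(df))  # hypothetical FDI presence
firm_fe = rng.normal(0, 0.2, firms)[df["firm"]]     # unobserved firm effects
df["log_wage"] = (2.0 + 0.3 * df["foreign_share"] + firm_fe
                  + 0.05 * rng.standard_normal(len(df)))
df["lag_wage"] = df.groupby("firm")["log_wage"].shift(1)
df = df.dropna()

# Within (fixed-effects) transformation, then OLS. A lagged dependent
# variable makes this biased in short panels (Nickell bias), which is one
# reason the paper uses a dynamic panel estimator; this is illustration only.
cols = ["log_wage", "lag_wage", "foreign_share"]
demeaned = df[cols] - df.groupby("firm")[cols].transform("mean")
X = demeaned[["lag_wage", "foreign_share"]].to_numpy()
beta, *_ = np.linalg.lstsq(X, demeaned["log_wage"].to_numpy(), rcond=None)
print(dict(zip(["lag_wage", "foreign_share"], beta.round(3))))
```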
Abstract:
This paper examines the impact of innovation on the performance of US business service firms. We distinguish between different levels of innovation (new-to-market and new-to-firm) in our analysis, and allow explicitly for sample selection issues. Reflecting the literature, which highlights the importance of external interaction in service innovation, we pay particular attention to the role of external innovation linkages and their effect on business performance. We find that both the presence and the extent of service innovation have a consistently positive effect on growth, but no effect on productivity. There is evidence that the growth effect of innovation can be attributed, at least in part, to the external linkages maintained by innovators in the process of innovation. External linkages have an overwhelmingly positive effect on (innovator) firm performance, regardless of whether innovation is measured as a discrete or continuous variable, and regardless of the level of innovation considered.
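One standard way to "allow explicitly for sample selection issues" is Heckman's two-step correction; the sketch below illustrates the probit plus inverse-Mills-ratio mechanics on simulated data. It is an assumption-laden illustration, not the paper's actual specification.

```python
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(2)
n = 2000
z = rng.standard_normal(n)                     # selection-only instrument
x = rng.standard_normal(n)                     # e.g. innovation intensity
# Correlated errors (rho = 0.5) are what create the selection problem.
u = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], n)
selected = (0.5 + z + 0.8 * x + u[:, 0]) > 0   # firm observed only if selected
growth = 1.0 + 0.4 * x + u[:, 1]               # outcome, true slope 0.4

# Step 1: probit for selection, then the inverse Mills ratio.
W = sm.add_constant(np.column_stack([z, x]))
probit = sm.Probit(selected.astype(int), W).fit(disp=0)
xb = W @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: outcome regression on the selected sample, with and without the IMR.
naive = sm.OLS(growth[selected], sm.add_constant(x)[selected]).fit()
corrected = sm.OLS(growth[selected],
                   sm.add_constant(np.column_stack([x, imr]))[selected]).fit()
print("naive slope:    ", naive.params[1])      # biased away from 0.4
print("corrected slope:", corrected.params[1])  # ~0.4
```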
Abstract:
This thesis initiates the discussion on corporate social responsibility (CSR) through a review of the literature on the conceptualisation, determinants, and rewards of organisational CSR engagement. The case is made for the need to draw attention to the micro-levels of CSR, and consequently to focus on employee social responsibility at multiple levels of analysis. In order to further research efforts in this area, the prerequisite of an employee social responsibility behavioural measurement tool is acknowledged. Accordingly, the subsequent chapters outline the process of scale development and validation, resulting in a robust, reliable and valid employee social responsibility scale. This scale is then put to use in a field study, and the noteworthy roles of the antecedent and boundary conditions of transformational leadership, assigned CSR priority, and CSR climate are confirmed at the group and individual levels. The directionality of these relationships is subsequently explored in a time-lagged investigation set within a simulated business environment. The thesis collates and discusses the contributions of the findings from the research series, which highlight a consistent three-way interaction effect of transformational leadership, assigned CSR priority and CSR climate. Finally, various avenues for future research are outlined, given the infancy of the micro-level study of employee social responsibility.
Abstract:
When applying multivariate analysis techniques in information systems and social science disciplines, such as management information systems (MIS) and marketing, the assumption that the empirical data originate from a single homogeneous population is often unrealistic. When applying a causal modeling approach, such as partial least squares (PLS) path modeling, segmentation is a key issue in coping with the problem of heterogeneity in estimated cause-and-effect relationships. This chapter presents a new PLS path modeling approach which classifies units on the basis of the heterogeneity of the estimates in the inner model. If unobserved heterogeneity significantly affects the estimated path model relationships at the aggregate data level, the methodology allows homogeneous groups of observations to be created that exhibit distinctive path model estimates. The approach thus provides differentiated analytical outcomes that permit more precise interpretations of each segment formed. An application to a large data set, in an example based on the American Customer Satisfaction Index (ACSI), substantiates the methodology's effectiveness in evaluating PLS path modeling results.
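The sketch below illustrates the general idea behind such segmentation, not the chapter's actual algorithm: units generated by two segments with different inner-model coefficients, where a pooled regression averages the effects away and a small EM algorithm for a mixture of regressions recovers segment-specific estimates. The variable roles (a 'satisfaction' driver and a 'loyalty' outcome, echoing the ACSI example) are simulated assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 400
x = rng.standard_normal(n)                 # latent 'satisfaction' score
seg = rng.random(n) < 0.5                  # unobserved segment membership
y = np.where(seg, 0.9 * x, -0.2 * x) + 0.3 * rng.standard_normal(n)  # 'loyalty'

print("pooled slope:", np.polyfit(x, y, 1)[0])   # misleading average effect

# EM for a two-component mixture of simple regressions through the origin.
b = np.array([0.5, -0.5]); sigma = np.array([1.0, 1.0]); pi = np.array([0.5, 0.5])
for _ in range(200):
    dens = np.stack([pi[k] * norm.pdf(y, b[k] * x, sigma[k]) for k in (0, 1)])
    r = dens / dens.sum(axis=0)                          # responsibilities
    for k in (0, 1):
        w = r[k]
        b[k] = np.sum(w * x * y) / np.sum(w * x * x)     # weighted LS slope
        sigma[k] = np.sqrt(np.sum(w * (y - b[k] * x) ** 2) / w.sum())
        pi[k] = w.mean()
print("segment slopes:", b.round(2))   # recovers ~0.9 and ~-0.2
```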
Abstract:
Market-level information diffused by print media may contribute to the legitimation of an emerging technology and thus influence the diffusion of competing technological standards. After analyzing more than 10,000 trade media abstracts from the Local Area Networks (LAN) industry published between 1981 and 2000, we found the presence of differential effects on the adoption of competing standards by two market-level information types: technology and product availability. The significance of these effects depends on the technology's order of entry and suggests that high-tech product managers should make strategic use of market-level information by appropriately focusing the content of their communications. © 2007 Elsevier B.V. All rights reserved.
Abstract:
We demonstrate a liquid level sensor based on surrounding-medium refractive index (SRI) sensing using an excessively tilted fibre Bragg grating (ETFBG). The sensor has low thermal cross-sensitivity and high SRI responsivity.
Abstract:
We propose a novel technique for optical liquid level sensing. The technique exploits optical spectrum spreading and directly measures liquid level in a digital format. The performance of the sensor does not suffer from changes in environmental or system variables. Due to its distinct measurement principle, both high resolution and a large measurement range can be achieved simultaneously.
Abstract:
We address the question of how to obtain effective fusion of identification information such that it is robust to the quality of this information. As well as technical issues, data fusion is encumbered with a collection of (potentially confusing) practical considerations. These considerations are described in the early chapters, in which a framework for data fusion is developed. Following this process of diversification it becomes clear that the original question is not well posed and requires more precise specification. We use the framework to focus on some of the technical issues relevant to the question being addressed. We show that fusion of hard decisions through an adaptive version of the maximum a posteriori decision rule yields acceptable performance. Better performance is possible using probability-level fusion, as long as the probabilities are accurate. Of particular interest is the prevalence of overconfidence and the effect it has on fused performance. The production of accurate probabilities from poor-quality data forms the latter part of the thesis. Two approaches are taken. Firstly, the probabilities may be moderated at source (either analytically or numerically). Secondly, the probabilities may be transformed at the fusion centre. In each case an improvement in fused performance is demonstrated. We therefore conclude that, in order to obtain robust fusion, care should be taken to model the probabilities accurately, either at the source or centrally.
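A minimal simulated sketch of the contrast drawn above between hard-decision and probability-level fusion, and of the effect of moderating overconfident probabilities. The 'moderation' here is simple tempering by a factor known from the simulation, standing in for the thesis's analytic and numerical schemes.

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 5000, 5
truth = rng.integers(0, 2, n)

# Each source sees a noisy score centred at +/-1; the calibrated posterior
# log-odds for this setup is 2*score, but the sources report 4*score,
# i.e. they are overconfident.
scores = (2 * truth[:, None] - 1) + rng.standard_normal((n, m))
logodds = 4 * scores
p = 1 / (1 + np.exp(-logodds))

# Hard-decision fusion: majority vote over per-source MAP decisions.
hard = ((p > 0.5).sum(axis=1) > m / 2).astype(int)
# Probability-level fusion: sum the per-source log-odds (product rule).
soft = (logodds.sum(axis=1) > 0).astype(int)
print("hard-vote accuracy:", (hard == truth).mean())
print("soft-fusion accuracy:", (soft == truth).mean())

def nll(fused):
    """Negative log-likelihood of the fused probability against the truth."""
    q = np.clip(1 / (1 + np.exp(-fused)), 1e-12, 1 - 1e-12)
    return -np.mean(truth * np.log(q) + (1 - truth) * np.log(1 - q))

# Overconfidence barely moves the fused decision here, but it wrecks the
# fused probability; moderating each source (here by the known factor 1/2,
# which in practice would be estimated) restores calibration.
print("fused NLL, raw:", nll(logodds.sum(axis=1)))
print("fused NLL, moderated:", nll((logodds / 2).sum(axis=1)))
```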
Abstract:
The proliferation of data throughout the strategic, tactical and operational areas of many organisations has created a need for the decision maker to be presented with structured information that is appropriate for achieving allocated tasks. However, despite this abundance of data, managers at all levels in the organisation commonly encounter a condition of 'information overload' that results in a paucity of the correct information. Specifically, this thesis focuses on the tactical domain within the organisation and the information needs of the managers who reside at this level. In doing so, it argues that the link between decision making at the tactical level and low-level transaction processing data should be a common object model that uses a framework based upon knowledge leveraged from co-ordination theory. To achieve this, the Co-ordinated Business Object Model (CBOM) was created. The CBOM is a two-tier framework: the first tier models data using four interacting object models, namely processes, activities, resources and actors; the second tier analyses the data captured by the four object models and returns information that can be used to support tactical decision making. In addition, the Co-ordinated Business Object Support System (CBOSS) is a prototype tool developed both to support the CBOM implementation and to demonstrate the functionality of the CBOM as a modelling approach for supporting tactical management decision making. Through its graphical user interface, the system allows the user to create and explore alternative implementations of an identified tactical-level process. To validate the CBOM, three verification tests were completed. The results provide evidence that the CBOM framework helps bridge the gap between low-level transaction data and the information that is used to support tactical-level decision making.
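As a rough illustration of the two-tier idea, the sketch below models the four first-tier objects (processes, activities, resources, actors) as Python classes and derives one second-tier figure, resource utilisation, from captured activity records. All field names, the utilisation query, and the data are hypothetical; they are not the CBOM's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str

@dataclass
class Resource:
    name: str
    capacity: float          # e.g. hours available per week

@dataclass
class Activity:
    name: str
    actor: Actor
    resources: list[Resource]
    hours: float             # captured from transaction-level data

@dataclass
class Process:
    name: str
    activities: list[Activity] = field(default_factory=list)

    def utilisation(self) -> dict[str, float]:
        """Tier-two style query: aggregate low-level activity records into a
        tactical view of how heavily each resource is loaded by this process."""
        load: dict[str, float] = {}
        for a in self.activities:
            for r in a.resources:
                load[r.name] = load.get(r.name, 0.0) + a.hours / r.capacity
        return load

clerk = Actor("order clerk")
desk = Resource("order desk", capacity=37.5)
proc = Process("order handling", [Activity("enter order", clerk, [desk], 20.0)])
print(proc.utilisation())   # {'order desk': 0.533...}
```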