948 results for Multicast Application Level


Relevance: 30.00%

Abstract:

Software development methodologies are becoming increasingly abstract, progressing from low-level assembly and implementation languages such as C and Ada, to component-based approaches that can be used to assemble applications using technologies such as JavaBeans and the .NET Framework. Meanwhile, model-driven approaches emphasise the role of higher-level models and notations, and embody a process of automatically deriving lower-level representations and concrete software implementations. The relationship between data and software is also evolving. Modern data formats are becoming increasingly standardised, open and empowered in order to support a growing need to share data in both academia and industry. Many contemporary data formats, most notably those based on XML, are self-describing, able to specify valid data structure and content, and can also describe data manipulations and transformations. Furthermore, while applications of the past have made extensive use of data, the runtime behaviour of future applications may be driven by data, as demonstrated by the field of dynamic data-driven application systems. The combination of empowered data formats and high-level software development methodologies forms the basis of modern game development technologies, which drive software capabilities and runtime behaviour using empowered data formats describing game content. While low-level libraries provide optimised runtime execution, content data is used to drive a wide variety of interactive and immersive experiences. This thesis describes the Fluid project, which combines component-based software development and game development technologies in order to define novel component technologies for the description of data-driven component-based applications. The thesis makes explicit contributions to the fields of component-based software development and visualisation of spatiotemporal scenes, and also describes potential implications for game development technologies.
The thesis also proposes a number of developments in dynamic data-driven application systems in order to further empower the role of data in this field.
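As a toy illustration of the data-driven idea described above, the sketch below shows a self-describing XML content document being used to construct runtime objects directly from data. The format and names here are invented for illustration, not the Fluid project's actual technology:

```python
# A self-describing content document drives runtime behaviour: the
# application's entities exist only because the data describes them.
# (Hypothetical content format, invented for illustration.)
import xml.etree.ElementTree as ET

content = """
<scene>
    <entity name="player" speed="2.0"/>
    <entity name="enemy" speed="1.5"/>
</scene>
"""

def load_entities(xml_text):
    """Build runtime objects directly from the content data."""
    root = ET.fromstring(xml_text)
    return {e.get("name"): float(e.get("speed")) for e in root.findall("entity")}

entities = load_entities(content)
```

Changing the XML changes the application's behaviour without touching code, which is the essence of the content-driven pipelines the abstract describes.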

Relevance: 30.00%

Abstract:

This thesis examines the dynamics of firm-level financing and investment decisions for six Southeast Asian countries. The study provides empirical evidence on the impacts of changes in firm-level financing decisions during the period of financial liberalization by considering the debt and equity financing decisions of a set of non-financial firms. The empirical results show that firms in Indonesia, Pakistan, and South Korea adjust towards optimal debt and equity ratios relatively faster than firms in the other countries studied, in response to banking sector and stock market liberalization. In addition, contrary to the widely held belief that firms adjust their financial ratios to industry levels, the results indicate that industry factors do not significantly affect the speed of capital structure adjustment. This study also shows that non-linear estimation methods are more appropriate than linear estimation methods for capturing changes in capital structure. The empirical results also show that international stock market integration of these countries has significantly reduced the equity risk premium as well as the firm-level cost of equity capital. Thus, stock market liberalization is associated with a decrease in the cost of equity capital of firms. Developments in securities market infrastructure have also reduced the cost of equity capital. However, with increased integration there is the possibility of capital outflows from the emerging markets, which might reverse the pattern of decreasing cost of capital in these markets.
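The "speed of adjustment" findings refer to the standard partial-adjustment model of capital structure, in which a firm closes a fraction of the gap between its current and target debt ratio each period. A minimal sketch, with invented numbers rather than the thesis's estimates:

```python
# Partial-adjustment model: debt_t = debt_{t-1} + lam * (target - debt_{t-1}),
# where lam is the speed of adjustment (lam = 1 means instant adjustment).
# Initial ratio, target and speeds below are illustrative only.
def adjust_path(initial, target, lam, periods):
    """Simulate a debt-ratio path converging toward its target."""
    path = [initial]
    for _ in range(periods):
        prev = path[-1]
        path.append(prev + lam * (target - prev))
    return path

fast = adjust_path(0.2, 0.5, 0.6, 5)   # higher speed of adjustment
slow = adjust_path(0.2, 0.5, 0.2, 5)   # lower speed of adjustment
```

After the same number of periods, the faster-adjusting firm ends closer to its target ratio, which is what the cross-country comparison in the abstract measures.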

Relevance: 30.00%

Abstract:

The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation, the only technique capable of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient, in terms of data requirements and both speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick, easy and allows models to expand in line with design developments. However, current approaches to computer simulation are totally inappropriate to support such hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterized by a requirement for very specialist expertise, a lengthy model development phase, and a correspondingly high expenditure, resulting in very little and rather inappropriate use of the technique. Simulation, when used, is generally only applied to check or verify a final design proposal. Rarely is the full potential of computer simulation utilized to aid, support or complement the manufacturing system design procedure.
To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, as such systems require no specialist expertise, instead facilitating quick and easy model creation, execution and modification through simple data inputs. Previously, generic simulators have tended to be too restricted, lacking the necessary flexibility to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
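As a toy illustration of the data-driven, expandable modelling idea (not ATOMS itself; stations and cycle times are invented), the sketch below describes a manufacturing line entirely by simple data inputs, so the model grows in line with the design simply by editing those inputs:

```python
# A data-driven "generic" line model: the design is described purely by data
# (cycle times in minutes per part), so evaluation needs no specialist code.
def line_throughput(cycle_times, minutes):
    """Parts produced by a serial line, paced by its bottleneck station."""
    bottleneck = max(cycle_times)        # slowest station limits the line
    return int(minutes // bottleneck)

design_a = [1.0, 1.5, 1.2]               # three-station design
design_b = design_a + [2.0]              # expanded model: a fourth station added
per_shift_a = line_throughput(design_a, 480)   # one 8-hour shift
per_shift_b = line_throughput(design_b, 480)
```

Expanding the model from `design_a` to `design_b` immediately shows the throughput cost of the new station, the kind of quick continuous assessment the multi-level procedure calls for.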

Relevance: 30.00%

Abstract:

Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and to identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempt to place the differences between theory and practice in a system in quantitative perspective, nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention in an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
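A minimal sketch of the kind of stock control simulation described above, computing the two control indices - Service Level (fraction of demand met from stock) and Average Stock Value. The demand series, reorder policy and unit cost are invented for illustration:

```python
# Simple reorder-point simulation: when stock falls to the reorder point,
# an order is placed and arrives at the start of the next period.
# All parameters below are illustrative, not from the study.
def simulate(demand, reorder_point, order_qty, unit_cost, initial=50):
    stock, on_order = initial, 0
    met = total = 0
    levels = []
    for d in demand:
        stock += on_order                 # previous order arrives
        on_order = 0
        met += min(stock, d)              # demand satisfied from stock
        total += d
        stock = max(stock - d, 0)
        if stock <= reorder_point:
            on_order = order_qty          # place a replenishment order
        levels.append(stock)
    service_level = met / total
    avg_stock_value = unit_cost * sum(levels) / len(levels)
    return service_level, avg_stock_value

sl, asv = simulate([20, 35, 10, 40, 25, 30],
                   reorder_point=30, order_qty=40, unit_cost=2.0)
```

Varying the policy parameters (or letting a "Buyer" override the algorithm's orders) and re-running shows how each factor moves the two indices, which is the study's numerical method in miniature.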

Relevance: 30.00%

Abstract:

This study has concentrated on the development of an impact simulation model for use at the sub-national level. The necessity for this model was demonstrated by the growth of local economic initiatives during the 1970s, and the lack of monitoring and evaluation exercises to assess their success and cost-effectiveness. The first stage of research involved confirming that the potential for micro-economic and spatial initiatives existed. This was done by identifying the existence of involuntary structural unemployment. The second stage examined the range of employment policy options from the macro-economic, micro-economic and spatial perspectives, and focused on the need for evaluation of those policies. The need for spatial impact evaluation exercises in respect of other exogenous shocks and structural changes was also recognised. The final stage involved the investigation of current techniques of evaluation and their adaptation for the purpose in hand. This led to the recognition of a gap in the armoury of techniques. The employment-dependency model has been developed to fill that gap, providing a low-budget model, capable of implementation at the small-area level, that generates a vast array of industrially disaggregated data - in terms of employment, employment-income, profits, value-added and gross income - related to levels of United Kingdom final demand, thus providing scope for a variety of impact simulation exercises.
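An employment-dependency model of this kind rests on input-output logic: the gross output x needed to satisfy final demand d solves x = (I - A)^-1 d, and employment follows via jobs-per-unit-output coefficients. A two-industry sketch with invented coefficients, not the study's data:

```python
# Input-output core: solve x = A x + d for a 2x2 technical-coefficients
# matrix A, then derive employment from output. Numbers are illustrative.
def gross_output(A, d):
    """Closed-form solve of (I - A) x = d for the 2x2 case."""
    a, b = A[0]
    c, e = A[1]
    det = (1 - a) * (1 - e) - b * c
    x0 = ((1 - e) * d[0] + b * d[1]) / det
    x1 = (c * d[0] + (1 - a) * d[1]) / det
    return [x0, x1]

A = [[0.1, 0.2],
     [0.3, 0.1]]                 # inter-industry purchases per unit of output
demand = [100.0, 50.0]           # final demand by industry
x = gross_output(A, demand)
jobs = [0.02 * x[0], 0.05 * x[1]]  # hypothetical jobs per unit of output
```

Shocking `demand` and re-solving gives the industrially disaggregated employment impacts that the abstract says the model was built to simulate.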

Relevance: 30.00%

Abstract:

This study examines the internal dynamics of white-collar trade union branches in the public sector. The effects of a number of internal and external factors on branch patterns of action are evaluated. For the purposes of the study, branch action is taken to be the approach to issues of job regulation, as expressed along the five dimensions of dependence on the outside trade union, focus of the issues adopted, initiation of issues, intensity of action in issue pursuit, and representativeness. The setting chosen for the study is four branches drawn from the same geographical area of the National and Local Government Officers Association. Branches were selected to give a variety of industry settings while controlling for the potentially influential variables of branch size, density of trade union membership and possession of exclusive representational rights in the employing organisation. Identical methods of data collection were used for each branch. The principal findings of the study are that the framework of national agreements and industry collective bargaining structures are strongly related to the industrial relations climate in the employing organisation and the structures of representation within the branch. Where agreements and collective bargaining structures formally restrict branch job regulation roles, there is a degree of devolution of bargaining authority from branch-level negotiators to autonomous shop stewards at workplace level. In these circumstances the industrial relations climate is characterised by a degree of informality in relationships between management and trade union activists. In turn, industrial relations climate and representative structures, together with actor attitudes, have strong effects on all dimensions of the approach to issues of job regulation.

Relevance: 30.00%

Abstract:

The possibility that developmental dyslexia results from low-level sensory processing deficits has received renewed interest in recent years. Opponents of such sensory-based explanations argue that dyslexia arises primarily from phonological impairments. However, many behavioural correlates of dyslexia cannot be explained sufficiently by cognitive-level accounts, and there is anatomical, psychometric and physiological evidence of sensory deficits in the dyslexic population. This thesis aims to determine whether the low-level (pre-attentive) processing of simple auditory stimuli is disrupted in compensated adult dyslexics. Using psychometric and neurophysiological measures, the nature of auditory processing abnormalities is investigated. Group comparisons are supported by analysis of individual data in order to address the issue of heterogeneity in dyslexia. The participant pool consisted of seven compensated dyslexic adults and seven age- and IQ-matched controls. The dyslexic group were impaired, relative to the control group, on measures of literacy, phonological awareness, working memory and processing speed. Magnetoencephalographic recordings were conducted during processing of simple, non-speech, auditory stimuli. Results confirm that low-level auditory processing deficits are present in compensated dyslexic adults. The amplitude of N1m responses to tone-pair stimuli was reduced in the dyslexic group. However, there was no evidence that manipulating either the silent interval or the frequency separation between the tones had a greater detrimental effect on dyslexic participants specifically. Abnormal MMNm responses were recorded in response to frequency-deviant stimuli in the dyslexic group. In addition, complete stimulus omissions, which evoked MMNm responses in all control participants, failed to elicit significant MMNm responses in all but one of the dyslexic individuals.
The data indicate both a deficit of frequency resolution at a local level of auditory processing and a higher-level deficit relating to the grouping of auditory stimuli, relevant for auditory scene analysis. Implications and directions for future research are outlined.

Relevance: 30.00%

Abstract:

There are now more postgraduate programmes that include qualitative methods in psychology than ever before. This poses problems for teaching qualitative methods at M level because we still lack consistency in what qualitative methods are taught at the undergraduate level. Although the British Psychological Society requires accredited undergraduate programmes to include qualitative methods, we hear very different stories from colleagues across the UK about provision and quality. In this article, we present a dialogue between learner and teacher about our own experiences of qualitative methods in psychology at M level. We report our own learning experiences of qualitative methods at the undergraduate level, reflect on current methods of teaching at M level, and consider ways of moving forward. As well as focusing specifically on current practice at our institution, our discussions also branch out into wider issues around the fundamental characteristics of qualitative methods, pragmatically and philosophically, as well as our own accounts of what we enjoy most about using qualitative methods in psychology.

Relevance: 30.00%

Abstract:

This research involves a study of the questions of what is considered safe, how safety levels are defined or decided, and according to whom. Tolerable or acceptable risk questions raise various issues: about the values and assumptions inherent in such levels; about decision-making frameworks at the highest level of policy making as well as at the individual level; and about the suitability and competency of decision-makers to decide and to communicate their decisions. The wide-ranging topics covering philosophical and practical concerns examined in the literature review reveal the multi-disciplined scope of this research. To support this theoretical study, empirical research was undertaken at the European Space Research and Technology Centre (ESTEC) of the European Space Agency (ESA). ESTEC is a large, multi-nationality, high-technology organisation which presented an ideal case study for exploring how decisions are made with respect to safety, from a personal as well as an organisational aspect. A qualitative methodology was employed to gather, analyse and report the findings of this research. Significant findings reveal how experts perceive risks, and the prevalence of informal decision-making processes, partly due to the inadequacy of formal methods for deciding risk tolerability. In the field of occupational health and safety, this research has highlighted the importance of, and need for, criteria to decide whether a risk is great enough to warrant attention in setting standards and priorities for risk control and resources. From a wider perspective, and with the recognition that risk is an inherent part of life, the establishment of risk tolerability levels can be viewed as a cornerstone indicating our progress, expectations and values, of life and work, in an increasingly litigious, knowledgeable and global society.

Relevance: 30.00%

Abstract:

Despite the difficulties that we have regarding the use of English in tertiary education in Turkey, we argue that it is necessary for those involved to study in the medium of English. Furthermore, significant advances have been made on this front. These efforts have been for the most part language-oriented, but also include research into needs analysis and the pedagogy of team-teaching. Considering the current situation at this level of education, however, there still seems to be more to do. And the question is, what more can we do? What further contribution can we make? Or, how can we take this process further? The purpose of the study reported here is to respond to this last question. We test the proposition that it is possible to take this process further by investigating the efficient management of transition from Turkish-medium to English-medium at the tertiary level of education in Turkey. Beyond what is achieved by only the language orientation of the EAP approach, and moving conceptually deeper than what has been achieved by the team-teaching approach, the research undertaken for the purpose of this study focuses on the idea of the discourse community that people want to belong to. It then pursues an adaptation of the essentially psycho-social approach of apprenticeship, as people become aspirants and apprentices to that discourse community. In this thesis, the researcher recognises that she cannot follow all the way through to the full implementation of her ideas in a fully-taught course. She is not in a position to change the education system. What she does here is to introduce a concept and sample its effects in terms of motivation, and thereby of integration and of success, for individuals and groups of learners. 
Evaluation is provided by acquiring both qualitative and quantitative data concerning mature members' perceptions of apprenticed-neophytes functioning as members in the new community, apprenticed-neophytes' perceptions of their own membership and of the preparation process undertaken, and the comparison of these neophytes' performance with that of other neophytes in the community. The data obtained provide strong evidence in support of the potential usefulness of this apprenticeship model towards the declared purpose of improving the English-medium tertiary education of Turkish students in their chosen fields of study.

Relevance: 30.00%

Abstract:

Adaptability for distributed object-oriented enterprise frameworks is a critical mission for system evolution. Today, building adaptive services is a complex task due to the lack of adequate framework support in the distributed computing environment. In this thesis, we propose a Meta Level Component-Based Framework (MELC) which uses distributed computing design patterns as components to develop an adaptable pattern-oriented framework for distributed computing applications. We describe our novel approach of combining a meta architecture with a pattern-oriented framework, resulting in an adaptable framework which provides a mechanism to facilitate system evolution. The critical nature of distributed technologies requires frameworks to be adaptable. Our framework employs a meta architecture. It supports dynamic adaptation of feasible design decisions in the framework design space by specifying and coordinating meta-objects that represent various aspects within the distributed environment. The meta architecture in the MELC framework can provide the adaptability needed for system evolution. This approach resolves the problem of dynamic adaptation in the framework, which is encountered in most distributed applications. The concept of using a meta architecture to produce an adaptable pattern-oriented framework for distributed computing applications is new and has not previously been explored in research. As the framework is adaptable, the proposed architecture of the pattern-oriented framework has the ability to adapt new design patterns dynamically to address technical system issues in the domain of distributed computing, and these patterns can be woven together to shape the framework in future. We show how MELC can be used effectively to enable dynamic component integration and to separate system functionality from business functionality. We demonstrate how MELC provides an adaptable and dynamic run-time environment using our system configuration and management utility.
We also highlight how MELC supports significant adaptability in system evolution through a prototype E-Bookshop application that assembles its business functions with distributed computing components at the meta level in the MELC architecture. Our performance tests show that MELC does not entail prohibitive performance trade-offs. The work to develop the MELC framework for distributed computing applications has emerged as a promising way to meet current and future challenges in the distributed environment.
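As a toy illustration of the meta-architecture idea (a hypothetical API, not MELC's actual interface), the sketch below represents a design decision as a meta-object that can be rebound at run time, leaving the business-level call sites untouched:

```python
# Meta-objects represent design decisions (here, the messaging strategy).
# Rebinding one adapts the system without changing base-level code.
class MetaLevel:
    """Registry of meta-objects; dispatch goes through the current binding."""
    def __init__(self):
        self._aspects = {}

    def bind(self, aspect, meta_object):
        self._aspects[aspect] = meta_object

    def dispatch(self, aspect, *args):
        return self._aspects[aspect](*args)

def unicast(msg):
    return [f"unicast:{msg}"]

def multicast(msg):
    return [f"multicast:{msg}", f"multicast:{msg}"]

meta = MetaLevel()
meta.bind("messaging", unicast)
before = meta.dispatch("messaging", "order#1")
meta.bind("messaging", multicast)        # dynamic adaptation at run time
after = meta.dispatch("messaging", "order#1")
```

The business code only ever calls `dispatch("messaging", ...)`; which distribution pattern actually runs is a meta-level decision, mirroring the separation of system from business functionality described above.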

Relevance: 30.00%

Abstract:

This research project has developed a novel decision support system using Geographical Information Systems and Multi-Criteria Decision Analysis, and used it to develop and evaluate energy-from-waste policy options. The system was validated by applying it to the UK administrative areas of Cornwall and Warwickshire. Different strategies were defined by the size and number of the facilities, as well as the technology chosen. Using sensitivity analysis on the results from the decision support system, it was found that key decision criteria included those affected by cost, energy efficiency, transport impacts and air/dioxin emissions. The conclusions of this work are that distributed small-scale energy-from-waste facilities score most highly overall, and that scale is more important than technology design in determining overall policy impact. This project makes its primary contribution to energy-from-waste planning through its development of a Decision Support System that can be used to assist waste disposal authorities in identifying preferred energy-from-waste options tailored specifically to the socio-geographic characteristics of their jurisdictional areas. The project also highlights the potential of energy-from-waste policies that are seldom given enough attention in the UK, namely those of a smaller-scale and distributed nature that often have technology designed specifically to cater for this market.
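A minimal sketch of the weighted-sum multi-criteria scoring such a decision support system might apply when ranking options. The criteria, weights and scores below are invented for illustration, not the project's actual data:

```python
# Weighted-sum MCDA: each option gets a score per criterion (0-1, higher is
# better); the overall score is the weight-normalised sum. Values invented.
def mcda_score(scores, weights):
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

weights = {"cost": 0.3, "energy_efficiency": 0.3,
           "transport": 0.2, "emissions": 0.2}
small_distributed = {"cost": 0.7, "energy_efficiency": 0.8,
                     "transport": 0.9, "emissions": 0.8}
large_central = {"cost": 0.8, "energy_efficiency": 0.7,
                 "transport": 0.4, "emissions": 0.6}

s_small = mcda_score(small_distributed, weights)
s_large = mcda_score(large_central, weights)
```

Perturbing the weights and re-scoring is exactly the sensitivity analysis used to identify which criteria drive the ranking.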

Relevance: 30.00%

Abstract:

This thesis analyses the impact of workplace stressors and mood on innovation activities. Based on three competing frameworks offered by cognitive spreading activation theory, the mood repair perspective, and mood-as-information theory, different sets of predictions are developed. These hypotheses are tested in a field study involving 41 R&D teams and 123 individual R&D workers, and in an experimental study involving 54 teams of students. Results of the field study suggest that stressors and mood interact to predict innovation activities in such a way that, with increasing stressors, a high positive (or negative) mood is more detrimental to innovation activities than a low positive (or negative) mood, lending support to the mood repair perspective. These effects are found for both individuals and teams. In the experimental study this effect is replicated and potential boundary conditions and mediators are tested. In addition, this thesis includes the development of an instrument to assess creativity and implementation activities within the realm of task-related innovative performance.

Relevance: 30.00%

Abstract:

Influential models of edge detection have generally supposed that an edge is detected at peaks in the 1st derivative of the luminance profile, or at zero-crossings in the 2nd derivative. However, when presented with blurred triangle-wave images, observers consistently marked edges not at these locations, but at peaks in the 3rd derivative. This new phenomenon, termed ‘Mach edges’, persisted when a luminance ramp was added to the blurred triangle-wave. Modelling of these Mach edge detection data required the addition of a physiologically plausible filter, prior to the 3rd derivative computation. A viable alternative model was examined, on the basis of data obtained with short-duration, high spatial-frequency stimuli. Detection and feature-marking methods were used to examine the perception of Mach bands in an image set that spanned a range of Mach band detectabilities. A scale-space model that computed edge and bar features in parallel provided a better fit to the data than 4 competing models that combined information across scale in a different manner, or computed edge or bar features at a single scale. The perception of luminance bars was examined in 2 experiments. Data for one image set suggested a simple rule for the perception of a small Gaussian bar on a larger inverted Gaussian bar background. In previous research, discriminability (d’) has typically been reported to be a power function of contrast, where the exponent (p) is 2 to 3. However, using bar, grating, and Gaussian edge stimuli, with several methodologies, values of p were obtained that ranged from 1 to 1.7 across 6 experiments. This novel finding was explained by appealing to low stimulus uncertainty, or a near-linear transducer.
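The triangle-wave result can be illustrated numerically: blur a triangle-wave luminance profile and the magnitude of its 3rd derivative peaks near the blurred corner rather than on the ramps. The sketch below uses an illustrative profile and blur, not the thesis's actual stimuli or model:

```python
# Blur a triangle profile, take discrete 1st/2nd/3rd differences, and find
# where the 3rd derivative is strongest. Profile and sigma are illustrative.
import math

def gaussian_blur(signal, sigma):
    radius = int(3 * sigma)
    kernel = [math.exp(-0.5 * (i / sigma) ** 2) for i in range(-radius, radius + 1)]
    norm = sum(kernel)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - radius, 0), len(signal) - 1)  # clamp at edges
            acc += w * signal[j]
        out.append(acc / norm)
    return out

def diff(signal):
    """Discrete derivative (first difference)."""
    return [b - a for a, b in zip(signal, signal[1:])]

# Triangle wave: up-ramp then down-ramp, with its corner at index 50.
triangle = [i if i < 50 else 100 - i for i in range(100)]
blurred = gaussian_blur(triangle, 4.0)
d3 = diff(diff(diff(blurred)))
corner = max(range(len(d3)), key=lambda i: abs(d3[i]))  # strongest 3rd-derivative feature
```

The 1st derivative is large all along the ramps, so its peaks do not isolate the corner; the 3rd-derivative extrema cluster around the blurred corner, which is where observers marked Mach edges.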

Relevance: 30.00%

Abstract:

Nonlinearity management is explored as a complete tool to obtain maximum transmission reach in a WDM fiber transmission system, making it possible to optimize multiple system parameters, including optimal dispersion pre-compensation, with fast simulations based on the continuous-wave approximation.