27 results for Frame-timing
in Aston University Research Archive
The Long-Term Impact of Business Support? - Exploring the Role of Evaluation Timing Using Micro Data
Abstract:
The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice for business support, with a focus on the timing of evaluation. The time frame generally applied for business support policy evaluation is limited to one or two, seldom three, years post-intervention. This is despite calls for long-term impact studies by various authors concerned about time lags before effects are fully realised. The desire for long-term evaluation conflicts with the requirements of policy-makers and funders, who seek quick results. Moreover, current ‘best practice’ frameworks do not refer to timing or its implications, and data availability constrains the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets, so that data availability problems are avoided through the use of secondary data. Thirdly, this thesis builds the evidence base through a longitudinal impact study of small business support in England covering seven years of post-intervention data. This illustrates the variability of results across evaluation periods and the value of using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate but limited. Concerning growth, significant impact centres on a two to three year period post-intervention for the linear selection and quantile regression models: positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and for a greater appreciation by evaluators of the time factor. This analysis recommends a time frame of four to five years post-intervention for the evaluation of soft business support.
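The abstract mentions linear selection and quantile regression models estimated over different post-intervention windows. As a rough, hedged illustration of that style of analysis (simulated data and hypothetical variable names, not the author's specification), a median regression per evaluation window might look like this:

```python
# Sketch: estimating treatment effects of business support for several
# post-intervention windows with quantile regression (statsmodels).
# The data layout and all variable names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
panel = pd.DataFrame({
    "treated": rng.integers(0, 2, n),              # 1 = received support
    "year_post": rng.integers(1, 8, n),            # years since intervention (1..7)
    "baseline_emp": rng.lognormal(2.0, 1.0, n),    # pre-intervention employment
})
# Simulated outcome: effect peaks around years 2-3, mirroring the reported finding.
effect = np.where(panel["year_post"].between(2, 3), 0.08, 0.02)
panel["emp_growth"] = (effect * panel["treated"]
                       + 0.01 * np.log(panel["baseline_emp"])
                       + rng.normal(0, 0.15, n))

# Median (tau = 0.5) regression estimated separately for each evaluation window.
for window in range(1, 8):
    sub = panel[panel["year_post"] == window]
    fit = smf.quantreg("emp_growth ~ treated + np.log(baseline_emp)", sub).fit(q=0.5)
    print(f"t+{window}: treated coef = {fit.params['treated']:.3f}")
```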
Abstract:
Knowledge maintenance is a major challenge for both knowledge management and the Semantic Web. A network of collaborating agents, each with its own ontologies or knowledge bases, will operate over the Semantic Web. A change in the knowledge state of one agent may need to be propagated across a number of agents and their associated ontologies. The challenge is to decide how to propagate a change of knowledge state. The effects of a change in knowledge state cannot be known in advance, so an agent cannot know who should be informed unless it adopts a simple ‘tell everyone - everything’ strategy. This situation is highly reminiscent of the classic Frame Problem in AI. We argue that for agent-based technologies to succeed, far greater attention must be given to creating an appropriate model for knowledge update. In a closed system, simple strategies are possible (e.g. ‘sleeping dog’, ‘cheap test’, or even complete checking). However, in an open system where cause and effect are unpredictable, a coherent cost-benefit model of agent interaction is essential; otherwise, the effectiveness of every act of knowledge update and maintenance is brought into question.
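As a purely illustrative sketch of the cost-benefit style of update propagation the abstract argues for (the cost terms and the threshold rule are assumptions, not the authors' model), an agent might weigh the expected penalty of a peer acting on stale knowledge against the cost of notifying it:

```python
# Toy cost-benefit policy for propagating a knowledge update between agents.
# The notion of "cost" and the numbers are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Peer:
    name: str
    notify_cost: float        # cost of sending and integrating the update
    use_probability: float    # chance the peer acts on the affected knowledge
    stale_penalty: float      # cost incurred if it acts on stale knowledge

def should_notify(peer: Peer) -> bool:
    """Notify only when the expected penalty of staleness exceeds the
    cost of propagation (contrast with 'tell everyone - everything')."""
    expected_penalty = peer.use_probability * peer.stale_penalty
    return expected_penalty > peer.notify_cost

peers = [
    Peer("scheduler-agent", notify_cost=1.0, use_probability=0.9, stale_penalty=10.0),
    Peer("archive-agent", notify_cost=1.0, use_probability=0.05, stale_penalty=2.0),
]
for p in peers:
    print(p.name, "->", "notify" if should_notify(p) else "skip")
```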
Abstract:
This rejoinder reflects an important step, for me, in a preoccupation with methodology that has provided me with many hours of enjoyable reading, not to mention anxiety. For me the ‘reality’ of the incommensurable nature of paradigms and acceptance of the legitimacy of a range of conceptual and philosophical traditions came late. As a constructionist I find myself on the ‘anything goes’ end of methodology choice. This paper and my main paper ought not to be read as a critique of ‘middle range’ theory, but as a critique of an important and necessary aspect of the way we all seek to inscribe facts and structure our writing. What follows is a reflection of the influence Bruno Latour’s writings have had on my ways of seeing and perhaps an unhealthy emphasis on the small things that combine to produce convincing arguments and ‘facts’.
Abstract:
We demonstrate a novel dual-wavelength erbium-fiber laser that uses a single nonlinear-optical loop mirror modulator to simultaneously mode-lock two cavities with chirped fiber Bragg gratings as end mirrors. We show that this configuration produces synchronized soliton pulse trains with an ultra-low RMS inter-pulse-stream timing jitter of 620 fs, enabling application to multiwavelength systems at data rates in excess of 130 Gb/s.
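A quick back-of-the-envelope check, not part of the original abstract, relates the quoted 620 fs RMS jitter to the bit slot at 130 Gb/s:

```python
# Illustrative arithmetic: compare the quoted RMS jitter with the bit slot
# at 130 Gb/s.
rms_jitter_s = 620e-15          # 620 fs
bit_rate = 130e9                # 130 Gb/s
bit_period_s = 1.0 / bit_rate   # about 7.7 ps per bit slot

print(f"bit period: {bit_period_s * 1e12:.2f} ps")
print(f"jitter as fraction of bit slot: {rms_jitter_s / bit_period_s:.1%}")
```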
Abstract:
This study examines whether the timing of adoption of the UK Statement of Standard Accounting Practice No. 20 'Foreign Currency Translation' depended on firms' financial characteristics. Consistent with US studies, we find that early adopters tended to be larger firms and that variables such as growth options, profitability, leverage and management payout have strong predictive power. In general, the decision to adopt the Statement of Standard Accounting Practice No. 20 did not appear to adversely affect the profitability measures or dividend payout. Firms tended to adopt when the adverse economic consequences of adoption were likely to be minimal. They also appeared to defer adoption of the standard to influence their financial performance and, hence, to achieve certain corporate financial objectives. © 2006 AFAANZ.
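As a hedged sketch of the kind of adoption-timing analysis described (simulated data and hypothetical column names, not the authors' dataset or exact specification), a logistic regression of early adoption on firm characteristics could be set up as follows:

```python
# Sketch: logistic regression of early SSAP 20 adoption on firm characteristics.
# Column names and the simulated data are purely illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
firms = pd.DataFrame({
    "log_size": rng.normal(10, 2, n),
    "growth_options": rng.normal(1.2, 0.4, n),   # e.g. a market-to-book proxy
    "profitability": rng.normal(0.08, 0.05, n),
    "leverage": rng.uniform(0, 0.6, n),
    "payout": rng.uniform(0, 0.8, n),
})
# Simulate larger firms being more likely to adopt early (mirrors the finding).
logit_p = -6 + 0.5 * firms["log_size"] + rng.normal(0, 1, n)
firms["early_adopter"] = (logit_p > 0).astype(int)

model = smf.logit(
    "early_adopter ~ log_size + growth_options + profitability + leverage + payout",
    data=firms,
).fit(disp=0)
print(model.params.round(3))
```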
Abstract:
Bilateral corneal blindness represents a quarter of the total blind worldwide. The artificial cornea, in assorted forms, was developed to replace opaque non-functional corneas and to return sight in otherwise hopeless cases that were not amenable to corneal grafts, believed to be 2% of the corneal blind. Despite technological advances in materials design and tissue engineering, no artificial cornea has provided absolute, long-term success. Formidable problems exist, due to a combination of unpredictable wound healing and unmanageable pathology. To have a solid guarantee of reliable success, an artificial cornea must possess three attributes: an optical window to replace the opaque cornea; a strong, long-term union to surrounding ocular tissue; and the ability to induce desired host responses. A unique artificial cornea possesses all three functional attributes: the Osteo-odonto-keratoprosthesis (OOKP). The OOKP has a high success rate and can survive for up to twenty years, but it is complicated both in structure and in surgical procedure; it is expensive and not universally available. The aim of this project was to develop a synthetic substitute for the OOKP, based upon key features of tooth and bone structure, thereby reducing surgical complexity and biological complications. Analysis of the biological effectiveness of the OOKP showed that the structure of bone was the most crucial component for implant retention. An experimental semi-rigid hydroxyapatite framework was fabricated with a complex bone-like architecture, which could be fused to the optical window. The first method for making such a framework was pressing and sintering of hydroxyapatite powders; however, it was not possible to fabricate a void architecture with the correct sizes and uniformity of pores. Ceramers were synthesised using alternative pore-forming methods, providing improved mechanical properties and stronger attachment to the plastic optical window. Naturally occurring skeletal structures closely match the structural features of all forms of natural bone. Synthetic casts of desirable natural structures, such as coral and sponges, were fabricated using the replamineform process. The final method of construction bypassed ceramic fabrication in favour of pre-formed coral derivatives and focused on methods for polymer infiltration, adhesion and fabrication. Prototypes were constructed and evaluated; a fully penetrative synthetic OOKP analogue was fabricated according to the dimensions of the OOKP. Fabrication of a cornea-shaped synthetic OOKP analogue was also attempted.
Abstract:
This thesis examines theoretically and experimentally the behaviour of a temporary end-plate connection for an aluminium space frame structure subjected to static loading conditions. Theoretical weld failure criteria are derived from fundamental principles for both tensile and shear fillet welds. Direct account of weld penetration is taken by incorporating it into a more exact proposed weld model. Theoretical relationships between weld penetration and weld failure loads, failure planes and failure lengths are derived. The variation in strength between tensile and shear fillet welds is also shown to depend upon the extent of weld penetration achieved. The proposed tensile weld failure theory is extended to predict the theoretical failure of the welds in the end-plate space frame connection. A finite element analysis is conducted to verify the assumptions made for this theory. Experimental hardness and tensile tests are conducted to substantiate the extent and severity of the heat-affected zone in aluminium alloy 6082-T6. Simple transverse and longitudinal fillet-welded specimens of the same alloy are tested to failure. These results, together with those of other authors, are compared with the theoretical predictions made by the proposed weld failure theories and by those made using Kamtekar's and Kato and Morita's failure equations, the β-formula and BS 8118. Experimental tests are also conducted on the temporary space frame connection. The maximum stresses and displacements recorded are checked against results obtained from a finite element analysis of the connection. Failure predictions made by the proposed extended weld failure theory are compared against the experimental results.
Abstract:
This thesis encompasses an investigation of the behaviour of a concrete frame structure under localised fire scenarios, implementing a constitutive model within a finite-element computer program. The investigation covered the properties of materials at elevated temperature, a description of the computer program, and thermal and structural analyses. Transient thermal properties of materials have been employed in this study to achieve reasonable results. The finite-element package ANSYS is utilized in the present analyses to examine the effect of fire on the concrete frame under five different fire scenarios. In addition, a report on the full-scale BRE Cardington concrete building, designed to Eurocode 2 and BS 8110 and subjected to a realistic compartment fire, is also presented. The transient analyses of the present model included specific heat added to the base value for dry concrete at temperatures between 100°C and 200°C. The combined convective-radiative heat transfer coefficient and transient thermal expansion have also been considered in the analyses. For the analyses with transient strains included, the constitutive model based on the empirical formulae of the full thermal strain-stress model proposed by Li and Purkiss (2005) is employed. Comparisons between the models with and without transient strains included are also discussed. The results of the present study indicate that the behaviour of the complete structure is significantly different from that of the individual isolated members on which current design methods are based. Although the current tabulated design procedures are conservative when the entire building performance is considered, it should be noted that the beneficial and detrimental effects of thermal expansion in complete structures should be taken into account. Developing new fire engineering methods from the study of complete structures, rather than from individual isolated member behaviour, is therefore essential.
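As a minimal, hedged illustration of the kind of transient thermal analysis described (illustrative material values and a simplified prescribed-temperature boundary rather than the combined convective-radiative coefficient used in the thesis), a 1D finite-difference conduction sketch with an added specific-heat peak between 100°C and 200°C could look like this:

```python
# Sketch: explicit 1D finite-difference heat conduction through a concrete
# member, with specific heat increased between 100 C and 200 C to mimic the
# moisture-related peak mentioned in the abstract. Material values are
# illustrative, not taken from the thesis or from code tables.
import numpy as np

L, nx = 0.20, 41                      # 200 mm section, 41 nodes
dx = L / (nx - 1)
k, rho = 1.6, 2300.0                  # conductivity (W/m.K), density (kg/m3)

def cp(T):
    """Base specific heat of dry concrete plus an added peak at 100-200 C."""
    base = 900.0
    return np.where((T >= 100.0) & (T <= 200.0), base + 600.0, base)

T = np.full(nx, 20.0)                 # initial temperature (C)
dt = 0.5                              # time step (s), small enough for stability
for step in range(int(3600 / dt)):    # one hour of exposure
    # Simplified boundary: exposed face follows a ramped gas temperature.
    T[0] = 20.0 + 800.0 * min(step * dt / 1800.0, 1.0)
    alpha = k / (rho * cp(T))         # temperature-dependent diffusivity
    T[1:-1] += alpha[1:-1] * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T[-1] = T[-2]                     # insulated far face

print(f"temperature 50 mm in after 1 h: {T[10]:.0f} C")
```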
Abstract:
Sensorimotor synchronization is hypothesized to arise through two different processes, associated with continuous or discontinuous rhythmic movements. This study investigated synchronization of continuous and discontinuous movements to different pacing signals (auditory or visual), pacing intervals (500, 650, 800, 950 ms) and across effectors (dominant vs. non-dominant hand). The results showed that the mean and variability of asynchronization errors were consistently smaller for discontinuous movements than for continuous movements. Furthermore, both movement types were timed more accurately with auditory pacing than with visual pacing, and were more accurate with the dominant hand. Shortening the pacing interval also improved sensorimotor synchronization accuracy in both continuous and discontinuous movements. These results show the dependency of the temporal control of movements on the nature of the motor task, the type and rate of extrinsic sensory information, and the efficiency of the motor actuators for sensory integration.
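The synchronization measures reported (mean and variability of asynchronies between movements and pacing onsets) can be computed directly; a minimal sketch with simulated tap times is shown below:

```python
# Sketch: mean and variability of asynchronies between taps and pacing onsets.
# Tap times here are simulated for illustration only.
import numpy as np

pacing_interval = 0.500                          # 500 ms condition
onsets = np.arange(20) * pacing_interval         # metronome onsets (s)
taps = onsets + np.random.default_rng(1).normal(-0.030, 0.020, onsets.size)

asynchronies = taps - onsets                     # negative = anticipation
print(f"mean asynchrony: {asynchronies.mean() * 1000:.1f} ms")
print(f"SD of asynchrony: {asynchronies.std(ddof=1) * 1000:.1f} ms")
```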
Abstract:
Motor timing tasks have been employed in studies of neurodevelopmental disorders such as developmental dyslexia and ADHD, where they provide an index of temporal processing ability. Investigations of these disorders have used different stimulus parameters within the motor timing tasks, which are likely to affect performance measures. Here we assessed the effect of auditory and visual pacing stimuli on synchronised motor timing performance and its relationship with cognitive and behavioural predictors that are commonly used in the diagnosis of these highly prevalent developmental disorders. Twenty-one children (mean age 9.6 years) completed a finger tapping task in two stimulus conditions, together with additional psychometric measures. As anticipated, synchronisation to the beat (ISI 329 ms) was less accurate in the visually paced condition. Decomposition of timing variance indicated that this effect resulted from differences in the way that visually and auditorily paced tasks are processed by central timekeeping and associated peripheral implementation systems. The ability to utilise an efficient processing strategy on the visual task correlated with both reading and sustained attention skills. Dissociations between these patterns of relationship across task modality suggest that not all timing tasks are equivalent.
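Decomposition of timing variance into central timekeeper and peripheral implementation components is commonly performed with the Wing-Kristofferson two-level model; assuming that style of decomposition (the abstract does not name the exact method), a minimal sketch on simulated inter-tap intervals is:

```python
# Sketch: Wing-Kristofferson decomposition of inter-tap-interval variance into
# central timekeeper variance and motor (implementation) variance.
# Using this particular model here is an assumption; the tap data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n = 200
clock = rng.normal(0.329, 0.015, n)      # central timekeeper intervals (ISI 329 ms)
motor = rng.normal(0.0, 0.008, n + 1)    # peripheral motor delays
intervals = clock + motor[1:] - motor[:-1]

d = intervals - intervals.mean()
gamma0 = np.mean(d * d)                  # variance of inter-tap intervals
gamma1 = np.mean(d[:-1] * d[1:])         # lag-1 autocovariance (expected negative)

motor_var = max(-gamma1, 0.0)            # sigma_M^2 = -gamma(1)
clock_var = gamma0 - 2 * motor_var       # sigma_C^2 = gamma(0) + 2*gamma(1)
print(f"clock SD: {np.sqrt(clock_var) * 1000:.1f} ms, "
      f"motor SD: {np.sqrt(motor_var) * 1000:.1f} ms")
```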