21 results for Electronic data interchange

in Greenwich Academic Literature Archive - UK


Relevance:

80.00%

Abstract:

Collaborative approaches in leadership and management are increasingly acknowledged to play a key role in successful institutions in the learning and skills sector (LSS) (Ofsted, 2004). Such approaches may be important in bridging the potential 'distance' (psychological, cultural, interactional and geographical) (Collinson, 2005) that may exist between 'leaders' and 'followers', fostering more democratic communal solidarity. This paper reports on a 2006-07 research project funded by the Centre for Excellence in Leadership (CEL) that aimed to collect and analyse data on 'collaborative leadership' (CL) in the learning and skills sector. The project investigated collaborative leadership and its potential for benefiting staff through trust and knowledge-sharing in communities of practice (CoPs), and forms part of longer-term educational research investigating leadership in a collaborative inquiry process (Jameson et al., 2006). The research examined the potential for CL to benefit institutions, analysing respondents' understanding of and resistance to collaborative practices. Quantitative and qualitative data from senior managers and lecturers were analysed electronically using SPSS and Tropes Zoom. The project aimed to recommend systems and practices for more inclusive, diverse leadership (Lumby et al., 2005).

Collaborative leadership has increasingly gained international prominence as emphasis has shifted towards team leadership, beyond zero-sum 'leadership'/'followership' polarities and into more mature conceptions of shared leadership spaces within which synergistic leadership can be mediated. The relevance of collaboration within the LSS has been highlighted by a spate of recent government-driven policy developments in FE. The promotion of CL addresses concerns about the apparent 'remoteness' of some senior managers and the 'neo-management' control of professionals, which can increase 'distance' between leaders and 'followers' and may de-professionalise staff in an already disempowered sector. Positive benefit from 'collaborative advantage' tends to be assumed in idealistic interpretations of CL, but potential 'collaborative inertia' may be problematic in a sector characterised by rapid top-down policy changes and continuous external audit and surveillance. Constant pressure for achievement against goals leaves little time for democratic group negotiations, despite the desires of leaders to create a more collaborative ethos. Yet prior models of intentional communities of practice potentially offer promise for CL practice to improve group performance despite multiple constraints. The CAMEL CoP model (JISC infoNet, 2006) was linked to the project, providing one practical way of implementing CL within situated professional networks.

The project found that a good understanding of CL was demonstrated by most respondents, who thought it could enable staff to share power and work in partnership, building trust and conjoining skills, abilities and experience to achieve common goals for the good of the sector. However, although most respondents agreed with the concept and ideals of CL, many thought it currently an idealistically democratic, unachievable pipe dream in the LSS. Many respondents expressed concerns about the 'audit culture' and authoritarian management structures in FE. While there was a strong desire to see greater levels of implementation of CL, and 'collaborative advantage' from the 'knowledge-sharing benefit potential' of team leadership, respondents also warned of the pitfalls of 'collaborative inertia'. A 'distance' was reported between senior leadership views and those of staff lower down the hierarchy regarding aspects of leadership performance in the sector. Finally, the project found that more research is needed to investigate CL and to develop innovative methods of practical implementation within autonomous communities of professional practice.

Relevance:

40.00%

Abstract:

Abstract not available

Relevance:

30.00%

Abstract:

This paper presents simulated computational fluid dynamics (CFD) results for comparison against experimental data. The performance of four turbulence models has been assessed for electronic application areas, considering both fluid flow and heat transfer phenomena. CFD is fast becoming a powerful and almost essential tool for design, development and optimization in engineering problems. However, turbulence models remain a key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the performance of the turbulence model employed together with the wall functions implemented. To resolve the abrupt changes in the turbulent energy and other parameters near the wall, a particularly fine mesh is necessary, which unfortunately increases the computer storage capacity requirements. The objective of turbulence modelling is to provide computational procedures of sufficient accuracy and generality for engineers to predict the Reynolds stresses and the scalar transport terms.
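
To make the near-wall meshing requirement concrete, a common pre-meshing estimate computes the first-cell height needed for a target y+ from a flat-plate skin-friction correlation. The sketch below is a standard back-of-envelope check, not a method from the paper, and the fluid properties in the example are illustrative.

```python
import math

def first_cell_height(u_inf, length, rho, mu, y_plus_target=1.0):
    """Estimate the wall-normal height of the first mesh cell for a
    target y+ value, using a flat-plate skin-friction correlation.

    A generic pre-meshing estimate, not the paper's method; the
    Schlichting correlation below assumes a smooth flat plate and
    turbulent flow.
    """
    re_x = rho * u_inf * length / mu                # Reynolds number
    cf = (2.0 * math.log10(re_x) - 0.65) ** -2.3    # skin-friction coefficient
    tau_w = 0.5 * cf * rho * u_inf ** 2             # wall shear stress
    u_tau = math.sqrt(tau_w / rho)                  # friction velocity
    return y_plus_target * mu / (rho * u_tau)       # first-cell height [m]

# Example: air at 2 m/s over a 0.1 m package (illustrative values)
print(first_cell_height(u_inf=2.0, length=0.1, rho=1.2, mu=1.8e-5))
```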

Relevance:

30.00%

Abstract:

The electronics industry is developing rapidly, and with it the increasingly complex problem of microelectronic equipment cooling. It has now become necessary for thermal design engineers to consider the problem of equipment cooling at some level. The use of Computational Fluid Dynamics (CFD) for such investigations is fast becoming a powerful and almost essential tool for the design, development and optimisation of engineering applications. However, turbulence models remain a key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt fluctuations experienced by the turbulent energy and other parameters at near-wall regions and shear layers, a particularly fine computational mesh is necessary, which inevitably increases the computer storage and run-time requirements. This paper will discuss results from an investigation into the accuracy of currently used turbulence models. A newly formulated transitional hybrid turbulence model will also be introduced, with comparisons against experimental data.

Relevance:

30.00%

Abstract:

This paper describes modeling technology and its use in providing data governing the assembly and subsequent reliability of electronic chip components on printed circuit boards (PCBs). Products such as mobile phones, camcorders and intelligent displays are changing at a tremendous rate, with newer technologies being applied to satisfy the demand for smaller products with increased functionality. At ever-decreasing dimensions and ever-increasing numbers of input/output connections, the design of these components, in terms of the dimensions and materials used, plays a key role in determining the reliability of the final assembly. Multiphysics modeling techniques are being adopted to predict a range of interacting physics-based phenomena associated with the manufacturing process, for example heat transfer, solidification, Marangoni fluid flow, void movement and thermal stress. The modeling techniques used are based on finite volume methods that are conservative and take advantage of being able to represent the physical domain using an unstructured mesh. These techniques are also used to provide data on thermally induced fatigue, which are then mapped into product lifetime predictions.
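
As an illustration of how thermally induced fatigue data can be mapped into lifetime predictions, the sketch below uses the classical Coffin-Manson low-cycle fatigue relation. The material constants are generic placeholders, not the calibrated values used in the work described.

```python
def cycles_to_failure(delta_eps_p, eps_f=0.325, c=-0.5):
    """Coffin-Manson low-cycle fatigue relation:
        delta_eps_p / 2 = eps_f * (2 * N_f) ** c
    solved for N_f. eps_f (fatigue ductility coefficient) and c
    (fatigue ductility exponent) are material constants; the defaults
    here are illustrative placeholders, not values from the paper.
    """
    return 0.5 * (delta_eps_p / (2.0 * eps_f)) ** (1.0 / c)

# Example: plastic strain range per thermal cycle taken from a
# finite volume simulation (illustrative value)
print(cycles_to_failure(delta_eps_p=0.01))
```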

Relevance:

30.00%

Abstract:

The electronics industry and the problems associated with the cooling of microelectronic equipment are developing rapidly. Thermal engineers now find it necessary to consider the complex area of equipment cooling at some level. This continually growing industry also faces heightened pressure from consumers to provide electronic product miniaturization, which in itself increases the demand for accurate thermal management predictions to assure product reliability. Computational fluid dynamics (CFD) is considered a powerful and almost essential tool for the design, development and optimization of engineering applications. CFD is now widely used within the electronics packaging design community to thermally characterize the performance of both the electronic component and system environment. This paper discusses CFD results for a large variety of investigated turbulence models. Comparison against experimental data illustrates the predictive accuracy of currently used models and highlights the growing demand for greater mathematical modelling accuracy with regards to thermal characterization. Also a newly formulated low Reynolds number (i.e. transitional) turbulence model is proposed with emphasis on hybrid techniques.

Relevance:

30.00%

Abstract:

This paper will discuss Computational Fluid Dynamics (CFD) results from an investigation into the accuracy of several turbulence models in predicting air cooling for electronic packages and systems. New transitional turbulence models will also be proposed, with emphasis on hybrid techniques that use the k-ε model at an appropriate distance away from the wall and suitable models, with wall functions, in near-wall regions. A major proportion of the heat emitted from electronic packages can be extracted by air cooling, and the flow of air throughout an electronic system and the heat extracted are highly dependent on the nature of the turbulence present in the flow. The use of CFD for such investigations is fast becoming a powerful and almost essential tool for the design, development and optimization of engineering applications. However, turbulence models remain a key issue when tackling such flow phenomena. The reliability of CFD analysis depends heavily on the turbulence model employed together with the wall functions implemented. In order to resolve the abrupt fluctuations experienced by the turbulent energy and other parameters at near-wall regions and shear layers, a particularly fine computational mesh is necessary, which inevitably increases the computer storage and run-time requirements. The PHYSICA Finite Volume code was used for this investigation. With the exception of the k-ε and k-ω models, which are available as standard within PHYSICA, all other turbulence models mentioned were implemented via the source code by the authors. The LVEL, LVEL CAP, Wolfshtein, k-ε, k-ω, SST and kε/kl models are described and compared with experimental data.
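
A minimal sketch of the two-layer idea behind such hybrid models is shown below: a Wolfshtein-style one-equation length-scale expression supplies the eddy viscosity near the wall, and the standard k-ε relation takes over further out. The switching criterion and constants (a wall-distance Reynolds number threshold of 200, A_mu = 70) are commonly quoted two-layer values and are assumptions here, not necessarily those used in PHYSICA or by the authors.

```python
import math

def eddy_viscosity_hybrid(k, eps, y, rho, mu, c_mu=0.09, re_y_switch=200.0):
    """Two-layer eddy-viscosity sketch: a Wolfshtein-style one-equation
    length-scale model near the wall, standard k-epsilon away from it.
    The switch uses the wall-distance Reynolds number Re_y; the
    threshold of 200 is a commonly quoted value, not necessarily the
    one used in the paper.
    """
    re_y = rho * math.sqrt(k) * y / mu          # wall-distance Reynolds number
    if re_y < re_y_switch:
        # Near-wall layer: length scale damped towards the wall
        c_l = 0.42 * c_mu ** -0.75              # kappa * C_mu^(-3/4)
        l_mu = c_l * y * (1.0 - math.exp(-re_y / 70.0))
        return rho * c_mu * math.sqrt(k) * l_mu
    # Outer layer: standard k-epsilon eddy-viscosity relation
    return rho * c_mu * k ** 2 / eps
```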

Relevance:

30.00%

Abstract:

Heat is extracted from an electronic package by convection, conduction and/or radiation. The amount of heat extracted by forced convection using air is highly dependent on the characteristics of the airflow around the package, including its velocity and direction. Turbulence in the air is also important and must be modeled accurately in thermal design codes that use computational fluid dynamics (CFD). During air cooling the flow can be classified as laminar, transitional or turbulent. In electronic systems, the flow around the packages is usually in the transition region, which lies between laminar and turbulent flow; this requires a low-Reynolds-number numerical model to fully capture the impact of turbulence on the fluid flow calculations. This paper provides comparisons between a number of turbulence models and experimental data. These models include the LVEL (local velocity and distance from the nearest wall), Wolfshtein, Norris and Reynolds, k-ε, k-ω, shear-stress transport (SST) and kε/kl models. Results show that, in terms of the fluid flow calculations, most of the models capture the difficult wake recirculation region behind the package reasonably well, although for packages whose heights cause a high degree of recirculation behind the package the SST model appears to struggle. The paper also demonstrates the sensitivity of the models to changes in mesh density; this study is aimed specifically at thermal design engineers, as mesh-independent simulations are rarely conducted in an industrial environment.
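
Given the closing point that mesh-independence studies are rarely conducted in industry, one cheap check worth noting is Roache's grid convergence index (GCI), which estimates the discretisation error of a fine-mesh result from solutions on two meshes. The sketch below assumes a refinement ratio r and observed order p; it is a generic verification tool, not part of the study itself.

```python
def grid_convergence_index(f_fine, f_coarse, r=2.0, p=2.0, fs=1.25):
    """Roache's grid convergence index: an estimate of the relative
    discretisation error of the fine-mesh solution, from results on
    two meshes with refinement ratio r and observed order p.
    fs is a safety factor; 1.25 is commonly recommended, and this
    two-mesh form is only a rough check.
    """
    e = (f_coarse - f_fine) / f_fine   # relative change between meshes
    return fs * abs(e) / (r ** p - 1.0)

# Example: package temperature rise (K) predicted on two meshes
# (illustrative values); result is the estimated relative error
print(grid_convergence_index(f_fine=42.1, f_coarse=44.0))
```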

Relevance:

30.00%

Abstract:

This paper presents a generic framework that can be used to describe study plans using meta-data. The context of this research and the associated technologies and standards are presented. The approach adopted here has been developed within the mENU project, which aims to provide a model for a European Networked University. The methodology for the design of the generic framework is discussed and the main design requirements are presented. The approach adopted was based on a set of templates containing the meta-data required for the description of programs of study, consisting of generic building elements annotated appropriately. The process followed to develop the templates is presented, together with a set of evaluation criteria to test the suitability of the approach. The template structure is presented and example templates are shown. A first evaluation of the approach has shown that the proposed framework can provide a flexible and competent means for the generic description of study plans for the purposes of a networked university.
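
By way of illustration only, a template of this kind might be represented as a set of annotated generic building elements like the following. The field names and values are hypothetical and do not reproduce the actual mENU template schema.

```python
# Hypothetical sketch of a study-plan meta-data template built from
# annotated generic building elements; field names are illustrative
# only and are not the actual mENU schema.
study_plan_template = {
    "programme": {
        "title": "BSc Computing",
        "provider": "Member university",   # networked-university node
        "language": "en",
        "ects_total": 180,                 # overall credit volume
    },
    "building_blocks": [
        {
            "type": "module",              # generic element type
            "title": "Introduction to Programming",
            "ects": 15,
            "prerequisites": [],
            "annotations": {"level": "1", "delivery": "online"},
        },
    ],
}
```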

Relevance:

30.00%

Abstract:

Evacuation analysis of passenger and commercial shipping can be undertaken using computer-based simulation tools such as maritimeEXODUS. These tools emulate human shipboard behaviour during emergency scenarios; however, they are largely based around the behaviour of civilian passengers and the fixtures and fittings of merchant vessels. If these tools and procedures are to be applied to naval vessels, there is a clear requirement to understand the behaviour of well-trained naval personnel interacting with the fixtures and fittings that are exclusive to warships. Human factors trials using Royal Navy training facilities were recently undertaken to collect data to improve our understanding of the performance of naval personnel in warship environments. The trials were designed and conducted by staff from the Fire Safety Engineering Group (FSEG) of the University of Greenwich on behalf of the Sea Technology Group (STG), Defence Procurement Agency. The trials involved a selection of RN volunteers with sea-going experience in warships operating and traversing structural components under different angles of heel. This paper describes the trials and some of the collected data.

Relevance:

30.00%

Abstract:

The passenger response time distributions adopted by the International Maritime Organisation (IMO) in their assessment of the assembly time for passenger ships involve two key assumptions. The first is that the response time distribution takes the form of a uniform random distribution, and the second concerns the actual response times. These two assumptions are core to the validity of the IMO analysis but are not based on real data, being the recommendations of an IMO committee. In this paper, response time data collected from assembly trials conducted at sea on a real passenger vessel, using actual passengers, are presented and discussed. Unlike the IMO-specified response time distributions, the data collected from these trials display a log-normal distribution, similar to that found in land-based environments. Based on these data, response time distributions for use in the IMO assembly analysis for the day and night scenarios are suggested.
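
The contrast between the two assumptions can be sketched directly: sampling response times from a uniform window versus from a log-normal distribution. The bounds and log-normal parameters below are illustrative placeholders, not the IMO figures or the values fitted from the trials.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # simulated passengers

# IMO-style assumption: uniform response times over a fixed window
# (the bounds here are placeholders, not the actual IMO figures).
uniform_times = rng.uniform(0.0, 300.0, size=n)

# Trial-informed assumption: log-normal response times, as observed
# in the sea trials (mu/sigma below are illustrative, not fitted).
lognormal_times = rng.lognormal(mean=4.0, sigma=0.8, size=n)

# The log-normal gives a long right tail of slow responders that the
# uniform assumption cannot represent.
print(uniform_times.mean(), np.median(lognormal_times), lognormal_times.max())
```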

Relevance:

30.00%

Abstract:

The Logit-Logistic (LL), Johnson's SB and the Beta (GBD) are flexible four-parameter probability distribution models in terms of the skewness-kurtosis region covered, and each has been used for modeling tree diameter distributions in forest stands. This article compares bivariate forms of these models in terms of their adequacy in representing empirical diameter-height distributions from 102 sample plots. Four bivariate models are compared: SBB, the natural, well-known and much-used bivariate generalization of SB, and the bivariate distributions with LL, SB and Beta as marginals, constructed using Plackett's method (LL-2P, etc.). All models are fitted using maximum likelihood, and their goodness-of-fit is compared using minus log-likelihood (equivalent to Akaike's Information Criterion, the AIC). The performance ranking in this case study was SBB, LL-2P, GBD-2P and SB-2P.
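
For reference, Plackett's method couples any two fixed marginal CDFs through a single association parameter ψ; a minimal sketch of the resulting bivariate CDF is given below, with the marginals left as inputs (e.g. fitted LL or SB diameter and height CDFs evaluated at a given point).

```python
import math

def plackett_cdf(u, v, psi):
    """Plackett bivariate CDF H(u, v) for marginal CDF values u, v in
    [0, 1] and association parameter psi > 0 (psi = 1 gives
    independence). Any pair of marginals, e.g. Logit-Logistic or SB
    diameter and height CDFs, can be plugged in as u and v.
    """
    if abs(psi - 1.0) < 1e-12:
        return u * v  # independence limit
    s = 1.0 + (psi - 1.0) * (u + v)
    return (s - math.sqrt(s * s - 4.0 * psi * (psi - 1.0) * u * v)) / (
        2.0 * (psi - 1.0)
    )

# Example: joint probability that diameter and height both fall below
# their marginal medians, under moderate positive association
print(plackett_cdf(0.5, 0.5, psi=4.0))  # > 0.25, i.e. above independence
```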

Relevance:

30.00%

Abstract:

This paper describes the methodologies employed in the collection and storage of first-hand accounts of evacuation experiences derived from face-to-face interviews with evacuees from the World Trade Center (WTC) Twin Towers complex on 11 September 2001. In particular, the paper describes the development of the High-rise Evacuation Evaluation Database (HEED). This is a flexible qualitative research tool which contains the full transcribed interview accounts and the coded evacuee experiences extracted from those transcripts. The data and information captured and stored in the HEED database are not only unique, but provide a means to address current and emerging issues relating to human factors associated with the evacuation of high-rise buildings.
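
As a purely hypothetical illustration of the kind of structure such a coded-interview database might use, the sketch below pairs a transcript with coded experience fragments. The field names are invented for illustration and are not the actual HEED schema.

```python
# Hypothetical sketch of a coded-interview record; the field names
# are invented for illustration and are not the actual HEED schema.
heed_record = {
    "interview_id": "WTC-0001",
    "tower": 1,
    "floor_at_event": 87,
    "transcript": "Full transcribed interview text ...",
    "coded_experiences": [
        {"code": "cue_perception", "segment": "I heard a loud rumble ..."},
        {"code": "stair_conditions", "segment": "The stairwell was crowded ..."},
    ],
}
```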

Relevance:

30.00%

Abstract:

This study investigates the use of computer-modelled versus experimentally determined fire hazard data for assessing survivability within buildings, using evacuation models incorporating Fractional Effective Dose (FED) models. The objective is to establish a link between effluent toxicity, measured using a variety of small- and large-scale tests, and building evacuation. For the scenarios under consideration, fire simulation is typically used to determine the time at which non-survivable conditions develop within the enclosure: for example, when the smoke or toxic effluent layer falls below a critical height deemed detrimental to evacuation, or when the radiative fluxes reach a critical value leading to the onset of flashover. The evacuation calculation would then be used to determine whether people within the structure could evacuate before these critical conditions develop.
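
A minimal sketch of the dose-accumulation idea behind FED models is given below, using the generic Ct-product form in which concentration is integrated over time and compared with a critical dose. The single-species form and the critical value in the example are illustrative placeholders; the FED formulation used in the study may include additional species and interaction terms.

```python
def fractional_effective_dose(concentrations, dt, ct_crit):
    """Generic FED accumulation: the dose received (concentration
    integrated over time) expressed as a fraction of the critical dose
    ct_crit at which incapacitation is assumed. FED >= 1.0 marks
    non-survivable exposure. This is the generic Ct-product form; the
    constants and species terms in the study's FED model may differ.
    """
    fed = 0.0
    history = []
    for c in concentrations:      # one concentration sample per time step
        fed += c * dt / ct_crit   # incremental fractional dose
        history.append(fed)
    return history

# Example: rising CO concentration (ppm) sampled every 5 s, with an
# illustrative critical dose in ppm*min (placeholder value)
co_ppm = [200, 400, 800, 1600, 3200]
print(fractional_effective_dose(co_ppm, dt=5 / 60, ct_crit=135_000))
```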