29 results for Election Counting and Reporting Software
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of `state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. `component makers') and other manufacturing companies (i.e. `component buyers'). An investigation of the manual procedures which are used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff, which suggests the unsuitability of many manufacturing techniques which have been applied to general engineering companies. From the literature it was discovered that computer-assisted systems are required which are primarily `information-based' rather than `decision-based', whilst the availability of low-cost computers and `packaged software' has enabled foundries to `get their feet wet' without the financial penalties which characterized many of the early attempts at computer-assistance (i.e. pre-1980). Moreover, no evidence of a single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work was oriented specifically at the functions of production planning and scheduling and introduces the concept of `manual interaction' for effective scheduling. The techniques developed were designed to use the information which is readily available in foundries and were found to be practically successful following their implementation in a wide variety of foundries.
The limitations of the techniques developed are subsequently discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.
Abstract:
This thesis describes an investigation into methods for controlling the mode distribution in multimode optical fibres. The major contributions presented in this thesis are summarised below. Emerging standards for Gigabit Ethernet transmission over multimode optical fibre have led to a resurgence of interest in the precise control, and specification, of modal launch conditions. In particular, commercial LED and OTDR test equipment does not, in general, comply with these standards. There is therefore a need for mode control devices, which can ensure compliance with the standards. A novel device consisting of a point-load mode-scrambler in tandem with a mode-filter is described in this thesis. The device, which has been patented, may be tuned to achieve a wide range of mode distributions and has been implemented in a ruggedised package for field use. Various other techniques for mode control have been described in this work, including the use of Long Period Gratings and air-gap mode-filters. Some of the methods have been applied to other applications, such as speckle suppression and sensor technology. A novel, self-referencing sensor comprising two modal groups in the Mode Power Distribution has been designed and tested. The feasibility of a two-channel Mode Group Diversity Multiplexed system has been demonstrated over 985m. A test apparatus for measuring mode distribution has been designed and constructed. The apparatus consists of a purpose-built video microscope, and comprehensive control and analysis software written in Visual Basic. The system may be fitted with a silicon camera or an InGaAs camera, for measurement in the 850nm and 1300nm transmission windows respectively. A limitation of the measurement method, when applied to well-filled fibres, has been identified and an improvement to the method has been proposed, based on modelled Laguerre Gauss field solutions.
Abstract:
This professional doctoral research reports on the relationship between Enterprise Systems, specifically Enterprise Resource Planning Systems, and enterprise structures. It offers insights and guidance to practitioners on factors for consideration in the implementation of ERP systems in organisations operating in modern enterprise structures. It reports on reflective ethnographic action research conducted in a number of companies from a diverse range of industries covering supply chains for both goods and services. The primary contribution is in highlighting areas in which clients, practitioners and ERP software vendors can bring a greater awareness of internet-era enterprise structures and business requirements into the ERP arena. The concepts and insights were explored in a focus group comprising practitioners from the enterprise systems implementation and consulting fraternity, which revealed limitations and constraints in the implementation of enterprise systems. However, it also showed that current systems do not yet fully support, in use, the modern-era enterprise structures required by practitioners and decision makers.
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations of the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts the focus away from implementation details to the whole view of the system and its runtime changes, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
Abstract:
The electoral challenge of the far right is an enduringly problematic feature of contemporary French politics. In the first rounds of the 2012 presidential and parliamentary elections, the Front National (FN) under new leader Marine Le Pen attracted a combined total of ten million votes, bringing its ultra-nationalist policies to the centre of national political debate. This article examines the FN's impact on these elections and its implications for French politics. Drawing on official FN programmes, detailed election results and a range of opinion polling data, it assesses the strength of support for Le Pen and her party and seeks to explain their electoral appeal. In particular, it subjects to analysis the claim that the new leader has ‘de-demonised’ the FN, transforming it from perennial outsider to normal participant in mainstream French politics; and it reflects on the strategic dilemma posed for the centre-right by this newly invigorated far-right challenge.
Abstract:
Research Question/Issue: In this paper, we empirically investigate whether US listed commercial banks with effective corporate governance structures engage in higher levels of conservative financial accounting and reporting. Research Findings/Insights: Using both market- and accrual-based measures of conservatism and both composite and disaggregated governance indices, we document convincing evidence that well-governed banks engage in significantly higher levels of conditional conservatism in their financial reporting practices. For example, we find that banks with effective governance structures, particularly those with effective board and audit governance structures, recognize loan loss provisions that are larger relative to changes in nonperforming loans compared to their counterparts with ineffective governance structures. Theoretical/Academic Implications: We contribute to the extant literature on the relationship between corporate governance and quality of accounting information by providing evidence that banks with effective governance structures practice higher levels of accounting conservatism. Practitioner/Policy Implications: The findings of this study would be useful to US bank regulators/supervisors in improving the existing regulatory framework by focusing on accounting conservatism as a complement to corporate governance in mitigating the opaqueness and intense information asymmetry that plague banks.
Abstract:
Purpose – The purpose of this paper is to investigate the “last mile” delivery link between a hub and spoke distribution system and its customers. The proportion of retail, as opposed to non-retail (trade), customers using this type of distribution system has been growing in the UK. The paper shows the applicability of simulation to demonstrate changes in overall delivery policy to these customers. Design/methodology/approach – A case-based research method was chosen with the aim of providing an exemplar of practice and testing the proposition that simulation can be used as a tool to investigate changes in delivery policy. Findings – The results indicate the potential improvement in delivery performance, specifically in meeting timed delivery performance, that could be made by having separate retail and non-retail delivery runs from the spoke terminal to the customer. Research limitations/implications – The simulation study does not attempt to generate a vehicle routing schedule but demonstrates the effects of a change on delivery performance when comparing delivery policies. Practical implications – Scheduling and spreadsheet software are widely used and provide useful assistance in the design of delivery runs and the allocation of staff to those runs. This paper demonstrates to managers the usefulness of investigating the efficacy of current design rules and presents simulation as a suitable tool for this analysis. Originality/value – A simulation model is used in a novel application to test a change in delivery policy in response to a changing delivery profile of increased retail deliveries.
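The kind of policy comparison this abstract describes can be sketched as a small Monte Carlo simulation: one interleaved retail/trade run versus dedicated runs from the same spoke, scored on the fraction of drops made within a timed window. All figures below (service times, deadline, run sizes) are invented for illustration and are not taken from the case study.

```python
import random

random.seed(1)

def simulate(run_plan, trials=2000):
    """Estimate the fraction of timed deliveries met for a list of runs.
    Each run is a list of customer types; per-drop service times (minutes)
    and the timed-delivery deadline are illustrative assumptions."""
    service = {"retail": (15, 4), "trade": (8, 3)}  # (mean, sd) minutes per drop
    deadline = 120.0                                # timed window per run, minutes
    met = total = 0
    for _ in range(trials):
        for run in run_plan:
            clock = 0.0
            for cust in run:
                mu, sd = service[cust]
                clock += max(random.gauss(mu, sd), 1.0)  # no negative durations
                total += 1
                if clock <= deadline:
                    met += 1
    return met / total

mixed = [["retail", "trade"] * 5]        # one interleaved run of 10 drops
split = [["retail"] * 5, ["trade"] * 5]  # dedicated retail and trade runs
print(f"mixed: {simulate(mixed):.2f}  split: {simulate(split):.2f}")
```

Under these assumed numbers the dedicated runs meet more timed deliveries, mirroring the paper's finding; the sketch deliberately ignores vehicle routing, as the study itself did.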
Abstract:
Strategic sourcing has increased in importance in recent years, and now plays an important role in companies’ planning. The current volatility in supply markets means companies face multiple challenges involving lock-in situations, supplier bankruptcies or supply security issues. In addition, their exposure can increase due to natural disasters, as witnessed recently in the form of bird flu, volcanic ash and tsunamis. Therefore, the primary focus of this study is risk management in the context of strategic sourcing. The study presents a literature review on sourcing covering the 15 years from 1998–2012, and considers 131 academic articles. The literature describes strategic sourcing as a strategic, holistic process in managing supplier relationships, with a long-term focus on adding value to the company and realising competitive advantage. Few studies have examined the real risk impact and the status of risk management in strategic sourcing, and evaluation across countries and industries has been limited, with the construction sector particularly under-researched. The methodology is founded on a qualitative study of twenty cases from the construction sector and electronics manufacturing industries across Germany and the United Kingdom. While considering risk management in the context of strategic sourcing, the thesis takes into account six dimensions that cover trends in strategic sourcing, theoretical and practical sourcing models, risk management, supply and demand management, critical success factors and strategic supplier evaluation. The study contributes in several ways. First, recent trends are traced and future needs identified across the research dimensions of countries, industries and companies. Second, it evaluates critical success factors in contemporary strategic sourcing. Third, it explores the application of theoretical and practical sourcing models in terms of effectiveness and sustainability.
Fourth, based on the case study findings, a risk-oriented strategic sourcing framework and a model for strategic sourcing are developed. These are based on the validation of contemporary requirements and a critical evaluation of the existing situation. The framework consolidates the empirical findings and leads to a structured process for managing risk in strategic sourcing, covering areas such as trends, corporate and sourcing strategy, critical success factors, strategic supplier selection criteria, risk assessment, reporting and strategy alignment. The proposed model highlights the essential dimensions of strategic sourcing and points to a new definition of strategic sourcing supported by this empirical study.
Abstract:
Background. Previous research has found links between being a victim of bullying and reporting more unhealthy eating behaviours and cognitions, particularly in girls. However, little is known about the factors that might mediate these relationships. Aim. The present study compared the relationships between bullying, emotional adjustment, restrained eating, and body dissatisfaction in adolescent boys and girls. Sample/method. Self-report data were collected from a sample of 11- to 14-year-olds (N = 376) on experiences of bullying, emotional symptoms, and unhealthy eating and shape-related attitudes and behaviours. Results. Bullying, emotional symptoms, restrained eating, and body dissatisfaction were all correlated. Emotional symptoms were found to significantly mediate the relationship between verbal bullying and body dissatisfaction in girls but not in boys. Conclusions. Findings suggest that the experience of being verbally bullied places adolescent girls at risk of developing emotional problems which can then lead to body dissatisfaction. Longitudinal research is necessary to disentangle these pathways in more detail to facilitate the development of informed interventions to support children who are being bullied.
Abstract:
Background: Proton Magnetic Resonance Spectroscopy (H-MRS) is a non-invasive imaging technique that enables quantification of neurochemistry in vivo and thereby facilitates investigation of the biochemical underpinnings of human cognitive variability. Studies in the field of cognitive spectroscopy have commonly focused on relationships between measures of N-acetyl aspartate (NAA), a surrogate marker of neuronal health and function, and broad measures of cognitive performance, such as IQ. Methodology/Principal Findings: In this study, we used H-MRS to interrogate single-voxels in occipitoparietal and frontal cortex, in parallel with assessments of psychometric intelligence, in a sample of 40 healthy adult participants. We found correlations between NAA and IQ that were within the range reported in previous studies. However, the magnitude of these effects was significantly modulated by the stringency of data screening and the extent to which outlying values contributed to statistical analyses. Conclusions/Significance: H-MRS offers a sensitive tool for assessing neurochemistry non-invasively, yet the relationships between brain metabolites and broad aspects of human behavior such as IQ are subtle. We highlight the need to develop an increasingly rigorous analytical and interpretive framework for collecting and reporting data obtained from cognitive spectroscopy studies of this kind. © 2014 Patel, Blyth, Griffiths, Kelly and Talcott.
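The abstract's central methodological point, that the stringency of data screening and the treatment of outliers can modulate the apparent metabolite-IQ correlation, can be illustrated with a toy example. The synthetic data and the simple 2-SD screening rule below are invented for demonstration; they are not the study's data or its actual screening procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a weak association between an NAA-like measure and an IQ-like
# score, then append two outlying (but trend-concordant) observations.
n = 38
naa = rng.normal(8.0, 0.5, n)
iq = 100 + 6 * (naa - 8.0) + rng.normal(0, 8, n)
naa = np.append(naa, [10.5, 5.5])  # outlying metabolite values
iq = np.append(iq, [140, 70])      # paired outlying scores

def pearson_r(x, y):
    """Pearson correlation coefficient between two samples."""
    return float(np.corrcoef(x, y)[0, 1])

# Screening rule: keep points within 2 SD of the mean on both measures
keep = (np.abs(naa - naa.mean()) < 2 * naa.std()) & \
       (np.abs(iq - iq.mean()) < 2 * iq.std())
print(f"r (all points):      {pearson_r(naa, iq):.2f}")
print(f"r (after screening): {pearson_r(naa[keep], iq[keep]):.2f}")
```

Because the outliers lie along the trend, including them inflates the estimated correlation; screening them out yields a noticeably weaker effect, which is the sensitivity the authors caution about.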
Abstract:
Changes in the international economic scenario in recent years have made it necessary for both industrial and service firms to reformulate their strategies, with a strong focus on the resources required for successful implementation. In this scenario, information and communication technology (ICT) has a potentially vital role to play both as a key resource for re-engineering business processes within a framework of direct connection between suppliers and customers, and as a source of cost optimisation. There have also been innovations in the logistics and freight transport industry in relation to ICT diffusion. The implementation of such systems by third party logistics providers (3PL) allows the real-time exchange of information between supply chain partners, thereby improving planning capability and customer service. The logistics and freight transport industry, however, is lagging somewhat behind other sectors in ICT diffusion. This situation is to be attributed to a series of both industry-specific and other factors, such as: (a) traditional resistance to change on the part of transport and logistics service providers; (b) the small size of firms that places considerable constraints upon investment in ICT; (c) the relative shortage of user-friendly applications; (d) the diffusion of internal standards on the part of the main providers in the industry whose aim is to protect company information, preventing its dissemination among customers and suppliers; (e) the insufficient degree of professional skills for using such technologies on the part of staff in such firms. The latter point is of critical importance insofar as the adoption of ICT is making it increasingly necessary both to develop new technical skills to use different hardware and new software tools, and to be able to plan processes of communication so as to allow the optimal use of ICT.
The aim of this paper is to assess the impact of ICT on the transport and logistics industry and to highlight how the use of such new technologies is affecting providers' training needs. The first part provides a conceptual framework of the impact of ICT on the transport and logistics industry. In the second part, the state of ICT dissemination in the Italian and Irish third party logistics industry is outlined. In the third part, the impact of ICT on the training needs of transport and logistics service providers - based on case studies in both countries - is discussed. The implications of the foregoing for the development of appropriate training policies are considered. For the covering abstract see ITRD E126595.
Abstract:
Many software engineers have found it difficult to understand, incorporate and use different formal models consistently in the process of software development, especially for large and complex software systems. This is mainly due to the complex mathematical nature of formal methods and the lack of tool support. It is highly desirable to have software models and their related software artefacts systematically connected and used collaboratively, rather than in isolation. The success of the Semantic Web, as the next generation of Web technology, can have a profound impact on the environment for formal software development. It allows both software engineers and machines to understand the content of formal models and supports more effective software design in terms of understanding, sharing and reusing in a distributed manner. To realise the full potential of the Semantic Web in formal software development, effectively creating proper semantic metadata for formal software models and their related software artefacts is crucial. This paper proposes a framework that allows users to interconnect knowledge about formal software models and other related documents using semantic technology. We first propose a methodology, with tool support, to automatically derive ontological metadata from formal software models and semantically describe them. We then develop a Semantic Web environment for representing and sharing formal Z/OZ models. Finally, a method with a prototype tool is presented to enhance semantic querying of software models and other artefacts. © 2014.
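As a toy illustration of the metadata-derivation step this abstract describes, the sketch below maps a parsed Z schema (the classic BirthdayBook example) to RDF-style triples and serializes them in N-Triples syntax. The namespace, predicate names, and dictionary representation of the schema are all invented for illustration; they are not the paper's actual vocabulary or tooling.

```python
FM = "http://example.org/formal-model#"  # hypothetical vocabulary namespace

def derive_metadata(schema):
    """Turn a parsed-schema dict into (subject, predicate, object) triples
    describing the schema, its state variables, and its operations."""
    subj = FM + schema["name"]
    triples = [(subj, FM + "type", FM + "ZSchema")]
    for var in schema.get("state", []):
        triples.append((subj, FM + "declaresVariable", FM + var))
    for op in schema.get("operations", []):
        triples.append((subj, FM + "hasOperation", FM + op))
    return triples

def to_ntriples(triples):
    """Serialize triples in N-Triples syntax (all terms here are IRIs)."""
    return "\n".join(f"<{s}> <{p}> <{o}> ." for s, p, o in triples)

# Toy stand-in for a parsed Z schema, not output from the paper's tool
birthday_book = {
    "name": "BirthdayBook",
    "state": ["known", "birthday"],
    "operations": ["AddBirthday", "FindBirthday"],
}
print(to_ntriples(derive_metadata(birthday_book)))
```

Once formal models are exposed as triples like these, standard Semantic Web machinery (triple stores, SPARQL) can query and interlink them with other artefacts, which is the sharing-and-querying capability the paper targets.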
Abstract:
Objectives: To develop a decision support system (DSS), myGRaCE, that integrates service user (SU) and practitioner expertise about mental health and associated risks of suicide, self-harm, harm to others, self-neglect, and vulnerability. The intention is to help SUs assess and manage their own mental health collaboratively with practitioners. Methods: An iterative process involving interviews, focus groups, and agile software development with 115 SUs was used to elicit and implement myGRaCE requirements. Results: Findings highlight a shared understanding of mental health risk between SUs and practitioners that can be integrated within a single model. However, important differences were revealed in SUs' preferred process of assessing risks and safety, which are reflected in the distinctive interface, navigation, tool functionality and language developed for myGRaCE. A challenge was how to provide flexible access without overwhelming and confusing users. Conclusion: The methods show that practitioner expertise can be reformulated in a format that simultaneously captures SU expertise, to provide a tool highly valued by SUs. A stepped process adds necessary structure to the assessment, each step with its own feedback and guidance. Practice Implications: The GRiST web-based DSS (www.egrist.org) links and integrates myGRaCE self-assessments with GRiST practitioner assessments to support collaborative and self-managed healthcare.
Abstract:
With the cell therapy industry continuing to grow, the ability to preserve clinical grade cells, including mesenchymal stem cells (MSCs), whilst retaining cell viability and function remains critical for the generation of off-the-shelf therapies. Cryopreservation of MSCs, using slow freezing, is an established process at lab scale. However, the cytotoxicity of cryoprotectants, like Me2SO, raises questions about the impact of prolonged cell exposure to cryoprotectant at temperatures >0 °C during processing of large cell batches for allogeneic therapies prior to rapid cooling in a controlled rate freezer, or in the clinic prior to administration. Here we show that exposure of human bone marrow derived MSCs to Me2SO for ≥1 h before freezing, or after thawing, degrades membrane integrity and short-term cell attachment efficiency and alters cell immunophenotype. After 2 h of exposure to Me2SO at 37 °C post-thaw, membrane integrity dropped to ∼70% and only ∼50% of cells retained the ability to adhere to tissue culture plastic. Furthermore, only 70% of the recovered MSCs retained an immunophenotype consistent with the ISCT minimal criteria after exposure. We saw a similar loss of membrane integrity and attachment efficiency after exposing osteoblast (HOS TE85) cells to Me2SO before, and after, cryopreservation. Overall, these results show that freezing medium exposure is a critical determinant of product quality as process scale increases. Defining and reporting cell sensitivity to freezing medium exposure, both before and after cryopreservation, enables a fair judgement of how scalable a particular cryopreservation process can be, and consequently whether the therapy is commercially feasible.