868 results for Microcontroller-based systems
Abstract:
Since much knowledge is tacit, eliciting knowledge is a common bottleneck during the development of knowledge-based systems. Visual interactive simulation (VIS) has been proposed as a means of eliciting experts' decision-making by getting them to interact with a visual simulation of the real system in which they work. In order to explore the effectiveness and efficiency of VIS-based knowledge elicitation, an experiment was carried out with decision-makers in a Ford Motor Company engine assembly plant. The model properties under investigation were the level of visual representation (2-dimensional, 2½-dimensional and 3-dimensional) and the model parameter settings (unadjusted and adjusted to represent more uncommon and extreme situations). The conclusion from the experiment is that using a 2-dimensional representation with adjusted parameter settings provides the best simulation-based means of eliciting knowledge, at least for the case modelled.
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues from the position that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in the estimation of these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using these tools. By characterising those companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while this plan is being produced. If it were the former, then SCE tools would be particularly useful, since there is very little other data available from which to produce an estimate.
A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not with the method of developing an estimating model or tool, but with the way in which "an estimate" is intimately tied to an understanding of what tasks are being planned. Current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of "classic" program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By re-defining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
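The kind of re-definition described above can be illustrated with a short sketch. The abstract does not give the thesis's actual counting rules, so the choice of decision points here is an assumption: a McCabe-style cyclomatic complexity for a Prolog predicate might count each additional clause, each disjunction (`;`) and each if-then-else (`->`) as an extra path through the predicate.

```python
# Illustrative sketch only, not the thesis's actual tool: a crude
# McCabe-style complexity count for a Prolog predicate, re-defining
# "decision points" as extra clauses, disjunctions and if-then-else.
def prolog_cyclomatic(clauses):
    """clauses: list of source strings, one per clause of the predicate."""
    complexity = 1                      # base path through the predicate
    complexity += len(clauses) - 1      # each extra clause is an alternative
    for body in clauses:
        complexity += body.count(";")   # disjunction adds a branch
        complexity += body.count("->")  # if-then-else adds a branch
    return complexity

member_clauses = [
    "member(X, [X|_]).",
    "member(X, [_|T]) :- member(X, T).",
]
print(prolog_cyclomatic(member_clauses))  # 2: two clauses, no ;/->
```

A string count of `;`/`->` is of course naive (it would miscount operators inside quoted atoms); a real tool would walk the parsed clause terms instead.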
Abstract:
This research primarily focused on identifying the formulation parameters which control the efficacy of liposomes as delivery systems to enhance the delivery of poorly soluble drugs. Preliminary studies focused on the drug loading of ibuprofen within vesicle systems. Initially both liposomal and niosomal formulations were screened for their drug-loading capacity: liposomal systems were shown to offer significantly higher ibuprofen loading, and thereafter lipid-based systems were further investigated. Given the key role cholesterol is known to play in the stability of bilayer vesicles, the optimum cholesterol content in terms of drug loading and release of poorly soluble drugs was then investigated. From these studies a concentration of 11 total molar % of cholesterol was used as a benchmark for all further formulations. Investigating the effect of liposome composition on several low-solubility drugs, drug loading was shown to be enhanced by adopting longer chain length lipids, cationic lipids, and decreasing drug molecular weight. Drug release was increased by using cationic lipids and lower drug molecular weight; conversely, a reduction was noted when employing longer chain lipids, thus supporting the rationale of longer chain lipids producing more stable liposomes, a theory also supported by results obtained via Langmuir studies, although it was revealed that stability is also dependent on geometric features associated with the lipid chain moiety. Interestingly, a reduction in drug loading appeared to be induced when symmetrical phospholipids were substituted for lipids containing asymmetrical alkyl chain groups, thus further highlighting the importance of lipid geometry. Combining a symmetrical lipid with an asymmetrical derivative enhanced encapsulation of one hydrophobic drug while reducing that of another, suggesting the importance of drug characteristics.
Phosphatidylcholine liposomes could successfully be prepared (and visualised using transmission electron microscopy) from fatty alcohols, therefore offering an alternative liposomal stabiliser to cholesterol. Results obtained revealed that liposomes containing tetradecanol within their formulation share similar vesicle size, drug encapsulation, surface charge, and toxicity profiles to liposomes formulated with cholesterol; however, the tetradecanol preparation appeared to release considerably more drug during stability studies. Langmuir monolayer studies revealed that the condensing influence of tetradecanol is less than that of cholesterol, suggesting that this reduced intercalation by the former could explain why the tetradecanol formulation released more drug than the cholesterol formulations. Environmental scanning electron microscopy (ESEM) was used to analyse the morphology and stability of liposomes. These investigations indicated that the presence of drugs within the liposomal bilayer was able to enhance the stability of the bilayers against collapse under reduced hydration conditions. The effect of charged lipids within the formulation under reduced hydration conditions, compared with their neutral counterparts, was also examined. However, the applicability of using ESEM as a new method to investigate liposome stability appears less valid than first hoped, since the results are often open to varied interpretation and in some cases do not provide a robust set of data to support conclusions.
Abstract:
Despite expectations being high, the industrial take-up of Semantic Web technologies in developing services and applications has been slower than expected. One of the main reasons is that many legacy systems have been developed without considering the potential of the Web in integrating services and sharing resources. Without a systematic methodology and proper tool support, the migration from legacy systems to Semantic Web Service-based systems can be a tedious and expensive process, which carries a significant risk of failure. There is an urgent need to provide strategies allowing the migration of legacy systems to Semantic Web Services platforms, and also tools to support such strategies. In this paper we propose a methodology, and its tool support, for transitioning these applications to Semantic Web Services, allowing users to migrate their applications to Semantic Web Services platforms automatically or semi-automatically. The transition of the GATE system is used as a case study. © 2009 - IOS Press and the authors. All rights reserved.
Abstract:
Self-adaptation is emerging as an increasingly important capability for many applications, particularly those deployed in dynamically changing environments, such as ecosystem monitoring and disaster management. One key challenge posed by Dynamically Adaptive Systems (DASs) is the need to handle changes to the requirements and corresponding behavior of a DAS in response to varying environmental conditions. Berry et al. previously identified four levels of RE that should be performed for a DAS. In this paper, we propose the Levels of RE for Modeling that reify the original levels to describe RE modeling work done by DAS developers. Specifically, we identify four types of developers: the system developer, the adaptation scenario developer, the adaptation infrastructure developer, and the DAS research community. Each level corresponds to the work of a different type of developer to construct goal model(s) specifying their requirements. We then leverage the Levels of RE for Modeling to propose two complementary processes for performing RE for a DAS. We describe our experiences with applying this approach to GridStix, an adaptive flood warning system, deployed to monitor the River Ribble in Yorkshire, England.
Abstract:
The paper discusses both the complementary factors and contradictions of adopting ERP-based systems with enterprise 2.0. ERP is well known for its efficient business process management; the high failure rate of ERP implementations is equally well known. According to [1], ERP systems can achieve efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. Enterprise 2.0, by contrast, supports flexible business process management and informal, less structured interactions [3],[4],[21]. Traditional research claims that efficiency and flexibility may seem incompatible in that they are different business objectives and may exist in different organizational environments. However, the paper breaks with this traditional norm by combining ERP and enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, four cases presented different attitudes towards the usage of ERP systems and enterprise social systems. Based on socio-technical theory, the paper presents an in-depth analysis of the benefits of combining ERP with enterprise 2.0 for these firms.
Abstract:
Over the past decade or so, a number of changes have been observed in traditional Japanese employment relations (ER) systems, such as an increase in non-regular workers, a move towards performance-based systems and a continuous decline in union membership. There is a large body of Anglo-Saxon and Japanese literature providing evidence that national factors such as national institutions, national culture, and the business and economic environment have significantly influenced what were hitherto three 'sacred' aspects of Japanese ER systems (ERSs). However, no research has been undertaken until now at the firm level regarding the extent to which changes in national factors influence ERSs across firms. This article develops a model to examine the impact of national factors on ER systems, and analyses that impact on ER systems at the firm level. Based on information collected from two different groups of companies, namely the Mitsubishi Chemical Group (MCG) and the Federation of Shinkin Bank (FSB), the research finds that, except for a few similarities, the impact of national factors on Japanese ER systems differs at the firm level. This indicates that the impact of national factors varies in the implementation of employment relations factors. In the case of MCG, national culture has little to do with the seniority-based system. The study also reveals that national culture factors have little influence on the enterprise-based system in the case of FSB. This analysis is useful for domestic and international organizations as it helps to better understand the role of national factors in determining Japanese ERSs.
Abstract:
The project consists of an experimental and numerical modelling study of the application of ultra-long Raman fibre laser (URFL) based amplification techniques to high-speed multi-wavelength optical communications systems. The research focuses on telecommunications C-band transmission at 40 Gb/s data rates with direct and coherent detection. The optical transmission performance of URFL-based systems in terms of optical noise, gain bandwidth and gain flatness is evaluated for different system configurations. Systems with different overall span lengths, transmission fibre types and data modulation formats are investigated. Performance is compared with conventional Erbium-doped fibre amplifier based systems to identify configurations where URFL-based amplification provides performance or commercial advantages.
Abstract:
The chapter discusses both the complementary factors and contradictions of adopting ERP-based systems with Enterprise 2.0. ERP is well known for its efficient business process management. Enterprise 2.0 supports flexible business process management and informal, less structured interactions. Traditional studies indicate that efficiency and flexibility may seem incompatible because they are different business objectives and may exist in different organizational environments. However, the chapter breaks traditional norms by combining ERP and Enterprise 2.0 in a single enterprise to improve both efficient and flexible operations simultaneously. Based on multiple case studies, the chapter analyzes the benefits and risks of the combination of ERP with Enterprise 2.0 from the process, organization, and people paradigms. © 2013 by IGI Global.
Abstract:
The paper develops a set of ideas and techniques supporting analogical reasoning throughout the life-cycle of terrorist acts. Implementation of these ideas and techniques can enhance the intellectual level of computer-based systems for a wide range of personnel dealing with various aspects of the problem of terrorism and its effects. The method combines techniques of structure-sensitive distributed representations in the framework of Associative-Projective Neural Networks with knowledge obtained through progress in analogical reasoning, in particular the Structure Mapping Theory. These analogical reasoning tools are expected to support efforts to minimize the effects of terrorist acts on the civilian population by facilitating knowledge acquisition and the formation of terrorism-related knowledge bases, as well as by supporting the processes of analysis, decision making, and reasoning with those knowledge bases for users at various levels of expertise before, during, and after terrorist acts.
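The core idea of structure-sensitive distributed representations can be sketched in a few lines. The sketch below is a simplified stand-in, not the paper's method: it uses XOR binding and thresholded superposition of binary hypervectors in the Kanerva style, whereas Associative-Projective Neural Networks proper use conjunctive binding with context-dependent thinning; all episode and role names are invented for illustration. It shows how a role-filler structure such as "actor = group A" can be encoded into a single vector and later queried, the kind of operation on which structure-mapping analogies are built.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # high dimensionality makes random vectors quasi-orthogonal

def hv():                     # fresh random dense binary hypervector
    return rng.integers(0, 2, D, dtype=np.uint8)

def bind(a, b):               # XOR binding of a role with its filler
    return a ^ b

def bundle(*vs):              # thresholded superposition; with two inputs
    return (np.sum(vs, axis=0) > len(vs) / 2).astype(np.uint8)  # this is an AND

def sim(a, b):                # 1 - normalised Hamming distance
    return 1.0 - np.mean(a != b)

# Encode two episodes that share structure but differ in fillers.
actor, action = hv(), hv()                       # shared roles
groupA, groupB, attack, threaten = hv(), hv(), hv(), hv()
ep1 = bundle(bind(actor, groupA), bind(action, attack))
ep2 = bundle(bind(actor, groupB), bind(action, threaten))

# Structural query: who is the actor in ep1?  XOR is its own inverse,
# so unbinding the role yields a noisy copy of the filler.
probe = bind(ep1, actor)
print(sim(probe, groupA) > sim(probe, groupB))  # True
```

Because the representation is distributed, similarity of whole episodes reflects shared structure rather than shared surface features, which is what makes retrieval of structurally analogous cases possible.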
Abstract:
* This work was carried out under program No. 14 of fundamental scientific research of the Presidium of the Russian Academy of Sciences, project 06-I-П14-052.
Abstract:
Conventional tools for measurement of laser spectra (e.g. optical spectrum analysers) capture data averaged over a considerable time period. However, the generation spectrum of many laser types may involve spectral dynamics whose relatively fast time scale is determined by their cavity round-trip period, calling for instrumentation featuring both high temporal and spectral resolution. Such real-time spectral characterisation becomes particularly challenging if the laser pulses are long, or if they have continuous or quasi-continuous wave radiation components. Here we combine optical heterodyning with a technique of spatio-temporal intensity measurements that allows the characterisation of such complex sources. Fast, round-trip-resolved spectral dynamics of cavity-based systems are obtained in real time, with temporal resolution of one cavity round trip and frequency resolution defined by its inverse (85 ns and 24 MHz respectively are demonstrated). We also show how, under certain conditions for quasi-continuous wave sources, the spectral resolution can be further increased by a factor of 100 by direct extraction of phase information from the heterodyned dynamics or by using double time scales within the spectrogram approach.
Abstract:
The chapter discusses both the complementary factors and contradictions of adopting ERP-based systems with enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. It is claimed that enterprise 2.0 can support flexible business process management and so incorporate informal and less structured interactions. A traditional view, however, is that efficiency and flexibility objectives are incompatible, as they are different business objectives which are pursued separately in different organizational environments. Thus an ERP system with a primary objective of improving efficiency and an enterprise 2.0 system with a primary aim of improving flexibility may represent a contradiction and lead to a high risk of failure if adopted simultaneously. This chapter uses case study analysis to investigate the combined use of ERP and enterprise 2.0 in a single enterprise with the aim of improving both efficiency and flexibility in operations. The chapter provides an in-depth analysis of the combination of ERP with enterprise 2.0 based on socio-technical information systems management theory. The chapter also summarises the benefits of combining ERP systems and enterprise 2.0 and how they could contribute to the development of a new generation of business management that combines both formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, but also have the flexibility to provide an agile reaction to internal and external events.
Abstract:
The BlackEnergy malware targeting critical infrastructures has a long history. It evolved over time from a simple DDoS platform into quite sophisticated plug-in based malware. The plug-in architecture has a persistent malware core with easily installable attack-specific modules for DDoS, spamming, info-stealing, remote access, boot-sector formatting, etc. BlackEnergy has been involved in several high-profile cyber-physical attacks, including the recent Ukraine power grid attack in December 2015. This paper investigates the evolution of BlackEnergy and its cyber attack capabilities. It presents a basic cyber attack model used by BlackEnergy for targeting industrial control systems. In particular, the paper analyzes the cyber threats posed by BlackEnergy to synchrophasor-based systems, which are used for real-time control and monitoring functionalities in the smart grid. Several BlackEnergy-based attack scenarios are investigated by exploiting the vulnerabilities in two widely used synchrophasor communication standards: (i) IEEE C37.118 and (ii) IEC 61850-90-5. Specifically, the paper addresses reconnaissance, DDoS, man-in-the-middle and replay/reflection attacks on IEEE C37.118 and IEC 61850-90-5. Further, the paper investigates protection strategies for the detection and prevention of BlackEnergy-based cyber-physical attacks.
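The attack surface can be made concrete: an IEEE C37.118 frame carries no authentication, only a sync byte, a declared length and a CRC-CCITT checksum, so any host that can inject packets can forge frames that pass every integrity check the protocol itself provides. The sketch below is an illustrative minimal validator, not code from the paper; the frame-type byte and payload are dummies.

```python
def crc_ccitt(data: bytes) -> int:
    """CRC-CCITT (polynomial 0x1021, initial value 0xFFFF), the algorithm
    used for the CHK field that terminates an IEEE C37.118 frame."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
    return crc

def frame_passes_protocol_checks(frame: bytes) -> bool:
    """True if the frame is accepted on sync byte, declared length and
    checksum alone -- note that nothing here proves who sent it."""
    if len(frame) < 6 or frame[0] != 0xAA:       # SYNC starts with 0xAA
        return False
    framesize = int.from_bytes(frame[2:4], "big")
    if framesize != len(frame):                  # FRAMESIZE covers whole frame
        return False
    chk = int.from_bytes(frame[-2:], "big")
    return crc_ccitt(frame[:-2]) == chk          # CHK is CRC over the rest

# A spoofing attacker can fabricate a frame that passes all three checks:
body = bytes([0xAA, 0x01, 0x00, 0x12]) + bytes(12)   # dummy header + payload
forged = body + crc_ccitt(body).to_bytes(2, "big")
print(frame_passes_protocol_checks(forged))  # True -- integrity != authenticity
```

This is precisely why replay and man-in-the-middle attacks on C37.118 succeed at the protocol level, and why mitigations have to come from outside the standard (e.g. network segmentation or transport-layer security).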
Abstract:
In this paper, we present a case-based reasoning (CBR) approach to solving educational timetabling problems. Following the basic idea behind CBR, the solutions of previously solved problems are employed to aid in finding solutions for new problems. A list of feature-value pairs is insufficient to represent all the necessary information. We show that attribute graphs can represent more information and thus can help to retrieve re-usable cases that have structures similar to the new problems. The case base is organised as a decision tree to store the attribute graphs of solved problems hierarchically. An example is given to illustrate the retrieval, re-use and adaptation of structured cases. The results from our experiments show the effectiveness of the retrieval and adaptation in the proposed method.
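The retrieval step can be sketched as a toy (the node and edge labels below are invented, and the paper's case base uses a decision tree over attribute graphs rather than the linear scan shown here): each case is an attribute graph of events and constraints, and the stored case whose graph shares the most node and edge labels with the new problem's graph is retrieved along with its solution.

```python
# Hedged sketch of graph-based case retrieval for timetabling.
# A graph is a pair (node_labels, edge_labels); edges are
# (node, node, constraint) triples such as ("lecture", "lab", "same-day").
def similarity(g1, g2):
    """Jaccard-style overlap of node and edge label sets."""
    nodes1, edges1 = g1
    nodes2, edges2 = g2
    shared = len(nodes1 & nodes2) + len(edges1 & edges2)
    total = len(nodes1 | nodes2) + len(edges1 | edges2)
    return shared / total if total else 0.0

def retrieve(case_base, query):
    """Return the (graph, solution) case most similar to the query graph."""
    return max(case_base, key=lambda case: similarity(case[0], query))

case_base = [
    (({"lecture", "lab"}, {("lecture", "lab", "same-day")}), "solution A"),
    (({"lecture", "exam"}, {("lecture", "exam", "not-overlapping")}), "solution B"),
]
query = ({"lecture", "lab"}, {("lecture", "lab", "same-day")})
print(retrieve(case_base, query)[1])  # solution A
```

The retrieved solution would then be adapted to the differences between the retrieved graph and the query, which is the re-use/adaptation step the abstract describes.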