874 results for Systems analysis.


Relevance:

40.00%

Publisher:

Abstract:

A new, original method and CASE tool for system analysis and modelling are presented. They are, for the first time, consistent with the requirements of object-oriented technology for information systems design. They substantially simplify the construction of organisational-system models and improve the quality of organisational design and of the core technological processes of object application development.

Relevance:

40.00%

Publisher:

Abstract:

Workflows are sets of activities that implement and realise business goals. Modern business goals add extra requirements on workflow systems and their management. Workflows may cross many organisations and utilise services on a variety of devices and/or supported by different platforms. Current workflows are therefore inherently context-aware. Each context is governed and constrained by its own policies and rules to prevent unauthorised participants from executing sensitive tasks and to prevent tasks from accessing unauthorised services and/or data. We present a sound, multi-layered design language for the design and analysis of secure and context-aware workflow systems.
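
As a rough illustration of the kind of rule such a design language must capture, the sketch below (Python, with hypothetical task, role and context names) checks that a participant may only execute a task whose required role is held by the participant and permitted by the policy of the task's own context; it is a toy model, not the paper's language.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    name: str
    context: str          # e.g. the organisation or device hosting the task
    required_role: str    # role a participant must hold to execute it

# Hypothetical per-context policies: which roles may act in which context.
POLICIES = {
    "hospital": {"clinician", "admin"},
    "insurer":  {"claims_officer"},
}

def may_execute(participant_roles: set[str], task: Task) -> bool:
    """A participant may execute a task only if the task's required role is
    both held by the participant and permitted in the task's context."""
    allowed_in_context = POLICIES.get(task.context, set())
    return task.required_role in participant_roles and task.required_role in allowed_in_context

# Example: a clinician may run a hospital task but not an insurer task.
task = Task("review_record", context="hospital", required_role="clinician")
print(may_execute({"clinician"}, task))                                              # True
print(may_execute({"clinician"}, Task("settle_claim", "insurer", "claims_officer")))  # False
```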

Relevance:

40.00%

Publisher:

Abstract:

DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT.

The research presented in this thesis is concerned with Discrete-Event Simulation (DES) modelling as a method to facilitate logistical policy development within the UK Less-than-Truckload (LTL) freight distribution sector, which has been typified by “Pallet Networks” operating on a hub-and-spoke philosophy. Current literature on LTL hub-and-spoke and cross-dock freight distribution systems traditionally examines a variety of network and hub design configurations, each consistent with classical notions of creating process efficiency, improving productivity, reducing costs and generally creating economies of scale through bulk optimisation. Whilst there is a growing abundance of papers discussing both the network design and hub operational components mentioned above, the overall analysis falls short when it comes to the “spoke-terminal” of hub-and-spoke freight distribution systems and its capability for handling the diverse and discrete customer profiles of freight that multi-user LTL hub-and-spoke networks typically handle over the “last mile” of the delivery, in particular a mix of retail and non-retail customers. A simulation study is undertaken to investigate the impact on operational performance when the current combined spoke-terminal delivery tours are separated by profile type (i.e. retail or non-retail). The results indicate that a potential improvement in delivery performance can be made by separating retail and non-retail delivery runs at the spoke-terminal, and that dedicated retail and non-retail delivery tours could be adopted to better meet customer delivery requirements and adapt hub-deployed policies. The study also draws on key operator experiences to highlight the main practical challenges in implementing the simulation results in the real world. The study concludes that DES can be harnessed as an enabling device to develop a ‘guide policy’. This policy needs to be flexible, should be applied in stages, and should take account of growing retail exposure.
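
A minimal discrete-event sketch of the kind of comparison the study describes, written in Python with SimPy and entirely hypothetical service times and fleet size; it only illustrates how combined versus profile-separated tours at a spoke terminal could be contrasted, and makes no claim about the thesis's actual model or data.

```python
import random
import simpy

# Hypothetical service times (minutes per drop): retail drops assumed slower
# because of booking-in windows; the figures are illustrative only.
SERVICE = {"retail": 35, "non-retail": 15}

def tour(env, vehicle, jobs, done):
    """One delivery tour: the vehicle serves its assigned drops in sequence."""
    with vehicle.request() as req:
        yield req
        for profile in jobs:
            yield env.timeout(random.expovariate(1.0 / SERVICE[profile]))
        done.append(env.now)

def run(separate_tours: bool, n_jobs: int = 40, seed: int = 1) -> float:
    random.seed(seed)
    env = simpy.Environment()
    vehicle = simpy.Resource(env, capacity=2)     # two delivery vehicles at the spoke
    jobs = [random.choice(["retail", "non-retail"]) for _ in range(n_jobs)]
    done: list = []
    if separate_tours:                            # dedicated retail / non-retail runs
        batches = [[j for j in jobs if j == p] for p in ("retail", "non-retail")]
    else:                                         # current combined tours, split evenly
        batches = [jobs[::2], jobs[1::2]]
    for batch in batches:
        env.process(tour(env, vehicle, batch, done))
    env.run()
    return max(done)                              # completion time of the last tour

print("combined :", round(run(False), 1), "min")
print("separated:", round(run(True), 1), "min")
```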

Relevance:

40.00%

Publisher:

Abstract:

This paper presents two algorithms for one-parameter local bifurcations of equilibrium points of dynamical systems. The algorithms are implemented in the computer algebra system Maple 13 © and designed as a package. Some examples are reported to demonstrate the package’s facilities.
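
The package itself is written in Maple; the sketch below is a Python/SymPy analogue showing the underlying conditions for the simplest one-parameter local bifurcation (a fold, using the illustrative normal form dx/dt = mu - x^2): an equilibrium at which f and df/dx vanish simultaneously.

```python
import sympy as sp

x, mu = sp.symbols("x mu", real=True)
f = mu - x**2          # a one-parameter family with a fold (saddle-node) at mu = 0

# Equilibria and the Jacobian (here just df/dx) as functions of the parameter.
equilibria = sp.solve(sp.Eq(f, 0), x)            # [-sqrt(mu), sqrt(mu)]
dfdx = sp.diff(f, x)

for eq in equilibria:
    jac = sp.simplify(dfdx.subs(x, eq))          # +/- 2*sqrt(mu): stability changes sign
    print(f"equilibrium x = {eq}: df/dx = {jac}")

# Bifurcation condition: f = 0 and df/dx = 0 simultaneously.
bif = sp.solve([sp.Eq(f, 0), sp.Eq(dfdx, 0)], [x, mu], dict=True)
print("fold bifurcation at:", bif)               # [{x: 0, mu: 0}]
```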

Relevance:

40.00%

Publisher:

Abstract:

We use the GN-model to assess the reach of Nyquist-WDM 100/200 Gbit/s PM-QPSK/16QAM signals on low-loss, large-core-area fibre using extended-range, variable-gain hybrid Raman-EDFAs. 5000/1500 km transmission is possible over a wide range of amplifier spans. © OSA 2014.
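
A hedged sketch of a GN-model style reach estimate: ASE and nonlinear-interference noise both accumulate linearly with span count, the NLI term scales with the cube of launch power, and reach is the largest span count that still meets a required SNR. The per-span figures and SNR thresholds below are illustrative placeholders, not the paper's values.

```python
# Illustrative per-span figures only (not the paper's values): linear ASE noise
# power and a GN-model nonlinear-interference coefficient per 100 km span,
# both referred to the signal bandwidth.
P_ASE_SPAN = 4.0e-6      # W of ASE noise added per span
ETA_NLI    = 343.0       # W^-2: NLI per span scales as eta * P_ch**3 (GN model)
SPAN_KM    = 100.0

def snr_after(n_spans: int, p_ch: float) -> float:
    """Incoherent GN-model accumulation: both noise terms grow linearly with span count."""
    noise = n_spans * (P_ASE_SPAN + ETA_NLI * p_ch**3)
    return p_ch / noise

# Launch power maximising SNR: d/dP [P / (ASE + eta*P^3)] = 0  =>  P_opt = (ASE / (2*eta))**(1/3)
p_opt = (P_ASE_SPAN / (2 * ETA_NLI)) ** (1 / 3)

def reach_km(required_snr_db: float) -> float:
    req = 10 ** (required_snr_db / 10)
    n = 1
    while snr_after(n, p_opt) >= req:
        n += 1
    return (n - 1) * SPAN_KM

print(f"optimum launch power ≈ {1e3 * p_opt:.1f} mW")
print(f"reach at 7 dB required SNR  ≈ {reach_km(7.0):.0f} km")   # QPSK-like requirement
print(f"reach at 14 dB required SNR ≈ {reach_km(14.0):.0f} km")  # 16QAM-like requirement
```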

Relevance:

40.00%

Publisher:

Abstract:

Metrology processes contribute to entire manufacturing systems and can have a considerable impact on the financial investment in coordinate measuring systems. However, today's industry lacks generic methodologies for quantifying their economic value. To address this problem, a mathematical model is proposed in this paper through statistical deductive reasoning, defining the relationships between the Process Capability Index, measurement uncertainty and the tolerance band. The correctness of the mathematical model is demonstrated by a case study. Finally, several comments and suggestions on evaluating and maximising the benefits of metrology investment are given.
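
One plausible way to express the link between the capability index, measurement uncertainty and the tolerance band is sketched below; this is an assumption for illustration, not necessarily the paper's model. Measurement uncertainty adds in quadrature to the process spread, so a poorer measuring system depresses the capability index that can be demonstrated, which is one route to valuing metrology investment.

```python
import math

def observed_cp(tolerance_band: float, process_sigma: float, meas_uncertainty: float) -> float:
    """Capability index as seen through an imperfect measuring system.

    Measurement error (standard uncertainty u) adds in quadrature to the true
    process spread, so the observed Cp is pulled below the true Cp:
        Cp_obs = T / (6 * sqrt(sigma_p**2 + u**2))
    """
    return tolerance_band / (6 * math.sqrt(process_sigma**2 + meas_uncertainty**2))

T, sigma_p = 0.30, 0.04          # mm: tolerance band and true process sigma (illustrative)
true_cp = T / (6 * sigma_p)      # 1.25

for u in (0.002, 0.010, 0.020):  # mm: increasingly less capable measuring system
    print(f"u = {u:.3f} mm  ->  observed Cp = {observed_cp(T, sigma_p, u):.3f}"
          f"  (true Cp = {true_cp:.2f})")
```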

Relevance:

40.00%

Publisher:

Abstract:

The use of ex-transportation battery systems (i.e. second-life EV/HEV batteries) in grid applications is an emerging field of study. A hybrid battery scheme offers a more practical approach to second-life battery energy storage systems, because battery modules may come from different sources/vehicle manufacturers, depending on the second-life supply chain, and have different characteristics, e.g. voltage levels, maximum capacity and levels of degradation. Recent research has suggested a dc-side modular multilevel converter topology to integrate these hybrid batteries with a grid-tie inverter. Depending on the battery module characteristics, the dc-side modular converter can adopt different modes, such as boost, buck or boost-buck, to transfer power from the battery to the grid. These modes involve different switching techniques, control ranges and efficiencies, giving the system designer a choice of operating mode. This paper presents an analysis and comparative study of all modes of the converter, along with their switching performance, to clarify the relative advantages and disadvantages of each mode and to help select the most suitable one. A detailed study of all converter modes and thorough experimental results from a multi-modular converter prototype using hybrid batteries are presented to validate the analysis.
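
A toy sketch of the mode choice in Python, assuming ideal lossless converters in continuous conduction; the selection rule and voltages are illustrative assumptions, not the paper's control scheme.

```python
def ideal_duty(mode: str, v_in: float, v_out: float) -> float:
    """Ideal steady-state duty cycle (lossless, continuous conduction assumed)."""
    if mode == "boost":            # V_out = V_in / (1 - D)
        return 1.0 - v_in / v_out
    if mode == "buck":             # V_out = D * V_in
        return v_out / v_in
    raise ValueError(mode)

def select_mode(v_batt: float, v_link: float) -> str:
    """Illustrative selection rule: step up when the module sits below the dc-link
    target, step down when it sits above; a real controller might prefer the
    cascaded boost-buck mode near the cross-over for extra control range."""
    return "boost" if v_batt < v_link else "buck"

# Two second-life modules with different voltage levels feeding the same 48 V dc link.
for v_module in (36.0, 60.0):
    mode = select_mode(v_module, 48.0)
    print(f"module {v_module:.0f} V -> {mode:5s}, D = {ideal_duty(mode, v_module, 48.0):.2f}")
```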

Relevance:

40.00%

Publisher:

Abstract:

The paper considers a general model of electoral systems that combines district-based elections with a compensatory mechanism in order to implement any outcome between strictly majoritarian and purely proportional seat allocation. It incorporates vote transfer and allows for the application of three different correction formulas. Analysis in a two-party system shows that the dominant party faces a trade-off between its expected seat share and its chance of obtaining a majority. Vote transfer rules are also investigated with a focus on the possibility of manipulation. The model is applied to the 2014 Hungarian parliamentary election. Hypothetical results reveal that the vote transfer rule cannot be evaluated in itself, only together with the share of constituency seats. With an appropriate choice of the latter, the three mechanisms can be made functionally equivalent.
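
A toy two-party implementation of one possible variant of such a system (district seats to plurality winners, loser votes transferred to a compensatory list tier allocated by d'Hondt); the transfer rule and the numbers are illustrative assumptions, not the paper's formulas.

```python
from typing import Dict, List

def dhondt(votes: Dict[str, float], seats: int) -> Dict[str, int]:
    """Allocate `seats` list seats by the d'Hondt highest-averages rule."""
    won = {p: 0 for p in votes}
    for _ in range(seats):
        best = max(votes, key=lambda p: votes[p] / (won[p] + 1))
        won[best] += 1
    return won

def mixed_system(district_votes: List[Dict[str, float]], list_seats: int) -> Dict[str, int]:
    """District seats to plurality winners; every vote cast for a district loser
    is transferred to that party's compensatory list (one possible transfer rule)."""
    totals = {p: 0 for p in district_votes[0]}
    transferred = {p: 0.0 for p in district_votes[0]}
    for district in district_votes:
        winner = max(district, key=district.get)
        totals[winner] += 1
        for party, v in district.items():
            if party != winner:
                transferred[party] += v          # loser votes feed the list tier
    for party, seats in dhondt(transferred, list_seats).items():
        totals[party] += seats
    return totals

# Two-party example: A wins 7 of 10 districts narrowly, B wins 3 heavily.
districts = [{"A": 5_500, "B": 4_500}] * 7 + [{"A": 3_000, "B": 7_000}] * 3
print(mixed_system(districts, list_seats=10))    # compensation pulls seats back towards vote shares
```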

Relevance:

40.00%

Publisher:

Abstract:

With advances in science and technology, computing and business intelligence (BI) systems are steadily becoming more complex, with an increasing variety of heterogeneous software and hardware components. They are thus becoming progressively more difficult to monitor, manage and maintain. Traditional approaches to system management have largely relied on domain experts through a knowledge acquisition process that translates domain knowledge into operating rules and policies. This is widely acknowledged to be a cumbersome, labor-intensive and error-prone process that also struggles to keep up with rapidly changing environments. In addition, many traditional business systems deliver primarily pre-defined historic metrics for long-term strategic or mid-term tactical analysis, and lack the flexibility to support evolving metrics or data collection for real-time operational analysis. There is thus a pressing need for automatic and efficient approaches to monitor and manage complex computing and BI systems. To realize the goal of autonomic management and enable self-management capabilities, we propose to mine the historical log data generated by computing and BI systems and automatically extract actionable patterns from it. This dissertation focuses on the development of data mining techniques to extract actionable patterns from various types of log data in computing and BI systems. Four key problems are studied: log data categorization and event summarization, leading-indicator identification, pattern prioritization by exploring link structures, and tensor modeling of three-way log data. Case studies and comprehensive experiments on real application scenarios and datasets show the effectiveness of the proposed approaches.
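
As a concrete illustration of the second problem, the sketch below finds a leading indicator via lagged cross-correlation between two event-count series; the data are synthetic and the technique is a simple stand-in, not the dissertation's actual method.

```python
import numpy as np

def lagged_correlation(leading: np.ndarray, target: np.ndarray, max_lag: int) -> tuple:
    """Return the lag (in time buckets) at which `leading` best predicts `target`,
    together with the Pearson correlation at that lag."""
    best_lag, best_r = 0, 0.0
    for lag in range(1, max_lag + 1):
        r = np.corrcoef(leading[:-lag], target[lag:])[0, 1]
        if abs(r) > abs(best_r):
            best_lag, best_r = lag, r
    return best_lag, best_r

# Synthetic per-minute event counts: warnings tend to precede errors by ~5 buckets.
rng = np.random.default_rng(0)
warnings = rng.poisson(4, 500).astype(float)
errors = np.roll(warnings, 5) * 0.8 + rng.poisson(1, 500)

lag, r = lagged_correlation(warnings, errors, max_lag=20)
print(f"'warning' counts lead 'error' counts by {lag} buckets (r = {r:.2f})")
```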

Relevance:

40.00%

Publisher:

Abstract:

Existing instrumental techniques must be adaptable to the analysis of novel explosives if science is to keep up with the practices of terrorists and criminals. The focus of this work has been the development of analytical techniques for the analysis of two types of novel explosives: ascorbic acid-based propellants, and improvised mixtures of concentrated hydrogen peroxide/fuel. In recent years, the use of these explosives in improvised explosive devices (IEDs) has increased. It is therefore important to develop methods which permit the identification of the nature of the original explosive from post-blast residues. Ascorbic acid-based propellants are low explosives which employ an ascorbic acid fuel source with a nitrate/perchlorate oxidizer. A method utilizing ion chromatography with indirect photometric detection was optimized for the analysis of intact propellants. Post-burn and post-blast residues of these propellants were analyzed. It was determined that the ascorbic acid fuel and nitrate oxidizer could be detected in intact propellants, as well as in the post-burn and post-blast residues. Degradation products of the nitrate and perchlorate oxidizers were also detected. With a quadrupole time-of-flight mass spectrometer (QToFMS), exact mass measurements are possible. When an HPLC instrument is coupled to a QToFMS, the combination of retention time with accurate mass measurements, mass spectral fragmentation information, and isotopic abundance patterns allows for the unequivocal identification of a target analyte. An optimized HPLC-ESI-QToFMS method was applied to the analysis of ascorbic acid-based propellants. Exact mass measurements were collected for the fuel and oxidizer anions and their degradation products. Ascorbic acid was detected in the intact samples and in half of the propellants subjected to open burning; the intact fuel molecule was not detected in any of the post-blast residues. Two methods were optimized for the analysis of trace levels of hydrogen peroxide: HPLC with fluorescence detection (HPLC-FD) and HPLC with electrochemical detection (HPLC-ED). Both techniques were extremely selective for hydrogen peroxide. Both methods were applied to the analysis of post-blast debris from improvised mixtures of concentrated hydrogen peroxide/fuel; hydrogen peroxide was detected on a variety of substrates. Hydrogen peroxide was also detected in the post-blast residues of the improvised explosives TATP and HMTD.
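
A small sketch of the exact-mass matching that an HPLC-QToFMS workflow relies on: compute the monoisotopic mass of a candidate formula, derive the deprotonated [M-H]- ion mass, and accept a measured m/z within a ppm window. The 5 ppm tolerance is an illustrative assumption, not a value taken from this work.

```python
# Monoisotopic atomic masses (u) of the elements needed here.
MASS = {"C": 12.0, "H": 1.0078250319, "O": 15.9949146221, "N": 14.0030740052}
ELECTRON = 0.00054858

def monoisotopic(formula: dict) -> float:
    """Monoisotopic mass of a neutral molecule given as {element: count}."""
    return sum(MASS[el] * n for el, n in formula.items())

def matches(measured_mz: float, formula: dict, ppm: float = 5.0) -> bool:
    """Negative-mode [M-H]- match: lose a proton, i.e. lose one H atom but keep its electron."""
    theoretical = monoisotopic(formula) - MASS["H"] + ELECTRON
    return abs(measured_mz - theoretical) / theoretical * 1e6 <= ppm

ascorbic_acid = {"C": 6, "H": 8, "O": 6}          # neutral monoisotopic mass ≈ 176.0321 u
print(f"neutral mass : {monoisotopic(ascorbic_acid):.4f} u")
print(f"[M-H]- match at m/z 175.0248: {matches(175.0248, ascorbic_acid)}")
```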

Relevance:

40.00%

Publisher:

Abstract:

This research analyzed the spatial relationship between a mega-scale fracture network and the occurrence of vegetation in an arid region. High-resolution aerial photographs of Arches National Park, Utah were used for digital image processing. Four sets of large-scale joints were digitized from the rectified color photograph in order to characterize the geospatial properties of the fracture network with the aid of a Geographic Information System. An unsupervised landcover classification was carried out to identify the spatial distribution of vegetation on the fractured outcrop. Results of this study confirm that the WNW-ESE alignment of vegetation is dominantly controlled by the spatial distribution of the systematic joint set, which in turn parallels the regional fold axis. This research provides insight into the spatial heterogeneity inherent to fracture networks, as well as the effects of jointing on the distribution of surface vegetation in desert environments.
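
A minimal sketch of an unsupervised land-cover classification in Python, using k-means on per-pixel band values over a synthetic two-class scene; scikit-learn stands in here for whatever remote-sensing software the study actually used, and the band values are invented.

```python
import numpy as np
from sklearn.cluster import KMeans

def classify_landcover(image: np.ndarray, n_classes: int = 4, seed: int = 0) -> np.ndarray:
    """Unsupervised classification: cluster every pixel's band values with k-means
    and return a class-label raster of the same height and width."""
    h, w, bands = image.shape
    pixels = image.reshape(-1, bands).astype(float)
    labels = KMeans(n_clusters=n_classes, n_init=10, random_state=seed).fit_predict(pixels)
    return labels.reshape(h, w)

# Synthetic 3-band "aerial photograph": darker, greener pixels stand in for vegetation.
rng = np.random.default_rng(0)
rock = rng.normal([180, 140, 120], 10, size=(60, 100, 3))
vegetation = rng.normal([60, 90, 50], 10, size=(60, 100, 3))
scene = np.concatenate([rock, vegetation], axis=0)

classes = classify_landcover(scene, n_classes=2)
print("pixels per class:", np.bincount(classes.ravel()))
```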


Relevance:

40.00%

Publisher:

Abstract:

The maintenance and evolution of software systems has become a highly critical task in recent years due to the diversity and high demand of functionalities, devices and users. Understanding and analysing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for avoiding the deterioration of their quality during evolution. This thesis proposes an automated approach for analysing variation in the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation: choosing the scenarios and preparing the target releases; (ii) dynamic analysis: determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis: processing and comparing the dynamic analysis results across releases; and (iv) repository mining: identifying issues and commits associated with the detected performance variation. Empirical studies were carried out to evaluate the approach from different perspectives. An exploratory study analysed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analysed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modelling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In that study, 21 releases (seven of each system) were analysed, totalling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. In addition, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a performance regression model was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimised ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
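
A hedged sketch of the variation-analysis phase: compare a scenario's execution-time samples between two releases and flag it when the change is both statistically significant (here Welch's t-test) and practically relevant. The threshold, sample values and scenario are illustrative, not the thesis's implementation.

```python
from statistics import mean
from scipy.stats import ttest_ind

def flag_variation(times_old: list, times_new: list,
                   alpha: float = 0.05, min_change: float = 0.10) -> dict:
    """Flag a scenario if its mean execution time changed both significantly
    (Welch's t-test) and by a practically relevant margin (default 10%)."""
    m_old, m_new = mean(times_old), mean(times_new)
    change = (m_new - m_old) / m_old
    _, p = ttest_ind(times_old, times_new, equal_var=False)
    return {
        "mean_old_ms": round(m_old, 1),
        "mean_new_ms": round(m_new, 1),
        "change": f"{change:+.1%}",
        "significant": p < alpha and abs(change) >= min_change,
    }

# Hypothetical execution-time samples (ms) of one scenario in two consecutive releases.
release_a = [101, 98, 103, 99, 100, 102, 97, 101]
release_b = [118, 121, 115, 119, 122, 117, 120, 116]
print(flag_variation(release_a, release_b))   # flagged: roughly +18% with p << 0.05
```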
