905 results for Couplings of chaotic dynamical systems
Abstract:
A methodology for formally modeling and analyzing the software architecture of mobile agent systems provides a solid basis for developing high-quality mobile agent systems, and the methodology is also helpful for studying other distributed and concurrent systems. Providing such a methodology is a challenge, however, because of agent mobility in mobile agent systems. The methodology was defined from the two essential parts of software architecture: a formalism to define architectural models and an analysis method to formally verify system properties. The formalism is two-layer Predicate/Transition (PrT) nets extended with dynamic channels, and the analysis method is a hierarchical approach that verifies models at different levels. The two-layer modeling formalism smoothly transforms physical models of mobile agent systems into their architectural models. Dynamic channels facilitate synchronous communication between nets, and they naturally capture the dynamic architectural configuration and agent mobility of mobile agent systems. Component properties are verified on transformed individual components, system properties are checked in a simplified system model, and interaction properties are analyzed on models composed from the nets involved. Based on the formalism and the analysis method, this researcher formally modeled and analyzed a software architecture for mobile agent systems and designed an architectural model of a medical information processing system based on mobile agents. The model checking tool SPIN was used to verify system properties such as reachability, concurrency, and safety of the medical information processing system. From the successful modeling and analysis of the software architecture of mobile agent systems, the conclusion is that PrT nets extended with channels are a powerful tool for modeling mobile agent systems, and the hierarchical analysis method provides a rigorous foundation for the modeling tool.
The hierarchical analysis method not only reduces the complexity of the analysis but also expands the application scope of model checking techniques. The results of formally modeling and analyzing the software architecture of the medical information processing system show that model checking is an effective and efficient way to verify software architecture. Moreover, this system demonstrates the high flexibility, efficiency, and low cost of mobile agent technologies.
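Verification of properties such as reachability ultimately rests on exhaustive exploration of the model's state space. The sketch below is not the author's two-layer PrT formalism or SPIN; it is a minimal, hypothetical illustration of checking reachability on a plain place/transition net by breadth-first search over markings:

```python
from collections import deque

def reachable(initial, transitions, target):
    """Breadth-first search over Petri-net markings.

    initial: tuple of token counts, one per place
    transitions: list of (consume, produce) per-place count tuples
    target: the marking whose reachability we want to decide
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        marking = queue.popleft()
        if marking == target:
            return True
        for consume, produce in transitions:
            # A transition is enabled if every input place holds enough tokens.
            if all(m >= c for m, c in zip(marking, consume)):
                nxt = tuple(m - c + p
                            for m, c, p in zip(marking, consume, produce))
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
    return False
```

For a two-place net with one transition moving a token from place 0 to place 1, `reachable((1, 0), [((1, 0), (0, 1))], (0, 1))` returns `True`; real model checkers add the temporal-logic machinery and state compression this sketch omits.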
Abstract:
To chronicle demographic movement across African-Asian corridors, a variety of molecular (sequence analysis, restriction mapping, denaturing high-performance liquid chromatography, etc.) and statistical (correspondence analysis, AMOVA, calculation of diversity indices, phylogenetic inference, etc.) techniques were employed to assess the phylogeographic patterns of mtDNA control region and Y-chromosomal variation among 14 sub-Saharan, North African, and Middle Eastern populations. The patterns of genetic diversity revealed evidence of multiple migrations across several African-Asian passageways as well as within the African continent itself. The two-part analysis uncovered several interesting results, which include the following: (1) a north (Egypt and the Middle East) to south (sub-Saharan Africa) partitioning of both mtDNA and Y-chromosomal haplogroup diversity; (2) a genetic diversity gradient in sub-Saharan Africa from east to west; (3) evidence in favor of the Levantine Corridor over the Horn of Africa as the major genetic conduit since the Last Glacial Maximum; (4) a substantially higher mtDNA versus Y-chromosomal sub-Saharan component in the Middle East collections; (5) a higher representation of East versus West African mtDNA haplotypes in the Arabian Peninsula populations versus no such bias in the Levant groups; and lastly, (6) genetic remnants of the Bantu demographic expansion in sub-Saharan Africa.
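The abstract does not name the specific diversity indices computed. As one hedged example of such a statistic, Nei's haplotype (gene) diversity can be calculated from the counts of each haplotype in a sample:

```python
def haplotype_diversity(counts):
    """Nei's haplotype (gene) diversity: h = n/(n-1) * (1 - sum(p_i^2)),
    where p_i is the frequency of haplotype i in a sample of n sequences."""
    n = sum(counts)
    if n < 2:
        raise ValueError("need at least two sequences")
    freq_sq = sum((c / n) ** 2 for c in counts)
    return n / (n - 1) * (1 - freq_sq)
```

For example, a sample of four sequences split evenly between two haplotypes, `haplotype_diversity([2, 2])`, gives h = 2/3, while a monomorphic sample gives 0.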
Abstract:
This dissertation develops a process improvement method for service operations based on the Theory of Constraints (TOC), a management philosophy that has been shown to be effective in manufacturing for decreasing WIP and improving throughput. While TOC has enjoyed much attention and success in the manufacturing arena, its application to services in general has been limited. The contribution to industry and knowledge is a method for improving global performance measures based on TOC principles. The method proposed in this dissertation is tested using discrete event simulation based on the scenario of the service factory of airline turnaround operations. To evaluate the method, a simulation model of aircraft turn operations of a U.S.-based carrier was built and validated using actual data from airline operations. The model was then adjusted to reflect an application of the Theory of Constraints for determining how to deploy the scarce resource of ramp workers. The results indicate that, given slight modifications to TOC terminology and the development of a method for constraint identification, the Theory of Constraints can be applied with success to services. Bottlenecks in services must be defined as those processes for which the process rates and the amount of work remaining are such that completing the process will not be possible without an increase in the process rate. The bottleneck ratio is used to determine to what degree a process is a constraint. Simulation results also suggest that redefining performance measures to reflect a global business perspective of reducing costs related to specific flights, versus the operational local-optimum approach of turning all aircraft quickly, results in significant savings to the company.
Simulated savings to the airline's annual operating costs equaled 30% of current expenses for misconnecting passengers, with only a modest increase in worker utilization achieved through a more efficient heuristic of deploying workers to the highest-priority tasks. This dissertation contributes to the literature on service operations by describing a dynamic, adaptive dispatch approach to managing service factory operations similar to airline turnaround operations using the management philosophy of the Theory of Constraints.
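The abstract defines a bottleneck as a process that cannot finish without a rate increase but does not give the bottleneck ratio's formula. A plausible sketch, assuming the ratio compares the processing rate required to finish on time against the rate available (so values above 1.0 mark a constraint), together with the dispatch idea of sending workers to the tightest constraint first:

```python
def bottleneck_ratio(work_remaining, process_rate, time_remaining):
    """Hypothetical bottleneck ratio: required rate over available rate.

    A value above 1.0 means the process cannot finish on time at its
    current rate -- i.e. it is a constraint in the TOC sense.
    """
    required_rate = work_remaining / time_remaining
    return required_rate / process_rate

def dispatch_order(processes):
    """Rank (name, work_remaining, process_rate, time_remaining) tuples so
    workers are deployed to the most constrained process first."""
    return sorted(processes, key=lambda p: bottleneck_ratio(*p[1:]),
                  reverse=True)
```

For example, with 10 units of catering work left, a rate of 1 unit/min, and 5 minutes to departure, the ratio is 2.0 and catering outranks a fueling task with ratio 0.6; the dissertation's actual identification method and units are assumptions here.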
Abstract:
Do restaurant managers commonly use performance appraisals and, if so, how frequently and for what purposes? The authors address these questions and review the restaurant industry in general.
Abstract:
The purpose of this paper is to explore the use of automated inventory management systems (IMS) and identify the stage of technology adoption for restaurants in Aruba. A case study analysis involving twelve members of the Aruba Gastronomic Association was conducted using a qualitative research design to gather information on approaches currently used, as well as the reasons and perceptions managers/owners have for using or not using automated systems in their facilities. This is the first study conducted using the Aruba restaurant market; therefore, two technology adoption models were applied to integrate the critical factors relevant to the study. Major findings indicated that the use of automated IMS in restaurants is limited, underscoring the lack of technology adoption in this area. The results also indicated two major reasons restaurants are not adopting IMS technology: budgetary constraints and service support. This study is important for two reasons: (1) its results can serve as a baseline for future IMS adoption studies, not only for Aruba's restaurant industry but also for other Caribbean destinations and the U.S.; and (2) it provides insight into the additional training and support needed in hospitality technology services.
Abstract:
One of the most popular techniques for creating spatialized virtual sounds is based on the use of Head-Related Transfer Functions (HRTFs). HRTFs are signal processing models that represent the modifications undergone by an acoustic signal as it travels from a sound source to each of the listener's eardrums. These modifications are due to the interaction of the acoustic waves with the listener's torso, shoulders, head, and pinnae (outer ears). As such, HRTFs are somewhat different for each listener. For a listener to perceive synthesized 3-D sound cues correctly, the synthesized cues must be similar to the listener's own HRTFs. One can measure individual HRTFs using specialized recording systems; however, these systems are prohibitively expensive and restrict the portability of the 3-D sound system. HRTF-based systems also face several computational challenges. This dissertation presents an alternative method for the synthesis of binaural spatialized sounds. The sound entering the pinna undergoes several reflective, diffractive, and resonant phenomena, which determine the HRTF. Using signal processing tools such as Prony's signal modeling method, an appropriate set of time delays and a resonant frequency were used to approximate the measured Head-Related Impulse Responses (HRIRs). Statistical analysis was used to derive empirical equations describing how the reflections and resonances are determined by the shape and size of the pinna features, obtained from 3-D images of the 15 experimental subjects modeled in the project. These equations were used to yield "Model HRTFs" that can create elevation effects. Listening tests conducted on 10 subjects show that these model HRTFs are 5% more effective than generic HRTFs when it comes to localizing sounds in the frontal plane.
The number of reversals (perception of the sound source above the horizontal plane when it is actually below, and vice versa) was also reduced by 5.7%, showing the perceptual effectiveness of this approach. The model is simple yet versatile because it relies on easy-to-measure parameters to create an individualized HRTF. This low-order parameterized model also reduces computational and storage demands while maintaining a sufficient number of perceptually relevant spectral cues.
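The structural model described above combines a direct impulse, a few pinna reflections, and a resonance. The toy synthesis below is a hypothetical sketch of that idea with made-up parameter values, not the dissertation's fitted Prony model:

```python
import math

def model_hrir(delays, gains, res_freq, decay, fs=44100, length=256):
    """Toy HRIR: a direct-path impulse, a few scaled pinna reflections,
    and one damped resonance -- the ingredients of the structural model
    above. All numeric values are illustrative, not measured."""
    h = [0.0] * length
    h[0] = 1.0                       # direct path
    for d, g in zip(delays, gains):  # reflections: delayed, scaled impulses
        h[d] += g
    for n in range(length):          # resonance: exponentially damped sinusoid
        h[n] += 0.3 * math.exp(-decay * n / fs) \
                    * math.sin(2 * math.pi * res_freq * n / fs)
    return h

# Hypothetical parameters: two reflections and a ~4 kHz pinna resonance.
hrir = model_hrir(delays=[20, 35], gains=[0.5, 0.25],
                  res_freq=4000.0, decay=3000.0)
```

Convolving a dry sound with left- and right-ear impulse responses of this form is what produces the binaural cues; the individualization step would set the delays and resonance from measured pinna dimensions.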
Abstract:
The main goal of this dissertation was to study two- and three-nucleon Short Range Correlations (SRCs) in the high-energy three-body breakup of the 3He nucleus in the 3He(e, e'NN)N reaction. SRCs are characterized by quantum fluctuations in nuclei during which constituent nucleons partially overlap with each other. A theoretical framework is developed within the Generalized Eikonal Approximation (GEA), which upgrades existing medium-energy methods that are inapplicable for high momentum and energy transfer reactions. High momentum and energy transfer is required to provide sufficient resolution for probing SRCs. GEA is a covariant theory formulated through effective Feynman diagrammatic rules. It allows self-consistent calculation of the single and double re-scattering amplitudes present in three-body breakup processes. The calculations were carried out in detail, and the analytical result for the differential cross section of the 3He(e, e'NN)N reaction was derived in a form applicable for programming and numerical calculations. The corresponding computer code has been developed, and the results of computation were compared to the published experimental data, showing satisfactory agreement for a wide range of values of missing momenta. In addition to the high-energy approximation, this study exploited the exclusive nature of the process under investigation to gain more information about the SRCs. The description of the exclusive 3He(e, e'NN)N reaction was done using the formalism of the nuclear decay function, a practically unexplored quantity related to the conventional spectral function through integration over the phase space of the recoil nucleons. Detailed investigation showed that the decay function clearly exhibits the main features of two- and three-nucleon correlations.
Four highly practical types of SRCs in the 3He nucleus were discussed in great detail for different orders of the final-state re-interactions, using the decay function as a unique identifying tool. The overall conclusion of this dissertation suggests that investigation of the decay function opens up a completely new avenue in studies of short-range nuclear properties.
Abstract:
Trenchless methods have been considered a viable solution for pipeline projects in urban areas. Their applicability in pipeline projects is expected to increase with rapid advancements in technology and emerging concerns regarding the social costs of trenching methods. Selecting an appropriate project delivery system (PDS) is key to the success of trenchless projects. To ensure project success, the selected project delivery system should be tailored to the specific characteristics of the trenchless project and the owner's needs, since the effectiveness of project delivery systems differs with project characteristics and owner requirements. Because different trenchless methods have specific characteristics, such as rate of installation, length of installation, and accuracy, the same project delivery systems may not be equally effective for different methods. The intent of this paper is to evaluate the appropriateness of different PDSs for different trenchless methods. PDSs are examined through a structured decision-making process called the Fuzzy Delivery System Selection Model (FDSSM). The impacts of (a) the characteristics of trenchless projects and (b) owners' needs are incorporated into the FDSSM by collecting data through questionnaires deployed to professionals in the trenchless industry, in order to determine the importance of delivery system selection attributes for different trenchless methods, and then analyzing these data. The sensitivity of PDS rankings with respect to trenchless methods is considered in order to evaluate whether similar project delivery systems are equally effective for different trenchless methods. The effectiveness of a PDS with respect to attributes is defined as follows: a project delivery system is most effective with respect to an attribute (e.g., ability to control growth in costs) if no project delivery system is more effective than it.
The results of this study may assist trenchless project owners in selecting the appropriate PDS for the trenchless method selected.
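The FDSSM's fuzzy aggregation is not detailed in the abstract. As a hedged stand-in for it, a plain weighted-average scoring of delivery systems over questionnaire-derived attribute weights illustrates the ranking step (all names and numbers below are hypothetical):

```python
def rank_delivery_systems(scores, weights):
    """Rank PDS alternatives by a weighted average of attribute scores.

    scores:  {pds_name: {attribute: score in [0, 1]}}
    weights: {attribute: importance weight from questionnaire data}
    Returns a list of (pds_name, aggregate_score), best first.
    """
    total_w = sum(weights.values())
    ranked = {
        pds: sum(attr_scores[a] * weights[a] for a in weights) / total_w
        for pds, attr_scores in scores.items()
    }
    return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical scores for one trenchless method.
ranking = rank_delivery_systems(
    {"design-build":     {"cost_control": 0.8, "schedule": 0.9, "risk": 0.6},
     "design-bid-build": {"cost_control": 0.6, "schedule": 0.5, "risk": 0.8}},
    {"cost_control": 0.5, "schedule": 0.3, "risk": 0.2})
```

A genuine fuzzy model would replace the crisp scores with membership functions and a defuzzification step; the sensitivity analysis in the paper would then re-run this ranking per trenchless method.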
Abstract:
The purpose of this research was to develop a theory of high-energy exclusive electrodisintegration of three-nucleon systems, using the example of the 3He(e, e'NN)N reaction with a knocked-out nucleon in the final state. The scattering amplitudes and differential cross section of the reaction were calculated in detail within the Generalized Eikonal Approximation (GEA). The manifestly covariant nature of the Feynman diagrams derived in GEA allowed us to preserve both the relativistic dynamics and kinematics of the scattering while identifying the low-momentum nuclear part of the amplitude with a nonrelativistic nuclear wave function. Numerical calculations of the residual system's total and relative momentum distributions were performed and show reasonable agreement with available experimental data. The theoretical framework of GEA, previously applied only to two-body (deuteron) high-energy breakup reactions, has been practically implemented and shown to provide a valid description for more complex A = 3 systems.
Abstract:
The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for preventing quality deterioration during their evolution. This thesis proposes an automated approach for analyzing variation in the performance quality attribute, measured as execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation: choosing the scenarios and preparing the target releases; (ii) dynamic analysis: determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis: processing and comparing the dynamic analysis results for different releases; and (iv) repository mining: identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty.
In this study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. In total, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
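The variation-analysis phase compares scenario execution times across releases. A minimal sketch, assuming a simple relative-change threshold in place of the thesis's statistical treatment:

```python
def performance_variation(times_old, times_new, threshold=0.10):
    """Classify a scenario by the relative change in mean execution time
    between two releases. The 10% threshold is an assumed stand-in for a
    proper statistical significance test."""
    mean_old = sum(times_old) / len(times_old)
    mean_new = sum(times_new) / len(times_new)
    change = (mean_new - mean_old) / mean_old
    if change > threshold:
        return "degraded", change
    if change < -threshold:
        return "optimized", change
    return "stable", change
```

A scenario whose mean time goes from 100 ms to 130 ms across releases would be flagged as `("degraded", 0.30)`; the repository-mining phase would then look up the commits and issues landed between those releases.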
Abstract:
The chapter discusses both the complementary factors and the contradictions of adopting ERP-based systems together with Enterprise 2.0. ERP is characterized as achieving efficient business performance by enabling a standardized business process design, but at a cost of flexibility in operations. It is claimed that Enterprise 2.0 can support flexible business process management and thereby incorporate informal and less structured interactions. A traditional view, however, is that efficiency and flexibility objectives are incompatible, as they are different business objectives pursued separately in different organizational environments. Thus an ERP system with the primary objective of improving efficiency and an Enterprise 2.0 system with the primary aim of improving flexibility may represent a contradiction and lead to a high risk of failure if adopted simultaneously. This chapter uses case study analysis to investigate the combined use of ERP and Enterprise 2.0 in a single enterprise aiming to improve both efficiency and flexibility in operations. The chapter provides an in-depth analysis of the combination of ERP with Enterprise 2.0 based on socio-technical information systems management theory. It also summarizes the benefits of combining ERP systems and Enterprise 2.0 and how they could contribute to the development of a new generation of business management that combines both formal and informal mechanisms. For example, the multiple sites or informal communities of an enterprise could collaborate efficiently on a common platform with a certain level of standardization, while retaining the flexibility to react with agility to internal and external events.
Abstract:
Direct secretion systems, which deliver molecules from one cell to another, have huge significance in shaping bacterial communities and in determining the outcome of bacterial associations with eukaryotic organisms. This work examines the roles of the Type III Secretion System (T3SS) and the Type VI Secretion System (T6SS) of Pseudomonas, a widespread genus including clinical pathogens and biocontrol strains. Bioinformatic analysis of T6SS phylogeny and associated gene content within Pseudomonas identified several T6SS phylogenetic groups, and linked the T6SS components VgrG and Hcp encoded outside of T6SS gene loci with their cognate T6SS phylogenetic groups. Remarkably, such "orphan" vgrG and hcp genes were found to occur in diverse, horizontally transferred operons often containing putative T6SS accessory components and effectors. The prevalence of a widespread superfamily of T6SS lipase effectors (Tle) was assessed in metagenomes from various environments. The abundance of the Tle superfamily and of individual families varied between niches, suggesting niche-specific selection and specialisation of Tle. Experimental work also discovered that P. fluorescens F113 uses the SPI-1 T3SS to avoid amoeboid grazing in mixed populations. This finding may represent a significant aspect of F113 rhizocompetence, and of the rhizocompetence of other rhizobacteria.