25 results for "Graph-based method"


Relevance: 80.00%

Abstract:

Percarboxylic acids are commonly used as disinfection and bleaching agents in the textile, paper, and fine chemical industries. All of these applications are based on the oxidative potential of these compounds. Despite the high interest in these chemicals, they are unstable and explosive, which increases the risks of both synthesis and transportation. Safety criteria for the production process should therefore be considered. Microreactors represent a technology that efficiently exploits the safety advantages of small scale, so microreactor technology was used in the synthesis of peracetic acid and performic acid. These percarboxylic acids were produced at different temperatures, residence times, and catalyst (sulfuric acid) concentrations. Both synthesis reactions appeared to be rather fast: with performic acid, equilibrium was reached in 4 min at 313 K, and with peracetic acid in 10 min at 343 K. In addition, the experimental results were used to study the kinetics of the formation of performic acid and peracetic acid. The advantages of the microreactors in this study were efficient temperature control, even in very exothermic reactions, and good mixing due to short diffusion distances; reaction rates could therefore be determined with high accuracy. Three different models were considered for estimating kinetic parameters such as reaction rate constants and activation energies. Of these three models, the laminar flow model with a radial velocity distribution gave the most precise parameters. However, sulfuric acid creates many drawbacks in this synthesis process, so a "greener" route, using a heterogeneous catalyst for the synthesis of performic acid in a microreactor, was studied. The cation exchange resin Dowex 50 Wx8 showed very high activity and a long lifetime in this reaction. In the presence of this catalyst, equilibrium was reached in 120 seconds at 313 K, which indicates a rather fast reaction.
In addition, the safety advantages of microreactors were investigated in this study. Four different conventional methods were used. Production of peracetic acid served as a test case, and the safety of a conventional batch process was compared with an on-site continuous microprocess. It was found that conventional methods for analyzing process safety may not be reliable or adequate for radically novel technology such as microreactors. This is understandable, because conventional methods are partly based on experience, which is very limited with totally novel technology. Therefore, a checklist-based method was developed to study the safety of intensified and novel processes at an early stage of process development. The checklist was formulated using the concept of layers of protection for a chemical process. The traditional process and three intensified processes for hydrogen peroxide synthesis were selected as test cases. With these real cases, it was shown that process intensification can have several positive and negative effects on safety, and the general claim that process intensification always improves safety was questioned.
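The estimation of rate constants and activation energies mentioned above rests on the Arrhenius law, k = A·exp(-Ea/RT). As a minimal sketch (the rate constants and temperatures below are hypothetical illustrations, not the thesis' measured values), a two-point estimate of the activation energy can be computed directly:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius_params(T1, k1, T2, k2):
    """Two-point Arrhenius estimate: activation energy Ea (J/mol) and
    pre-exponential factor A from rate constants k1, k2 measured at
    temperatures T1, T2 (kelvin)."""
    Ea = R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)
    A = k1 * math.exp(Ea / (R * T1))
    return Ea, A

# Hypothetical rate constants at the two experimental temperatures
Ea, A = arrhenius_params(313.0, 0.012, 343.0, 0.055)
```

In practice the thesis fits many measurements at once (three competing reactor models were compared), but the two-point form shows how a rising rate constant with temperature translates into a positive activation energy.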

Relevance: 80.00%

Abstract:

Cyber security is one of the main topics discussed around the world today. The threat is real, and it is unlikely to diminish. People, businesses, governments, and even armed forces are networked in one way or another; thus, the cyber threat also faces military networking. At the same time, the concept of Network Centric Warfare sets high requirements for military tactical data communications and security. A challenging networking environment and cyber threats force us to consider new approaches to building security into military communication systems. The purpose of this thesis is to develop a cyber security architecture for military networks and to evaluate the designed architecture. The architecture is described in terms of technical functionality. As a new approach, the thesis introduces Cognitive Networks (CN), a theoretical concept for building more intelligent, dynamic, and more secure communication networks. A cognitive network can observe the networking environment, make decisions for optimal performance, and adapt its system parameters according to those decisions. As a result, the thesis presents a five-layer cyber security architecture that consists of security elements controlled by a cognitive process. The proposed architecture comprises infrastructure, services, and application layers that are managed and controlled by the cognitive and management layers. The architecture defines the tasks of the security elements at a functional level without introducing any new protocols or algorithms. Two separate methods were used for evaluation. The first is based on the SABSA framework, which uses a layered approach to analyze the overall security of an organization. The second is a scenario-based method in which a risk severity level is calculated. The evaluation results show that the proposed architecture fulfills the security requirements, at least at a high level.
However, the evaluation of the proposed architecture proved very challenging, so the evaluation results must be viewed critically. The thesis shows that cognitive networks are a promising approach, offering many benefits when designing a cyber security architecture for tactical military networks. However, many implementation problems exist, and several details must be considered and studied in future work.
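The scenario-based evaluation mentioned above calculates a risk severity level per scenario. A minimal sketch, assuming the common "severity = likelihood x impact" formulation (the thesis' exact scales and scenarios are not given here, so the ones below are hypothetical):

```python
# Hypothetical threat scenarios: (likelihood 1-5, impact 1-5)
SCENARIOS = {
    "jamming of tactical radio link": (4, 3),
    "malware in command-and-control software": (2, 5),
    "insider data exfiltration": (1, 4),
}

def severity(likelihood, impact):
    """Risk severity on a 1-25 scale: likelihood times impact."""
    return likelihood * impact

# Rank scenarios from most to least severe to prioritize mitigations
ranked = sorted(SCENARIOS.items(), key=lambda kv: -severity(*kv[1]))
```

A ranking like this lets the architecture's security elements be checked against the highest-severity scenarios first.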

Relevance: 80.00%

Abstract:

In today's logistics environment, there is a tremendous need for accurate cost information and cost allocation. Companies searching for a proper solution often come across activity-based costing (ABC) or one of its variations, which utilize cost drivers to allocate the costs of activities to cost objects. The selection of appropriate cost drivers is essential for allocating costs accurately and reliably and for realizing the benefits of the costing system. The purpose of this study is to validate the transportation cost drivers of a Finnish wholesaler and ultimately to select the best possible driver alternatives for the company. The use of cost driver combinations as an alternative is also studied. The study is conducted as part of the case company's applied ABC project, using statistical analysis as the main research method, supported by a theoretical, literature-based review. The main research tools are simple and multiple regression analyses, which, together with a practicality analysis based on the literature and observations, form the basis for the driver selection. The results suggest that the most appropriate cost driver alternatives are delivery drops and internal delivery weight. Cost driver combinations are not recommended, as their use does not provide substantially better results while increasing measurement costs, complexity, and effort of use. The use of internal freight cost drivers is also questionable, as the results indicate a weakening trend in their cost allocation capabilities toward the end of the period; more research on internal freight cost drivers should therefore be conducted before taking them into use.
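The simple regression analysis used to validate a single cost driver can be sketched as follows. This is a generic ordinary-least-squares fit with R-squared as the validity indicator; the monthly figures are hypothetical, not the case company's data:

```python
def fit_driver(driver_qty, cost):
    """OLS regression of transportation cost on one cost driver.
    Returns slope, intercept, and R^2 (how much of the cost variation
    the driver explains)."""
    n = len(cost)
    mx = sum(driver_qty) / n
    my = sum(cost) / n
    sxx = sum((x - mx) ** 2 for x in driver_qty)
    sxy = sum((x - mx) * (y - my) for x, y in zip(driver_qty, cost))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2
                 for x, y in zip(driver_qty, cost))
    ss_tot = sum((y - my) ** 2 for y in cost)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical monthly observations: delivery drops vs. transport cost
drops = [120, 150, 90, 200, 170]
costs = [2400, 3050, 1800, 4020, 3400]
slope, intercept, r2 = fit_driver(drops, costs)
```

A driver with a high R-squared, as delivery drops have in this toy data, allocates costs in proportion to what actually causes them; the study extends this with multiple regression for driver combinations.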

Relevance: 80.00%

Abstract:

Escherichia coli K-12 (pEGFPluxABCDEAmp) (E. coli-lux), constitutively emitting bioluminescence (BL), was constructed, and its BL emission properties were tested under different growth and killing conditions. The BL emission correlated directly with the number of viable E. coli-lux cells, and when the cells were subjected to an antimicrobial agent, the decrease in the BL signal was linked directly to the number of killed bacterial cells. The method proved very convenient, especially compared with conventional plate counting assays. This novel real-time method was utilized in both immunological and toxicological assessments. Parameters of the serum complement system, such as the activation phase, the lytic phase, and the killing capacity, were characterized not only in humans but also in other species. E. coli-lux was also successfully used to study the antimicrobial activities of insect haemolymph. The mechanisms of neutrophil activity, such as the myeloperoxidase (MPO)-H2O2-halide system, were studied using the E. coli-lux approach. The fundamental role of MPO was challenged, since under the described circumstances the MPO system in the phagolysosome was inactivated and chlorination halted during the actual killing. A toxicological test system assessing total indoor air toxicity, particularly suitable for suspected mold damage, was designed based on the E. coli-lux method. Susceptibility to a large number of toxins, including pure chemicals, dust samples from buildings, and extracts from molds, was investigated. The E. coli-lux application was found to possess high sensitivity and specificity. Alongside the analysis system, a sampling kit for indoor dust was engineered, based on a swipe stick and a container. The combination of a practical specimen collector and a convenient analysis system provided accurate toxicity data from a dust sample within hours.
Neutrophils are good indicators of the pathophysiological state of an individual, and they can be utilized as a toxicological probe owing to their ability to emit chemiluminescence (CL). Neutrophils can either be used as probe cells, directly exposed to the agent studied, or act as indicators of a whole biological system exposed to the agent. Human neutrophils were exposed to the same toxins as tested with the E. coli-lux system, and the response was measured as luminol-amplified CL emission. The influence of toxins on whole organisms was investigated by exposing rats to moniliformin, a mycotoxin commonly present in Finnish grains. The activity of the rat neutrophils was found to decrease significantly during the 28 days of exposure.
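Because the BL signal correlates directly with the number of viable cells, a killing assay reduces to comparing light readings. A minimal sketch, assuming that proportionality holds (the relative-light-unit readings below are hypothetical):

```python
import math

def log_reduction(bl_treated, bl_control):
    """Log10 reduction in viable cells inferred from bioluminescence,
    assuming the direct BL-to-viable-count correlation reported for
    E. coli-lux. Inputs are relative light units (RLU)."""
    return math.log10(bl_control / bl_treated)

# A signal dropping from 10000 to 10 RLU corresponds to a 3-log kill
r = log_reduction(10.0, 10000.0)
```

The real-time nature of the readout is what makes this faster than plate counting: the same calculation can be repeated at each time point to follow the killing kinetics.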

Relevance: 80.00%

Abstract:

The advancement of science and technology makes it clear that no single perspective is sufficient any longer to describe the true nature of a phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research in natural computing is twofold: on one hand, it sheds light on how nature works and how it processes information; on the other hand, it provides guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second studies reaction systems, a modeling framework whose rationale is built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic over the past 15 years. Two main modeling frameworks were initially proposed at the end of the 1990s to capture the gene assembly process in ciliates, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathway recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs.
We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modeling framework studied in this thesis. Their rationale is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems complement traditional quantitative frameworks by focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than on the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model of the heat shock response mechanism based on a novel concept, the dominance graph, which captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady states, and periodicity, to support model checking of RS-based models. We prove that the complexity of the decision problems related to these properties ranges from P through NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS, introduce the conservation dependency graph to capture the relation between species, and propose an algorithm for listing the conserved sets of a given reaction system.
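The facilitation/inhibition semantics of reaction systems can be sketched in a few lines. A reaction is a triple (reactants, inhibitors, products): it is enabled on a state when all its reactants are present and none of its inhibitors are, and the next state is the union of the products of all enabled reactions. The toy reactions below are hypothetical, not taken from the heat shock response model:

```python
def result(reactions, state):
    """One step of a reaction system. Note there is no permanency:
    entities not produced by any enabled reaction simply vanish."""
    nxt = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            nxt |= products
    return nxt

# Hypothetical system: "b" inhibits the first reaction;
# the second reaction sustains "c" and regenerates "a".
rs = [
    ({"a"}, {"b"}, {"c"}),
    ({"c"}, set(), {"a", "c"}),
]
```

Iterating `result` from an initial state yields the qualitative dynamics on which properties such as steady states and periodicity, and their decision problems, are defined.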

Relevance: 80.00%

Abstract:

Pairs trading is an algorithmic trading strategy based on the historical co-movement of two separate assets, with trades executed on the basis of the degree of relative mispricing. The purpose of this study is to explore a new, alternative copula-based method for pairs trading. The objective is to find out whether the copula method generates more trading opportunities and higher profits than the more traditional distance and cointegration methods applied extensively in previous empirical studies. The methods are compared by selecting the top five pairs from the stocks of large and medium-sized companies in the Finnish stock market. The research period covers the years 2006-2015. All the methods prove to be profitable, and the Finnish stock market is suitable for pairs trading. However, the copula method does not generate more trading opportunities or higher profits than the other methods. It seems that the limitations of the more traditional methods are not too restrictive for this particular sample data.
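The traditional distance method used as a benchmark above can be sketched briefly: normalize both price series, track their spread, and flag a trade when the spread strays too far from its mean. This is a generic illustration with made-up prices; in practice the mean and standard deviation are estimated on a separate formation period, while here they are taken in-sample for brevity:

```python
import statistics

def distance_signals(p1, p2, k=2.0):
    """Distance-method trade flags for one pair: True where the spread
    of the normalized price series deviates more than k standard
    deviations from its mean (a relative mispricing)."""
    n1 = [p / p1[0] for p in p1]  # normalize both series to start at 1
    n2 = [p / p2[0] for p in p2]
    spread = [a - b for a, b in zip(n1, n2)]
    mu = statistics.mean(spread)
    sd = statistics.stdev(spread)
    return [abs(s - mu) > k * sd for s in spread]

# Hypothetical prices: the pair moves together until the last tick
signals = distance_signals([100, 101, 102, 103, 120],
                           [50, 50.5, 51, 51.5, 52], k=1.5)
```

The copula method studied in the thesis replaces this fixed-threshold spread rule with conditional mispricing probabilities derived from the pair's joint distribution.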

Relevance: 80.00%

Abstract:

The Baltic Sea is a unique environment that contains unique genetic populations. Studying these populations at the genetic level requires basic molecular research. The aim of this thesis was to provide a basic genetic resource for population genomic studies by de novo assembling a transcriptome for the Baltic Sea isopod Idotea balthica. RNA was extracted from a single whole adult male isopod and sequenced using Illumina (125 bp paired-end) RNA-Seq. The reads were preprocessed using FASTQC for quality control, TRIMMOMATIC for trimming, and RCORRECTOR for error correction. The preprocessed reads were then assembled with TRINITY, a de Bruijn graph-based assembler, using different k-mer sizes. The resulting assemblies were combined and clustered using CD-HIT. The assemblies were evaluated using TRANSRATE for quality and filtering, BUSCO for completeness, and TRANSDECODER for annotation potential. The 25-mer assembly was annotated using PANNZER (protein annotation with z-score) and BLASTX. The 25-mer assembly represents the best first-draft assembly, since it contains the most information. However, it shows high levels of polymorphism, which currently cannot be differentiated into paralogs and allelic variants. Furthermore, the assembly is incomplete, which could be improved by sampling additional developmental stages.
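The de Bruijn graph structure underlying TRINITY can be illustrated with a toy construction: nodes are (k-1)-mers, and each k-mer observed in a read contributes an edge from its prefix to its suffix. This is a didactic sketch, not TRINITY's actual implementation:

```python
from collections import defaultdict

def de_bruijn_graph(reads, k):
    """Toy de Bruijn graph from reads: an edge prefix -> suffix for
    every k-mer. Assemblers recover transcripts as paths through
    such a graph, which is why the choice of k matters."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return graph

g = de_bruijn_graph(["ACGTT"], 3)  # 3-mers: ACG, CGT, GTT
```

Smaller k tolerates lower coverage but tangles repeats; larger k resolves repeats but fragments the graph, which is why the thesis assembled with several k-mer sizes and merged the results.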

Relevance: 40.00%

Abstract:

This paper analyzes the possibilities of integrating cost information and engineering design. Special emphasis is placed on the potential of the activity-based costing (ABC) method for formulating cost information for the needs of design engineers. The paper suggests that ABC is more useful than traditional job order costing, but a drawback is that ABC models easily become too complicated, i.e., expensive to build and maintain, and difficult to use. For engineering design, the most suitable elements of ABC are recognizing the activities of the company, constructing activity chains, identifying resources and activity and cost drivers, as well as calculating accurate product costs. ABC systems that include numerous cost drivers can become complex; therefore, a comprehensive ABC-based cost information system for design engineers should be considered critically. Combining the suitable ideas of ABC with engineering-oriented thinking could give competent results.
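The two-stage allocation at the heart of ABC can be shown in a few lines: each activity pool's cost is divided by its total driver quantity to get a rate, and a product's cost is the sum of rate times consumption over the activities it uses. The activities, drivers, and figures below are hypothetical, for illustration only:

```python
# Hypothetical activity pools and their total driver quantities
activity_cost = {"order handling": 5000.0, "machining": 12000.0}
driver_total = {"order handling": 100, "machining": 400}

def activity_rate(activity):
    """Cost per driver unit for one activity pool (stage one)."""
    return activity_cost[activity] / driver_total[activity]

def product_cost(consumption):
    """Allocate pool costs to a product by the driver quantities it
    consumes (stage two)."""
    return sum(activity_rate(a) * q for a, q in consumption.items())

# A product consuming 10 orders and 50 machine hours
cost = product_cost({"order handling": 10, "machining": 50})
```

Every additional driver adds a rate to maintain and a consumption figure to measure per product, which is exactly how models like this grow too complicated for day-to-day use by design engineers.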