953 results for Models and Principles
Abstract:
Abstract—This paper discusses existing military capability models and proposes a comprehensive capability meta-model (CCMM) which unites the existing capability models into an integrated and hierarchical whole. The Zachman Framework for Enterprise Architecture is used as a structure for the CCMM. The CCMM takes into account the abstraction level, the primary area of application, stakeholders, intrinsic process, and life cycle considerations of each existing capability model, and shows how the models relate to each other. The validity of the CCMM was verified through a survey of subject matter experts. The results suggest that the CCMM is of practical value to various capability stakeholders in many ways, such as helping to improve communication between the different capability communities.
Abstract:
Choosing the right supplier is crucial for long-term business prospects and profitability. Thus, organizational buyers are naturally very interested in how they can select the right supplier for their needs. Likewise, suppliers are interested in knowing how their customers make purchasing decisions in order to sell and market to them effectively. From the point of view of the textile and clothing (T&C) industry, regulatory changes and increasing low-cost and globalization pressures have driven the rise of the low-cost production locations India and China as the world's largest T&C producers. This thesis examines T&C trade between Finland and India, specifically in the context of non-industrial T&C products. Its main research problem asks: what perceptions do Finnish T&C industry buyers hold of India and Indian suppliers? B2B buyers use various supplier selection models and criteria in making their purchase decisions. A significant amount of research has been done into supplier selection practices, and in the context of international trade, country of origin (COO) perceptions in particular have garnered much attention. This thesis uses a mixed methods approach (online questionnaire and in-depth interviews) to evaluate Finnish T&C buyers' supplier selection criteria, COO perceptions of India and experiences of Indian suppliers. It was found that the most important supplier selection criteria used by Finnish T&C buyers are quality, reliability and cost. COO perceptions were not found to be influential in the purchasing process. Indian T&C suppliers' strengths were found to be low cost, flexibility and a history of traditional T&C expertise. Their weaknesses include product quality and unreliable delivery times. Overall, the main challenges to be overcome by Indian T&C companies are logistical difficulties and the cost vs. quality trade-off.
Despite positive perceptions of India on cost, the overall value offered by Indian T&C products was perceived to be low due to poor quality. Experiences of unreliable delivery times also affected buyers' reliability perceptions of Indian suppliers. The main limiting factor of this thesis is the small sample size used in the research, which limits the generalizability of the results and the ability to evaluate the reliability and validity of some of the research instruments.
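As an illustration of the weighted-criteria supplier selection models the abstract refers to, the following is a minimal sketch only; the criteria weights and supplier scores are hypothetical, not data from the thesis.

```python
# Hypothetical weighted-score supplier selection model; the weights and
# scores below are illustrative, not the thesis's survey data.
def supplier_score(scores, weights):
    """Weighted average of criterion scores (0-10 scale)."""
    assert set(scores) == set(weights)
    total_weight = sum(weights.values())
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Quality, reliability and cost were found to be the top criteria.
weights = {"quality": 0.4, "reliability": 0.35, "cost": 0.25}
supplier_a = {"quality": 6, "reliability": 5, "cost": 9}  # low cost, weaker quality
supplier_b = {"quality": 8, "reliability": 8, "cost": 6}

print(supplier_score(supplier_a, weights))
print(supplier_score(supplier_b, weights))
```

Such a model makes the cost vs. quality trade-off explicit: a supplier strong only on cost can still rank below one that scores well on quality and reliability.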
Abstract:
This Master's thesis was conducted for a publicly owned limited company that provides information and communication technology and medical technology (ICMT) services related to public healthcare and municipal operations. The aim of the work was to build both traditional cost accounting and activity-based costing models for the company and, by comparing them, to find the most suitable cost accounting solution for the ICT sector. This objective was supported by an interview study of Finnish ICT companies concerning the cost accounting and pricing methods they use. The theoretical part presents cost accounting and pricing methods and reviews their principles and applicability to the ICT sector. The empirical part describes the construction of the traditional cost accounting model and the progress of the activity-based costing project in the target company. The interview study of the ICT sector maps the cost accounting methods and practices, as well as the pricing principles, used by leading Finnish ICT companies. The traditional cost accounting method proved to be the most suitable for the target company, and the results of the interview study supported this choice. Most of the interviewed companies used a cost-based accounting system. The most common accounting method among Finnish ICT companies was contribution margin accounting. Pricing was likewise primarily based on product cost calculation.
Abstract:
State-of-the-art predictions of atmospheric states rely on large-scale numerical models of chaotic systems. This dissertation studies numerical methods for state and parameter estimation in such systems. The motivation comes from weather and climate models, and a methodological perspective is adopted. The dissertation comprises three sections: state estimation, parameter estimation and chemical data assimilation with real atmospheric satellite data. In the state estimation part, a new filtering technique, based on a combination of ensemble and variational Kalman filtering approaches, is presented, evaluated experimentally and discussed. This new filter is developed for large-scale Kalman filtering applications. In the parameter estimation part, three different techniques for parameter estimation in chaotic systems are considered. The methods are studied using the parameterized Lorenz 95 system, which is a benchmark model for data assimilation. In addition, a dilemma related to the uniqueness of weather and climate model closure parameters is discussed. In the data-oriented part, data from the Global Ozone Monitoring by Occultation of Stars (GOMOS) satellite instrument are considered and an alternative algorithm to retrieve atmospheric parameters from the measurements is presented. The validation study presents the first global comparisons between two unique satellite-borne datasets of vertical profiles of nitrogen trioxide (NO3), retrieved using the GOMOS and Stratospheric Aerosol and Gas Experiment III (SAGE III) satellite instruments. The GOMOS NO3 observations are also used in a chemical state estimation study to retrieve stratospheric temperature profiles. The main result of this dissertation is the use of likelihood calculations based on Kalman filtering outputs. The concept has previously been used together with stochastic differential equations and in time series analysis.
In this work, the concept is applied to chaotic dynamical systems and used together with Markov chain Monte Carlo (MCMC) methods for statistical analysis. In particular, this methodology is advocated for use in numerical weather prediction (NWP) and climate model applications. In addition, the concept is shown to be useful in estimating the filter-specific parameters related, e.g., to model error covariance matrix parameters.
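A minimal sketch of the core idea (not the dissertation's actual implementation): a Kalman filter whose innovation sequence yields the log-likelihood of a model parameter, which in turn drives a simple Metropolis (MCMC) sampler. The scalar state-space model, noise levels and proposal width below are illustrative assumptions.

```python
import numpy as np

def kf_loglik(y, a, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """Log-likelihood of observations y under the scalar state-space model
    x_k = a*x_{k-1} + N(0,q),  y_k = x_k + N(0,r),
    accumulated from the Kalman filter's innovation sequence."""
    m, p, ll = m0, p0, 0.0
    for yk in y:
        m, p = a * m, a * a * p + q          # prediction step
        s = p + r                            # innovation variance
        v = yk - m                           # innovation
        ll += -0.5 * (np.log(2 * np.pi * s) + v * v / s)
        k = p / s                            # Kalman gain
        m, p = m + k * v, (1 - k) * p        # update step
    return ll

# Synthetic data from the model with true parameter a = 0.8
rng = np.random.default_rng(0)
truth, x, y = 0.8, 0.0, []
for _ in range(200):
    x = truth * x + rng.normal(0, np.sqrt(0.1))
    y.append(x + rng.normal(0, np.sqrt(0.5)))

# Random-walk Metropolis over the parameter, using the filter likelihood
a, ll, chain = 0.5, kf_loglik(y, 0.5), []
for _ in range(500):
    a_new = a + rng.normal(0, 0.05)
    ll_new = kf_loglik(y, a_new)
    if np.log(rng.random()) < ll_new - ll:
        a, ll = a_new, ll_new
    chain.append(a)
print(np.mean(chain[100:]))  # posterior mean estimate of a
```

The same pattern scales conceptually to NWP and climate models, where the filter is an ensemble or variational Kalman filter and the parameters are, e.g., closure or model error covariance parameters.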
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over a network using various Internet protocols. Web services are increasingly used in industry to automate tasks and to offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving.
We have used UML class diagrams and UML state machine diagrams with additional design constraints to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which supports requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of the behavioral REST interfaces. To overcome inconsistency problems and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the web ontology language OWL 2, so they can be part of the semantic web. These interfaces are checked with OWL 2 reasoners for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners. The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose.
Timed automata are generated from the UML-based service design models with our transformation tool and verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of the methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
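A code skeleton with method contracts of the kind described might look roughly like the following sketch; the resource, its states and the conditions are illustrative, not the thesis's actual generated output.

```python
class BookingResource:
    """Sketch of a stateful REST resource with pre-/post-condition checks,
    in the spirit of the generated skeletons (names and states illustrative)."""

    def __init__(self):
        self.state = "created"

    def post_confirm(self):
        # Precondition: only a newly created booking can be confirmed
        assert self.state == "created", "precondition violated"
        self.state = "confirmed"
        # Postcondition: the booking must now be confirmed
        assert self.state == "confirmed", "postcondition violated"

    def put_payment(self, amount):
        # Precondition: a payment may only be made on a confirmed booking
        assert self.state == "confirmed", "precondition violated"
        # --- developer fills in the actual payment handling here ---
        self.state = "paid"
        # Postcondition: the booking must now be in the 'paid' state
        assert self.state == "paid", "postcondition violated"

b = BookingResource()
b.post_confirm()
b.put_payment(100)
print(b.state)  # prints "paid"
```

The preconditions reject requests issued out of sequence (e.g., paying an unconfirmed booking), which mirrors how the behavioral interface constrains the allowed request order of a stateful REST service.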
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The purpose of this study was to explore the software development methods and quality assurance practices used by the South Korean software industry. Empirical data was collected by conducting a survey that focused on three main areas: software life cycle models and methods; software quality assurance, including quality standards; and the strengths and weaknesses of the South Korean software industry. The results of the completed survey showed that the use of agile methods slightly surpasses the use of traditional software development methods. The survey also revealed the interesting result that almost half of the South Korean companies do not use any software quality assurance plan in their projects. Regarding the state of the South Korean software industry, a large number of respondents thought that, despite its weaknesses, software development in South Korea will improve in the future.
Abstract:
Innovative gas-cooled reactors, such as the pebble bed reactor (PBR) and the gas-cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three-dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas-cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. Calculations were performed for the Russian ASTRA criticality experiments: pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided.
A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete-based reactor physics and continuum-based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three-dimensional air-cooled smooth and rib-roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to this specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib-roughened geometry, heat transfer was severely underpredicted by the realisable k-epsilon turbulence model used. An additional calculation with a v2-f turbulence model showed significant improvement in the heat transfer results, most likely due to the model's better performance in separated flow problems. Further investigations are suggested before using CFD to draw conclusions about the heat transfer performance of rib-roughened GFR fuel rod geometries.
It is suggested that the viewpoints of numerical modelling be included in the planning of experiments, to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physics aspects of experiments should also be considered and documented in reasonable detail.
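The porosity-mapping idea can be sketched as follows; the thesis's implementation is a MATLAB code, so this Python fragment with a Monte Carlo estimate is an illustrative simplification, not the actual method.

```python
import numpy as np

def cell_porosity(cell_min, cell_max, centers, radius, n_samples=20000, seed=1):
    """Estimate the fluid fraction (porosity) of one box-shaped CFD cell
    overlapping a packing of equal spheres, by Monte Carlo sampling.
    Illustrative sketch; the thesis uses a dedicated MATLAB method."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(cell_min, cell_max, size=(n_samples, 3))
    # A sample point is solid if it lies inside any sphere of the packing
    d2 = ((pts[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    solid = (d2 <= radius ** 2).any(axis=1)
    return 1.0 - solid.mean()

# One unit cell containing a single sphere of radius 0.5 at its centre:
centers = np.array([[0.5, 0.5, 0.5]])
phi = cell_porosity(np.zeros(3), np.ones(3), centers, 0.5)
print(phi)  # close to 1 - pi/6, the analytical porosity of this cell
```

Evaluating this per cell over the whole mesh yields the local porosity field that couples the discrete pebble bed to the continuum thermal-hydraulics model.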
Abstract:
Linguistic modelling is a rather new branch of mathematics that is still undergoing rapid development. It is closely related to fuzzy set theory and fuzzy logic, but knowledge and experience from other fields of mathematics, as well as from other sciences including linguistics and the behavioral sciences, is also necessary to build appropriate mathematical models. The topic has received considerable attention, as it provides tools for the mathematical representation of the most common means of human communication: natural language. Adding a natural language level to mathematical models can provide an interface between the mathematical representation of the modelled system and the user of the model, one that is sufficiently easy to use and understand, yet conveys all the information necessary to avoid misinterpretations. This is, however, not a trivial task, and the link between the linguistic and computational levels of such models has to be established and maintained properly during the whole modelling process. In this thesis, we focus on the relationship between the linguistic and the mathematical levels of decision support models. We discuss several important issues concerning the mathematical representation of the meaning of linguistic expressions, their transformation into the language of mathematics and the retranslation of mathematical outputs back into natural language. In the first part of the thesis, our view of linguistic modelling for decision support is presented, and the main guidelines for building linguistic models for real-life decision support, which are the basis of our modelling methodology, are outlined. From the theoretical point of view, the issues of the representation of the meaning of linguistic terms, computations with these representations and the retranslation process back into the linguistic level (linguistic approximation) are studied in this part of the thesis.
We focus on the reasonability of operations with the meanings of linguistic terms, on the correspondence between the linguistic and mathematical levels of the models and on the proper presentation of appropriate outputs. We also discuss several issues concerning the ethical aspects of decision support, particularly the loss of meaning due to the transformation of mathematical outputs into natural language and the issue of responsibility for the final decisions. In the second part, several case studies of real-life problems are presented. These provide background, necessary context and motivation for the mathematical results and models presented in this part. A linguistic decision support model for disaster management is presented, formulated as a fuzzy linear programming problem, and a heuristic solution to it is proposed. Uncertainty of outputs, expert knowledge concerning disaster response practice and the necessity of obtaining outputs that are easy to interpret (and available in a very short time) are reflected in the design of the model. Saaty's analytic hierarchy process (AHP) is considered in two case studies: first in the context of the evaluation of works of art, where a weak consistency condition is introduced and an adaptation of AHP for large matrices of preference intensities is presented. The second AHP case study deals with the fuzzified version of AHP and its use for evaluation purposes, particularly the integration of peer review into the evaluation of R&D outputs. In the context of HR management, we present a fuzzy rule-based evaluation model (academic faculty evaluation is considered) constructed to provide outputs that do not require linguistic approximation and are easily transformed into graphical information. This is achieved by designing a specific form of fuzzy inference.
Finally, the last case study is from the area of the humanities: psychological diagnostics is considered, and a linguistic fuzzy model for the interpretation of the outputs of multidimensional questionnaires is suggested. The issue of the quality of data in mathematical classification models is also studied here. A modification of the receiver operating characteristic (ROC) method is presented to reflect the variable quality of data instances in the validation set during classifier performance assessment. Twelve publications on which the author participated are appended as the third part of this thesis. These summarize the mathematical results and provide a closer insight into the practical applications considered in the second part of the thesis.
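A toy sketch of the two directions discussed, representing linguistic terms by fuzzy sets and "linguistically approximating" a numerical output back to a term. The term names and membership functions are illustrative assumptions, not the thesis's models.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Illustrative linguistic scale over the unit interval
TERMS = {
    "poor":    (-0.01, 0.0, 0.4),
    "average": (0.2, 0.5, 0.8),
    "good":    (0.6, 1.0, 1.01),
}

def linguistic_approximation(value):
    """Return the linguistic term whose fuzzy set gives `value`
    the highest membership degree."""
    return max(TERMS, key=lambda t: tri(value, *TERMS[t]))

print(linguistic_approximation(0.55))  # prints "average"
print(linguistic_approximation(0.9))   # prints "good"
```

This closing step is the linguistic approximation the abstract refers to: a numerical model output is mapped back onto the term a human user actually reads.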
Abstract:
This article is a transcription of an electronic symposium sponsored by the Brazilian Society of Neuroscience and Behavior (SBNeC). Invited researchers from the European Union, North America and Brazil discussed two issues on anxiety, namely whether panic is a very intense anxiety or something else, and what aspects of clinical anxiety are reproduced by animal models. Concerning the first issue, most participants agreed that generalized anxiety and panic disorder are different on the basis of clinical manifestations, drug response and animal models. Also, underlying brain structures, neurotransmitter modulation and hormonal changes seem to involve important differences. It is also common knowledge that existing animal models generate different types of fear/anxiety. A challenge for future research is to establish a good correlation between animal models and nosological classification.
Abstract:
The report 'Conditions and practices in the commercialisation of innovation in wood industry' was written as part of the Wood Academy project. The report analyses the commercialisation conditions and practices of the wood industry by utilising a product categorisation based on a conceptual schema that combines the transfer of the possession of utility with the degree of form/service utility (or value added) created or provided by the company. Open innovation approaches help to identify possible new product and service innovations as well as new business models and earning logics in the industry. The report also contains brief company cases to demonstrate theory in practice and showcase examples from successful Finnish companies.
Abstract:
Concentrated solar power (CSP) is a renewable energy technology that could contribute to overcoming global problems related to pollution emissions and increasing energy demand. CSP utilizes solar irradiation, which is a variable source of energy. In order to use CSP technology in energy production and to reliably operate a solar field that includes a thermal energy storage system, dynamic simulation tools are needed to study the dynamics of the solar field, optimize production and develop control systems. The objective of this Master's thesis is to compare different concentrated solar power technologies and to configure a dynamic model of one selected CSP field design in the dynamic simulation program Apros, owned by VTT and Fortum. The configured model is based on the German company Novatec Solar's linear Fresnel reflector design. Solar collector components, including dimensions and performance calculations, were developed, as well as a simple solar field control system. The preliminary results of two simulation cases under clear-sky conditions were good: the desired, stable superheated steam conditions were maintained in both cases, while, as expected, the amount of steam produced was reduced in the case with lower irradiation. Based on the model development process, it can be concluded that the configured model works successfully and that Apros is a very capable and flexible tool for configuring new solar field models and control systems and for simulating solar field dynamic behaviour.
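The kind of performance calculation such a collector model builds on can be illustrated with a back-of-the-envelope energy balance. All numbers and function names below are illustrative assumptions, not Novatec Solar's design data or Apros's internal model.

```python
def collector_heat_output(dni, aperture_area, optical_eff, heat_loss):
    """Steady-state thermal power [W] collected by a line-focus collector:
    absorbed solar power minus thermal losses. Illustrative simplification."""
    return dni * aperture_area * optical_eff - heat_loss

def steam_mass_flow(q_thermal, h_out, h_in):
    """Steam production [kg/s] from an energy balance over the evaporator and
    superheater, given inlet/outlet specific enthalpies [J/kg]."""
    return q_thermal / (h_out - h_in)

# Illustrative clear-sky case: DNI in W/m^2, aperture area in m^2
q = collector_heat_output(dni=800.0, aperture_area=500.0,
                          optical_eff=0.65, heat_loss=20e3)
m = steam_mass_flow(q, h_out=3.0e6, h_in=0.5e6)
print(q, m)  # lower DNI gives lower q and lower steam flow, as in the simulations
```

This reproduces, in miniature, the behaviour reported above: reducing the irradiation input reduces the collected heat and hence the steam produced, while the target steam state is set by the enthalpy endpoints.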
Abstract:
Life cycle assessment (LCA) is one of the most established quantitative tools for the environmental impact assessment of products. To support environmentally aware decision makers on the environmental impacts of biomass value chains, the scope of LCA methodology needs to be augmented to cover land-use-related environmental impacts. This dissertation analyses and discusses potential impact assessment methods, conceptual models and environmental indicators that have been proposed for implementation in the LCA framework for land-use impacts. The applicability of the proposed indicators and impact assessment frameworks is tested from the practitioner's perspective, focusing especially on forest biomass value chains. The impacts of land use on biodiversity, resource depletion, climate change and other ecosystem services are analysed and discussed, and the interplay between value choices in LCA modelling and the decision-making situations to be supported is critically examined. It was found that land-use impact indicators are necessary in LCA for highlighting differences in impacts between distinct land-use classes. However, many open questions remain about how reliably they reflect actual impacts of land use, especially regarding the impacts of managed forest land use on biodiversity and on ecosystem services such as water regulation and purification. The climate impact of the energy use of boreal stemwood was found to be higher in the short term and lower in the long term than that of fossil fuels emitting an identical amount of CO2 in combustion, due to the induced changes in forest carbon stocks. The climate impacts of the energy use of boreal stemwood were found to be higher than previous estimates for forest residues and stumps suggest. The product lifetime was found to have a much greater influence on the climate impacts of wood-based value chains than whether the stemwood originates from thinnings or final fellings.
Climate neutrality seems likely only when almost all the carbon of the harvested wood is stored in long-lived wooden products. In their current form, land-use impacts can be neither modelled with a high degree of certainty nor communicated to decision makers with an adequate level of clarity. Academia needs to keep improving the modelling framework and, more importantly, to communicate clearly to decision makers the limited certainty on whether land-use-intensive activities can help in meeting the strict mitigation targets we are globally facing.
Abstract:
This thesis considers optimization problems arising in printed circuit board (PCB) assembly. In particular, it studies the case in which the electronic components of a single circuit board are placed using a single placement machine. Although there is a large number of different placement machines, collect-and-place-type gantry machines are discussed because of their flexibility and increasing popularity in the industry. Instead of solving the entire control optimization problem of a collect-and-place machine with a single application, the problem is divided into multiple subproblems because of its hard combinatorial nature. This dividing technique is called hierarchical decomposition. All the subproblems of the one-PCB, one-machine context are described, classified and reviewed. The derived subproblems are then either solved with exact methods, or new heuristic algorithms are developed and applied. The exact methods include, for example, a greedy algorithm and a solution based on dynamic programming. Some of the proposed heuristics contain constructive parts, while others utilize local search or are based on frequency calculations. Comprehensive experimental tests confirm that the heuristics are applicable and feasible. A number of quality functions are proposed for evaluation and applied to the subproblems. In the experimental tests, artificially generated data from Markov models and data from real-world PCB production are used. The thesis consists of an introduction and five publications in which the developed and applied solution methods are described in full detail. For all the problems stated in this thesis, the proposed methods are efficient enough for practical use in PCB assembly production and are readily applicable in the PCB manufacturing industry.
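One of the subproblems, ordering the placement operations, can be approached with the kind of greedy construction mentioned above. The following is a generic nearest-neighbour sketch of that idea, not the thesis's actual algorithm.

```python
import math

def greedy_placement_order(points, start=(0.0, 0.0)):
    """Greedy nearest-neighbour ordering of placement positions: always move
    the gantry head to the closest unplaced component next. Illustrative
    sketch of a greedy construction, not the thesis's algorithm."""
    remaining = list(points)
    order, pos = [], start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(nxt)
        order.append(nxt)
        pos = nxt
    return order

pts = [(3, 4), (1, 1), (0, 2), (5, 0)]
print(greedy_placement_order(pts))  # [(1, 1), (0, 2), (3, 4), (5, 0)]
```

Such a constructed tour is typically a starting point only; local search (e.g., 2-opt style moves) can then shorten it, which is the division of labour between constructive and local search heuristics the abstract describes.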
Abstract:
Mitosis is under the stringent quality control of the spindle assembly checkpoint (SAC). In cancer cells, however, this control can fail, leading to excessive cellular proliferation and ultimately to the formation of a tumor. Novel cancer cell selective therapies are needed to stop uncontrolled cell proliferation and tumor growth. The aim of the research presented in this thesis was to identify microRNAs (miRNAs) that could play a role in cancer cell proliferation, as well as low molecular weight (LMW) compounds that could interfere with cell division. The findings could be used to develop better cancer diagnostics and therapies in the future. First, a high-throughput screen (HTS) was performed to identify LMW compounds that possess a chemical interaction field similar to that of rigosertib, an anti-cancer compound undergoing clinical trials. A compound termed Centmitor-1 was discovered that phenocopied the cellular impact of rigosertib by affecting microtubule dynamics. Next, another HTS was aimed at identifying compounds that would target the Hec1 protein, which mediates the interaction between spindle microtubules and chromosomes; perturbation of this connection should prevent cell division and induce cell death. A compound termed VTT-006 was discovered that abrogated mitosis in several cell line models and exhibited binding to Hec1 in vitro. Lastly, using a cell-based HTS, two miRNAs were identified that affect cancer cell proliferation via Aurora B kinase, an important mitotic regulator. MiR-378a-5p was found to indirectly suppress the production of the kinase, whereas let-7b showed direct binding to the 3'UTR of Aurora B mRNA and repressed its translation. The miRNA-mediated perturbation of Aurora B induced defects in mitosis, leading to abnormal chromosome segregation and the induction of aneuploidy. The results of this thesis provide new information on miRNA signaling in cancer, which could be utilized for diagnostic purposes.
Moreover, the thesis introduces two small compounds that may benefit future drug research.