908 results for Capability Maturity Model for Software
Abstract:
After decades of mergers and acquisitions and successive technology trends such as CRM, ERP and DW, the data in enterprise systems is scattered and inconsistent. Global organizations face the challenge of addressing local uses of shared business entities, such as customer and material, while at the same time maintaining a consistent, unique, and consolidated view of financial indicators. In addition, current enterprise systems do not accommodate the pace of organizational change, and immense efforts are required to maintain data. When it comes to systems integration, ERPs are considered “closed” and expensive. Data structures are complex, and the “out-of-the-box” integration options offered are not based on industry standards. Therefore, expensive and time-consuming projects are undertaken in order to have the required data flowing according to business process needs. Master Data Management (MDM) emerges as a discipline focused on ensuring long-term data consistency. Presented as a technology-enabled business discipline, it emphasizes business process and governance to model and maintain the data related to key business entities. There are immense technical and organizational challenges in accomplishing the “single version of the truth” MDM mantra. Adding one central repository of master data might prove unfeasible in some scenarios, so an incremental approach is recommended, starting from the areas most critically affected by data issues. This research aims at understanding the current literature on MDM and contrasting it with views from professionals. The data collected from interviews revealed details on the complexities of data structures and data management practices in global organizations, reinforcing the call for more in-depth research on the organizational aspects of MDM.
The most difficult piece of master data to manage is the “local” part: the attributes related to the sourcing and storing of materials in one particular warehouse in the Netherlands, or a complex set of pricing rules for a subsidiary of a customer in Brazil. From a practical perspective, this research evaluates one MDM solution under development at a Finnish IT solution provider. By applying an existing assessment method, the research attempts to provide the company with one possible tool to evaluate its product from a vendor-agnostic perspective.
Abstract:
The business model in the context of international entrepreneurship is a rather new topic in the academic literature. The objective of this thesis is to examine value creation through business models in internationally entrepreneurial firms. The study examines value creation through the two partner interfaces and the customer interface of a company. Central to the study is also the consideration of the partners’ incentives. The business model construct is studied by defining the concept, examining its elements and its relationship with strategy, and concluding with value creation through the concept. The international entrepreneurship chapter focuses on internationally entrepreneurial firms, inspecting the drivers behind international entrepreneurship and studying the value network concept. Value creation functions as a driving theme in the theory discussion. The empirical research of the study focuses on eight Finnish internationally entrepreneurial software companies. The study is conducted as a qualitative cross-case analysis building on the single-case-company business model analyses. The findings suggest that the business models of software companies incorporate vast similarities. However, the degree of international experience influences the companies’ value creation and the way they organize their activities both upstream and downstream in the value chain.
Abstract:
Open source and open source software development have been interesting phenomena during the past decade. Traditional business models do not apply to open source, where the actual product is free. However, it is possible to do business with open source, even successfully, but the question is: how? The aim of this study is to find the key factors of successfully making a business out of commercial open source software development. The task is achieved by finding the factors that influence open source projects, finding the relations between those factors, and finding out why some factors explain success more than others. The literature review concentrates first on the background of open innovation, open source, and open source software. Then business models, critical success factors, and success measures are examined. Based on the existing literature, a framework was created. The framework contains categorized success factors that influence software projects in general as well as open source software projects. The main categories of success factors in software business are community management, technology management, project management, and market management. In order to find out which of the factors based on the existing literature are the most critical, empirical research was done by conducting unstructured personal interviews. The main finding based on the interviews is that the critical success factors in open source software business do not differ from those in traditional software business or, in fact, from those in any other business. Some factors in the framework came up in the interviews that can be considered key factors: establishing and communicating hierarchy (community management), localization (technology management), good license know-how and IPR management (project management), and effective market management (market management).
The interviewees also named critical success factors that are not listed in the framework: a low price, a good product, and good business model development.
Abstract:
Corporate decisions to scale agile software development methodologies in offshoring environments have been obstructed by the possible challenges of scaling agile, as agile methodologies are regarded as suitable only for small projects and co-located teams. Although models such as the Agile Scaling Model (ASM) have been developed for scaling agile along different factors, companies’ inability to identify and address these challenges leads to project failure rather than to the benefits of using agile methodologies. This failure can be avoided, when scaling agile in an IT offshoring environment, by determining the key challenges involved and then preparing strategies to address them. These key challenges can be determined by studying the issues related to offshoring and agile individually, while also considering the positive impact of agile methodologies in an offshoring environment. Possible strategies to tackle these key challenges are then developed according to the nature of the individual challenges, utilizing the benefits of different agile methodologies to address each situation. Thus, in this thesis we propose a strategy of using a hybrid agile method, an increasing trend due to the adaptive nature of agile. The determination of the key challenges and the possible strategies for tackling them are supported by a survey conducted in the researched organization.
Model-View-Controller architectural pattern and its evolution in graphical user interface frameworks
Abstract:
Model-View-Controller (MVC) is an architectural pattern used in software development for graphical user interfaces. It was one of the first proposed solutions, in the late 1970s, to the Smart UI anti-pattern, which refers to the act of writing all domain logic into a user interface. The original MVC pattern has since evolved in multiple directions and under various names, which can cause confusion. The goal of this thesis is to present the origin of the MVC pattern and how it has changed over time. Software architecture in general and MVC’s evolution within web applications are not the primary focus. Fundamental designs are abstracted and then used to examine the more recent versions. Problems with the subject and its terminology are also presented.
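The classic MVC triad described above can be sketched roughly as follows. This is a minimal illustration in Python, not code from the thesis; all class and method names are hypothetical. The model holds domain state and notifies observers, the view renders that state, and the controller translates user input into model operations.

```python
# Hypothetical sketch of the classic MVC triad. The key property is that
# domain logic lives in the model, not in the user interface (avoiding
# the Smart UI anti-pattern).

class CounterModel:
    """Holds domain state and notifies observers when it changes."""
    def __init__(self):
        self.value = 0
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def increment(self):
        self.value += 1
        for obs in self._observers:
            obs.update(self)


class CounterView:
    """Renders the model; knows nothing about input handling."""
    def __init__(self):
        self.last_rendered = None

    def update(self, model):
        self.last_rendered = f"Count: {model.value}"


class CounterController:
    """Maps user input events to model operations."""
    def __init__(self, model):
        self.model = model

    def on_click(self):
        self.model.increment()


model = CounterModel()
view = CounterView()
model.attach(view)
controller = CounterController(model)

controller.on_click()
print(view.last_rendered)  # Count: 1
```

Note how the view is updated through the observer mechanism rather than by the controller directly; later variants (e.g. Model-View-Presenter) change exactly this wiring, which is one source of the terminological confusion the thesis discusses.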
Abstract:
The development of carbon capture and storage (CCS) has raised interest towards novel fluidised bed (FB) energy applications. In these applications, limestone can be utilized for SO2 and/or CO2 capture. The conditions in the new applications differ from the traditional atmospheric and pressurised circulating fluidised bed (CFB) combustion conditions in which limestone is successfully used for SO2 capture. In this work, a detailed physical single-particle model for limestone, with a description of the mass and energy transfer inside the particle, was developed. The novelty of this model was to take into account the simultaneous reactions, changing conditions, and the effect of advection. In particular, the capability to study the cyclic behaviour of limestone on both sides of the calcination-carbonation equilibrium curve is important in the novel conditions. The significance of including advection versus assuming diffusion control was studied for calcination. In particular, the effect of advection on the calcination reaction in the novel combustion atmosphere was shown. The model was tested against experimental data; sulphur capture was studied in a laboratory reactor in different fluidised bed conditions. Different conversion levels and sulphation patterns were examined in different atmospheres for one limestone type. The conversion curves were well predicted with the model, and the mechanisms leading to the conversion patterns were explained with the model simulations. In this work, it was also evaluated whether the transient environment has an effect on limestone behaviour compared to averaged conditions, and in which conditions the effect is the largest. The difference between the averaged and transient conditions was notable only in conditions close to the calcination-carbonation equilibrium curve.
The results of this study suggest that the development of a simplified particle model requires a proper understanding of the physical and chemical processes taking place in the particle during the reactions. The results of the study will be required when analysing complex limestone reaction phenomena or when developing the description of limestone behaviour in comprehensive 3D process models. In order to transfer the experimental observations to furnace conditions, the relevant mechanisms that take place need to be understood before the important ones can be selected for the 3D process model. This study revealed the sulphur capture behaviour under transient oxy-fuel conditions, which is important when the oxy-fuel CFB process and process model are developed.
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically up to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent a data dependency in the form of a queue. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, the node can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field. Digital filters are typically described with boxes and arrows, also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications are, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
The explicit parallelism of a dataflow program is descriptive and enables improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then a set of static schedules, as small as possible, that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else, in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
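The node-and-queue model described in this abstract can be sketched in a few lines. This is a hedged illustration in Python, not RVC-CAL and not code from the thesis: a node fires when its firing rule is satisfied (here, one token on every input queue), consuming input tokens and producing output tokens, with queues as the only communication channel.

```python
# Hypothetical sketch of a dataflow node with FIFO-queue edges.
# The firing rule here is the simplest one: one token per input queue.
from collections import deque


class Node:
    def __init__(self, func, n_inputs):
        self.func = func
        self.inputs = [deque() for _ in range(n_inputs)]
        self.output = deque()

    def can_fire(self):
        # Firing rule: a token is available on every input queue.
        return all(q for q in self.inputs)

    def fire(self):
        # Consume one token per input, produce one output token.
        tokens = [q.popleft() for q in self.inputs]
        self.output.append(self.func(*tokens))


# Graph: two input streams feed an adder node.
adder = Node(lambda a, b: a + b, n_inputs=2)
adder.inputs[0].extend([1, 2, 3])
adder.inputs[1].extend([10, 20, 30])

# A trivial dynamic scheduler: fire while the firing rule holds.
results = []
while adder.can_fire():
    adder.fire()
    results.append(adder.output.popleft())

print(results)  # [11, 22, 33]
```

The `while adder.can_fire()` loop is exactly the run-time firing-rule evaluation the abstract mentions; quasi-static scheduling aims to replace most of these checks with pre-computed static firing sequences.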
Abstract:
Rolling element bearings are essential components of rotating machinery. The spherical roller bearing (SRB) is one variant seeing increasing use, because it is self-aligning and can support high loads. It is becoming increasingly important to understand how the SRB responds dynamically under a variety of conditions. This doctoral dissertation introduces a computationally efficient, three-degree-of-freedom SRB model that was developed to predict the transient dynamic behavior of a rotor-SRB system. In the model, bearing forces and deflections were calculated as a function of contact deformation and bearing geometry parameters according to nonlinear Hertzian contact theory. The results reveal how some of the more important parameters, such as diametral clearance, the number of rollers, and the osculation number, influence ultimate bearing performance. Distributed defects, such as the waviness of the inner and outer rings, and localized defects, such as inner and outer ring defects, are taken into consideration in the proposed model. Simulation results were verified against results obtained by applying the formula for spherical roller bearing radial deflection and against commercial bearing analysis software. Following model verification, a numerical simulation was carried out successfully for a full rotor-bearing system to demonstrate the application of this newly developed SRB model in a typical real-world analysis. The accuracy of the model was verified by comparing measured to predicted behaviors for equivalent systems.
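To give a feel for the nonlinear Hertzian contact relation mentioned above: for line (roller) contacts, the classical Palmgren form relates contact force to deflection as F = K·δ^(10/9). The sketch below is a generic illustration, not the dissertation's model; the stiffness value `k` is an arbitrary placeholder, and the exponent applies to ideal line contact under stated assumptions.

```python
# Hedged sketch of a nonlinear Hertzian load-deflection relation for a
# roller (line) contact, F = K * delta**(10/9) (Palmgren form).
# The stiffness k is an illustrative placeholder, not a value from the thesis.

def roller_contact_force(delta_m: float, k: float = 1.0e9) -> float:
    """Contact force (N) for a roller contact deflection delta_m (m)."""
    if delta_m <= 0.0:
        # Clearance: no force when the roller is not in contact.
        return 0.0
    return k * delta_m ** (10.0 / 9.0)


f1 = roller_contact_force(10e-6)  # 10 micrometre deflection
f2 = roller_contact_force(20e-6)  # 20 micrometre deflection
ratio = f2 / f1                   # doubling deflection -> 2**(10/9) ≈ 2.16
print(round(ratio, 3))
```

The exponent greater than 1 means the contact stiffens with load, which is why bearing forces must be solved iteratively as a function of deformation rather than with a constant stiffness.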
Abstract:
The objective of this project was to introduce a new software product to the pulp industry, a new market for the case company. An optimization-based scheduling tool has been developed to allow pulp operations to better control their production processes and improve both production efficiency and stability. Both the work here and earlier research indicate a potential for savings of around 1-5%. All the supporting data is available today, coming from distributed control systems, data historians, and other existing sources. The pulp mill model, together with the scheduler, allows what-if analyses of the impacts and timely feasibility of various external actions, such as the planned maintenance of any particular mill operation. The visibility gained from the model also proves to be a real benefit. The aim is to satisfy demand and gain extra profit, while achieving the required customer service level. Research effort has been put into understanding both the minimum features needed to satisfy the scheduling requirements in the industry and the overall existence of the market. A qualitative study was constructed to identify both the competitive situation and the requirements versus gaps on the market. It becomes clear that there is no such system on the marketplace today, and also that there is room to improve the target market's overall process efficiency through such a planning tool. This thesis also provides the case company with a better overall understanding of the different processes in this particular industry.
Abstract:
This master’s thesis has been done for the Drive! project, in which a new electric motor solution for mobile working machines is developed. A generic simulation model will be used as a marketing and development tool. It can be used to model a wide variety of different vehicles with and without the electric motor and to show customers the difference between traditionally built vehicles and those with the new electric motor solution. Customers can also use the simulation model to research different solutions for their own vehicles. At the start of the project it was decided that the MeVEA software would be used as the main simulation program and that Simulink would only be used to simulate the operation of the electrical components. Development of the generic model started with research into these two software applications, the simulation models made with them, and how such simulation models can be built faster. The best results were used for building the generic simulation model. The finished generic model can be used to produce new tractor models for real-time simulations at short notice. All information about the model is collected in one datasheet, which can be easily filled in by the user. After the datasheet is filled in, a script will automatically build the new simulation model in seconds. At the moment the generic model is capable of building simulation models for a wide variety of different tractors, but it can easily be altered for other vehicle types that would also benefit greatly from an electric drive solution, for example wheel loaders and harvesters.
Abstract:
In the literature, CO2 liquefaction is well studied with steady-state modeling. Steady-state modeling gives an overview of the process, but it does not give information about process behavior during transients. In this master's thesis, three dynamic models of CO2 liquefaction were made and tested. The models were a straight multi-stage compression model and two compression-liquid-pumping models, one with and one without cold energy recovery. The models were made with the Apros software and were also used to verify that Apros is capable of modeling phase changes and the supercritical state of CO2. The models were verified against compressor manufacturer's data and simulation results presented in the literature. Of the models made in this thesis, the straight compression model was found to be the most energy efficient and the fastest to react to transients. Apros was also found to be a capable tool for dynamic liquefaction modeling.
Abstract:
This paper introduces an important source of torque ripple in PMSMs with tooth-coil windings (TC-PMSMs). It is theoretically proven that the saturation and cross-saturation phenomena caused by the non-synchronous harmonics of the stator current linkage cause a synchronous inductance variation with a particular periodicity. This, in turn, determines the magnitude of the torque ripple and can also deteriorate the performance of signal-injection-based rotor position estimation algorithms. An improved dq-inductance model is proposed. It can be used in torque ripple reduction control schemes and can enhance the self-sensing capabilities of TC-PMSMs.
Abstract:
We developed a forced non-electric-shock running wheel (FNESRW) system that provides rats with high-intensity exercise training using automatic exercise training patterns that are controlled by a microcontroller. The proposed system makes a breakthrough over the traditional motorized running wheel by allowing rats to perform high-intensity training, enabling comparisons with the treadmill at the same exercise intensity without any electric shock. A polyvinyl chloride runway with a rough rubber surface was coated on the periphery of the wheel so as to permit automatic acceleration training, which allowed the rats to run consistently at high speeds (30 m/min for 1 h). An animal ischemic stroke model was used to validate the proposed system. FNESRW, treadmill, control, and sham groups were studied. The FNESRW and treadmill groups underwent 3 weeks of endurance running training. After 3 weeks, middle cerebral artery occlusion, the modified neurological severity score (mNSS) test, an inclined plane test, and triphenyltetrazolium chloride staining were performed to evaluate the effectiveness of the proposed platform. The results showed that the improvement in motor function, mNSS, and infarct volume was significantly greater in the FNESRW group than in the control group (P<0.05) and similar to that in the treadmill group. The experimental data demonstrated that the proposed platform can be applied to test the benefit of exercise-preconditioning-induced neuroprotection using the animal stroke model. Additional advantages of the FNESRW system include stand-alone capability, independence from subjective human adjustment, and ease of use.
Abstract:
An enterovirus 71 (EV71) vaccine for the prevention of hand, foot, and mouth disease (HFMD) is available, but it is not known whether the EV71 vaccine cross-protects against Coxsackievirus (CV) infection. Furthermore, although an inactivated circulating CVA16 Changchun 024 (CC024) strain vaccine candidate is effective in newborn mice, the CC024 strain causes severe lesions in muscle and lung tissues. Therefore, an effective CV vaccine with improved pathogenic safety is needed. The aim of this study was to evaluate the in vivo safety and in vitro replication capability of a noncirculating CVA16 SHZH05 strain. The replication capacity of circulating CVA16 strains CC024, CC045, CC090 and CC163 and the noncirculating SHZH05 strain was evaluated by cytopathic effect in different cell lines. The replication capacity and pathogenicity of the CC024 and SHZH05 strains were also evaluated in a neonatal mouse model. Histopathological and viral load analyses demonstrated that the SHZH05 strain had an in vitro replication capacity comparable to the four CC strains. The CC024 strain, but not the SHZH05 strain, became distributed in a variety of tissues and caused severe lesions and mortality in neonatal mice. The differences in replication capacity and in vivo pathogenicity of the CC024 and SHZH05 strains may result from differences in the nucleotide and amino acid sequences of the viral functional polyproteins P1, P2 and P3. Our findings suggest that the noncirculating SHZH05 strain may be a safer CV vaccine candidate than the CC024 strain.
Abstract:
By increasing the efficiency of a wind turbine gearbox, more power can be transferred from the rotor blades to the generator and less power is lost to wear and heating in the gearbox. By using a simulation model, the behavior of the gearbox can be studied before building expensive prototypes. The objective of the thesis is to model a wind turbine gearbox and its lubrication system, to study power losses and heat transfer inside the gearbox, and to study the simulation methods of the software used. The software used to create the simulation model is Siemens LMS Imagine.Lab AMESim, which can be used to create one-dimensional mechatronic system simulation models from different fields of engineering. By combining components from different libraries, it is possible to create a simulation model that includes mechanical, thermal, and hydraulic models of the gearbox. Results for the mechanical, thermal, and hydraulic simulations are presented in the thesis. Due to the large scale of the wind turbine gearbox and the amount of power transmitted, the power loss calculations from the AMESim software are inaccurate, and power losses are instead modelled as a constant efficiency for each gear mesh. Starting values for the thermal and hydraulic simulations were chosen from test measurements and from an empirical study, as the compact and complex design of the gearbox prevents accurate test measurements. In further studies, to increase the accuracy of the simulation model, the components used for power loss calculations need to be modified, and values for the unknown variables need to be determined through accurate test measurements.