908 results for Capability Maturity Model for Software


Relevance:

30.00%

Publisher:

Abstract:

In this paper we discuss the main privacy issues around mobile business models and envision new solutions with privacy protection as a main value proposition. We construct a framework to help analyze the situation and assume that a third party is necessary to warrant transactions between mobile users and m-commerce providers. We then use the business model canvas to describe a generic business model pattern for privacy third-party services. This pattern is then illustrated in two different variations of a privacy business model, which we call privacy broker and privacy management software. We conclude by giving examples for each business model and by suggesting further directions for investigation.

Relevance:

30.00%

Publisher:

Abstract:

This thesis examined effective knowledge management in the research and development network of a global forest industry company. The goal was to construct a description of how R&D content can be managed with the knowledge management software used by the case company. First, the concepts of knowledge and knowledge management were clarified with the help of the literature, and based on this review a process model for managing knowledge effectively within a company was presented. Next, the requirements that knowledge management places on information technology, and the role of information technology in the process model, were analysed. The network's requirements for knowledge management were identified by interviewing key personnel in the company. Based on the interviews, the system had to effectively support the work of virtual project teams, enable knowledge sharing between mills and support the management of content entered into the system. First, the structure and access permissions of the system's user interface were adapted to the needs of the network: the structure offers a workspace for project teams and areas for knowledge sharing between mills. For content management, a category structure, a profiled portal and predefined searches were developed in the system. The developed model makes the work of project teams more efficient, enables existing knowledge to be exploited at the mill level and facilitates the monitoring of R&D activities. As further measures, integrating the system with the mills' operational control systems and adopting the software as a mill-level project management tool are proposed; the goal of these proposals is to ensure both effective knowledge sharing between mills and effective knowledge management at the mill level.

Relevance:

30.00%

Publisher:

Abstract:

Requirements engineering is a very important part of developing new software. It is not merely the compilation of a requirements specification document at the beginning of a software project; it also covers the definition, management and verification of requirements throughout the software life cycle. In a software services company the importance of requirements engineering is emphasized even further, and such a company must have a working requirements engineering process. This thesis presents the theory of requirements engineering, quality control related to processes, and models for process assessment and improvement. The requirements engineering of two different types of software services companies is examined and observations on their process models are presented. As a result, conclusions are drawn about requirements engineering, the related processes and quality control.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: Statistical shape and appearance models play an important role in reducing the segmentation processing time of a vertebra and in improving results for 3D model development. Here, we describe the different steps in generating a statistical shape model (SSM) of the second cervical vertebra (C2) and provide the shape model for general use by the scientific community. The main difficulties in its construction are the morphological complexity of the C2 and its variability in the population. METHODS: The input dataset is composed of manually segmented, anonymized patient computed tomography (CT) scans. The different datasets are aligned with Procrustes alignment on the surface models, and the registration is then cast as a model-fitting problem using a Gaussian process. A principal component analysis (PCA)-based model is generated which captures the variability of the C2. RESULTS: The SSM was generated using 92 CT scans. The resulting SSM was evaluated for specificity, compactness and generalization ability. The SSM of the C2 is freely available to the scientific community in Slicer (an open-source software package for image analysis and scientific visualization), with a module created to visualize the SSM using Statismo, a framework for statistical shape modeling. CONCLUSION: The SSM of the vertebra allows the shape variability of the C2 to be represented. Moreover, the SSM will enable semi-automatic segmentation and 3D model generation of the vertebra, which would greatly benefit surgery planning.
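
The core step of such a PCA-based shape model is compact enough to sketch. Below is a minimal illustration in Python, assuming the training vertebrae have already been brought into dense point-to-point correspondence and rigidly aligned (e.g. by Procrustes alignment); the function names and array layout are illustrative and are not the Statismo implementation.

import numpy as np

def build_ssm(shapes):
    """shapes: (n_samples, n_points * 3) array of corresponding, aligned surface points."""
    mean = shapes.mean(axis=0)
    centered = shapes - mean
    # PCA via SVD of the centered data matrix
    _, singular_values, modes = np.linalg.svd(centered, full_matrices=False)
    variances = singular_values**2 / (shapes.shape[0] - 1)  # eigenvalues of the covariance
    return mean, modes, variances

def sample_shape(mean, modes, variances, coeffs):
    """Generate a new shape instance from standardized mode coefficients."""
    return mean + (coeffs * np.sqrt(variances)) @ modes

Specificity, compactness and generalization ability, the evaluation criteria mentioned above, can then be computed from the per-mode variances and from leave-one-out reconstructions of the training shapes.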

Relevance:

30.00%

Publisher:

Abstract:

As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the tight constraints on size, power consumption and price of embedded systems, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. Dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and a relatively short time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture; it offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software codesign and simulation and an extendable library of automatically configured, reusable hardware blocks. Other topics covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and the compilation of a SystemC model into synthesizable VHDL with the Celoxica Agility SystemC Compiler. As a test case for the environment, a simulation model of a processor for TCP/IP packet validation was designed and tested.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this thesis is to provide a business model framework that connects customer value to firm resources and explains the change logic of the business model. With strategic supply management, and especially dynamic value network management, as its scope, the dissertation builds on basic economic theories, transaction cost economics and the resource-based view. The main research question is how changing customer values should be taken into account when planning business in a networked environment. The main question is divided into sub-questions that form the research problems for the separate case studies presented in the five Publications. The research adopts the case study strategy, and the constructive research approach within it. The material consists of data from several Delphi panels and expert workshops, software pilot documents, company financial statements and investor relations information on the companies' web sites. The cases used in this study are a mobile multi-player game value network, smart phone and "Skype mobile" services, the business models of AOL, eBay, Google, Amazon and a telecom operator, a virtual city portal business system and a multi-play offering. The main contribution of this dissertation is bridging the gap between firm resources and customer value. This has been done by theorizing the business model concept and connecting it to both the resource-based view and customer value. The thesis thereby contributes to the resource-based view, which deals with customer value and the firm resources needed to deliver that value but leaves a gap in explaining how changes in customer value should be connected to changes in key resources. The dissertation also provides tools and processes for analyzing the customer value preferences of ICT services, for constructing and analyzing business models and business concept innovation, and for conducting resource analysis.

Relevance:

30.00%

Publisher:

Abstract:

Integrated into a wider research effort assessing destabilizing and triggering factors in order to model cliff dynamics along the Dieppe shoreline in Upper Normandy, this study aims at testing the capabilities of boat-based mobile LiDAR by scanning 3D point clouds of the unstable coastal cliffs. Two acquisition campaigns were performed, in September 2012 and September 2013, scanning (1) a 30-km-long shoreline and (2) the same test cliffs under different environmental conditions and device settings. The potential of the collected data for 3D modelling, change detection and landslide monitoring was then assessed. By scanning during favourable meteorological and marine conditions and close to the coast, mobile LiDAR devices are able to quickly scan a long shoreline with a median point spacing of up to 10 cm. The acquired data are detailed enough to map geomorphological features smaller than 0.5 m². Furthermore, our capability to detect rockfalls and erosion deposits (greater than 1 m³) is confirmed, since the classical approach of computing differences between sequential acquisitions reveals many cliff collapses between Pourville and Quiberville and only sparse changes between Dieppe and Belleville-sur-Mer; these different change rates result from different rockfall susceptibilities. Finally, we also confirmed the capability of the boat-based mobile LiDAR technique to monitor single large changes, characterizing the geometry of the Dieppe landslide with its two main active scarps, a retrogression of up to 40 m and about 100,000 m³ of eroded material.
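
As a rough illustration of the "classical approach of computing differences between sequential acquisitions", the sketch below flags points of the later scan that lie farther than a threshold from the earlier scan, using a simple nearest-neighbour (cloud-to-cloud) distance. The file names and the 10 cm threshold are hypothetical, and this is not the specific change detection workflow used in the study.

import numpy as np
from scipy.spatial import cKDTree

def detect_changes(cloud_t0, cloud_t1, threshold=0.10):
    """Return the points of cloud_t1 farther than `threshold` metres from cloud_t0.

    cloud_t0, cloud_t1: (N, 3) arrays of x, y, z coordinates in a common reference frame.
    """
    tree = cKDTree(cloud_t0)
    distances, _ = tree.query(cloud_t1, k=1)  # nearest-neighbour distance for every point
    return cloud_t1[distances > threshold], distances

# Hypothetical usage with two georeferenced scans of the same cliff section:
# scan_2012 = np.loadtxt("dieppe_2012.xyz")
# scan_2013 = np.loadtxt("dieppe_2013.xyz")
# changed_points, d = detect_changes(scan_2012, scan_2013, threshold=0.10)

Clusters of flagged points then correspond to rockfall scars or erosion deposits whose volumes can be estimated in a separate step.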

Relevance:

30.00%

Publisher:

Abstract:

Validation and verification activities face various challenges in the product development process, and the need to increase the pace of the development cycle places new demands on the component development process. Verification and validation are usually among the largest activities, consuming up to 40-50% of R&D resources. This research studies validation and verification as part of the case company's component development process. The goal is to define a framework that can be used to evaluate and develop validation and verification capability in display module development projects. The definition and background of validation and verification are studied, together with theories of project management, systems, organisational learning and causality. The framework and the key findings of the research are presented, and a feedback system based on the framework is defined and implemented in the case company. The research is divided into a theoretical part, conducted as a literature review, and an empirical part, conducted as a case study using the constructive and design research methods. As a result, a framework for capability evaluation and development was defined and developed. A key finding was that a double-loop learning approach combined with the validation and verification V+ model makes it possible to define a feedback reporting solution. In addition, some minor changes to the validation and verification process were proposed. A few concerns about the validity and reliability of the results are raised, the most important concerning the selected research method and the selected model itself: the end state can be normative, and the researcher may set expectations for the study in the initial state and thereby anticipate its results. Finally, the reliability and validity of the work are discussed.

Relevance:

30.00%

Publisher:

Abstract:

The size and complexity of software development projects are growing very fast, yet according to previous research the proportion of successful projects is still quite low. Although almost every project team knows the main areas of responsibility that would help finish a project on time and on budget, this knowledge is rarely used in practice. It is therefore important to evaluate the success of existing software development projects and to suggest a method for estimating the chances of success that can be used in software development projects. The main aim of this study is to evaluate the success of projects in the selected geographical region (Russia-Ukraine-Belarus). The second aim is to compare existing models of success prediction and to determine their strengths and weaknesses. The research was done as an empirical study; a survey with structured forms and theme-based interviews were used as the data collection methods. Information gathering was done in two stages. In the first stage, the project manager or someone with similar responsibilities answered the questions over the Internet. In the second stage, the participant was interviewed and his or her answers were discussed and refined, which made it possible to obtain accurate information about each project and to avoid errors. It was found that there are many problems in software development projects; these problems are widely known and have been discussed in the literature many times. The research showed that most of the projects have problems with schedule, requirements, architecture, quality and budget. A comparison of two success prediction models showed that The Standish Group model overestimates problems in a project, while McConnell's model can help to identify problems in time and avoid trouble later. Finally, a framework for evaluating the chances of success in distributed projects was suggested; it is similar to The Standish Group model but customized for distributed projects.
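
To make the idea of a checklist-style success predictor concrete, here is a purely illustrative Python sketch of a weighted scoring function. The factor names, weights and scoring scale are hypothetical; they are neither the framework proposed in the thesis nor McConnell's published test.

# Purely illustrative: factor names and weights are hypothetical.
FACTORS = {
    "clear_requirements":      0.25,
    "realistic_schedule":      0.20,
    "executive_support":       0.15,
    "experienced_team":        0.15,
    "stable_architecture":     0.15,
    "effective_communication": 0.10,  # especially critical in distributed projects
}

def success_score(answers):
    """answers: dict mapping factor name -> rating in [0, 1]; returns a 0-100 score."""
    return 100 * sum(weight * answers.get(name, 0.0) for name, weight in FACTORS.items())

# Example: a distributed project with an optimistic schedule and weak communication
print(success_score({"clear_requirements": 0.8, "realistic_schedule": 0.4,
                     "executive_support": 0.9, "experienced_team": 0.7,
                     "stable_architecture": 0.6, "effective_communication": 0.3}))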

Relevance:

30.00%

Publisher:

Abstract:

Today cloud computing is the next stage in the development of the information society in the field of information technology. Much attention is paid to cloud computing in general, but the lack of scientific attention to its individual components means that not all aspects are well examined. This thesis is an attempt to consider Platform as a Service (a technology for providing a development environment over the Internet) from several angles. Technical characteristics, costs, time, estimation of effectiveness, risks, applicable strategies, the migration process, advantages and disadvantages, and the future of the approach are examined to obtain an overall picture of cloud platforms. A literature study was used to examine Platform as a Service, the characteristics of existing cloud platforms were explored, and a model of a typical software development company was developed to create a scenario of migration to cloud technologies. The research showed that, despite all their virtues in reducing costs and time, cloud platforms face significant obstacles to adoption: privacy, security and insufficient legislation prevent the concept from becoming widespread.

Relevance:

30.00%

Publisher:

Abstract:

In this study the BEST7 software was employed to quantify different classes of functional groups and to model the proton titration behavior of humic substances. To illustrate the process, the Suwannee River fulvic acid of the IHSS (International Humic Substances Society) was used. Five categories of oxygenated groups - two classes of phenolic groups (phenol and catechol), two classes of carboxylic groups (benzoic and phthalic) and a combination of the two (salicylic) - were considered responsible for the potentiometric behavior of the sample and were quantitatively determined. The most and the least abundant groups were catechol (3.300 ± 0.010 mmol g⁻¹) and phenol (1.225 ± 0.070 mmol g⁻¹), respectively. The estimated equilibrium constants were also determined and were in good agreement with literature values for phenol and catechol groups and for benzoic, phthalic and salicylic acids. Distribution diagrams of the species were generated with the software SPE and SPEPLOT.
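
The proton-binding behaviour that such a fit describes can be illustrated with a simple monoprotic speciation model: for each class of sites, the dissociated fraction at a given pH follows the Henderson-Hasselbalch relation. The Python sketch below uses this simplification with made-up pKa values and site concentrations; it is not the BEST7 fitting algorithm, only an illustration of how class concentrations and equilibrium constants translate into a titration curve.

import numpy as np

# Hypothetical site classes: (concentration in mmol/g, pKa) -- illustrative values only
SITE_CLASSES = {
    "benzoic-like":   (2.0, 4.2),
    "phthalic-like":  (1.5, 5.4),
    "salicylic-like": (1.0, 3.0),
    "phenol-like":    (1.2, 10.0),
    "catechol-like":  (3.3, 9.2),
}

def dissociated_charge(pH):
    """Total dissociated (deprotonated) sites in mmol/g at the given pH."""
    return sum(conc / (1.0 + 10.0 ** (pKa - pH)) for conc, pKa in SITE_CLASSES.values())

for pH in np.linspace(3, 11, 9):
    print(f"pH {pH:4.1f}: {dissociated_charge(pH):5.2f} mmol/g dissociated")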

Relevance:

30.00%

Publisher:

Abstract:

Today's software industry faces increasingly complicated challenges in a world where software is almost ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, but at the same time affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one reason why process improvement as a research area has not declined in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the process improvement field there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight approaches, such as agile and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with one specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurements. Third, knowledge sharing within large companies is improved through methods that put collaboration at the centre. The movement behind agile methods emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that most of the daily work consists of creating something new that has not existed before. Every software developer must be an expert in his or her field and spends a large part of the working day creating solutions to problems he or she has never solved before. Although this has been a well-known fact for decades, many software projects are still managed as if they were production lines in factories. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile methods have proven to work well in the contexts they were created for, i.e. small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interaction with the context in question can be gathered incrementally, making it easier to develop and adapt the methods to the specific requirements of that context. Second, resistance to change can more easily be overcome by introducing cultural changes carefully and by giving the target group direct first-hand contact with the new methods. Relevant product measurements can help software development teams improve their working methods.
For teams working with agile and Lean methods, a good set of measurements can be decisive when prioritizing the list of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product measurements for decision support regarding refactoring, i.e. continuous improvement of the quality of the program's code and design. The decision to refactor can be difficult to make, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way to measure the design quality of systems developed using the model-driven paradigm, and we also construct a way to integrate this measurement into agile and Lean working methods. An important part of any process improvement initiative is spreading knowledge about the new software process, regardless of the kind of process being introduced, whether plan-driven or agile. We propose that methods based on collaboration, when the process is created and further developed, are a good way to support knowledge sharing, and we give an overview of process authoring tools on the market with that proposal in mind.

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: A model to estimate yield loss caused by Asian soybean rust (ASR) (Phakopsora pachyrhizi) was developed by collecting data from field experiments during the 2009/10 and 2010/11 growing seasons in Passo Fundo, RS. The disease intensity gradient, evaluated at the phenological stages R5.3, R5.4 and R5.5 on the basis of leaflet incidence (LI) and the number of uredinia and lesions/cm², was generated by applying azoxystrobin 60 g a.i./ha + cyproconazole 24 g a.i./ha + 0.5% of the adjuvant Nimbus. The first application occurred when LI = 25% and the remaining ones at 10-, 15-, 20- and 25-day intervals. Harvest occurred at physiological maturity and was followed by grain drying and cleaning. Regression analysis between grain yield and the disease intensity assessment criteria generated 56 linear equations of the yield loss function. The greatest loss was observed at the earliest growth stage, and yield loss coefficients ranged from 3.41 to 9.02 kg/ha for each 1% of leaflet incidence, from 13.34 to 127.4 kg/ha per lesion/cm² for lesion density and from 5.53 to 110.0 kg/ha per uredinium/cm² for uredinium density.
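
Written generically, the damage functions reported above are simple linear (single-point) models of loss against disease intensity; the coefficient ranges below are those quoted in the abstract, not the 56 individual fitted equations:

\[
\hat{L} = b \, x, \qquad
b \in
\begin{cases}
[3.41,\ 9.02] & \text{kg/ha per 1\% leaflet incidence}\\[2pt]
[13.34,\ 127.4] & \text{kg/ha per lesion/cm}^2\\[2pt]
[5.53,\ 110.0] & \text{kg/ha per uredinium/cm}^2
\end{cases}
\]

where \(\hat{L}\) is the estimated yield loss and \(x\) is the disease intensity expressed in the corresponding unit.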

Relevance:

30.00%

Publisher:

Abstract:

The objective of the thesis was to create three tutorials for the MeVEA Simulation Software to introduce new users to the modeling methodology used in the software. MeVEA Simulation Software is real-time simulation software based on multibody dynamics, designed for creating simulation models of complete mechatronic systems. The thesis begins with a more detailed description of the MeVEA Simulation Software and its components. It then presents the three simulation models together with the theory behind each step of model creation. The first tutorial introduces the basic features used in most simulation models: bodies, constraints, forces, basic hydraulics and motors. The second tutorial introduces the power transmission components, tyres and user input definitions for the different components of power transmission systems. The third tutorial introduces the definitions of two different types of collisions and the collision graphics used in the MeVEA Simulation Software.