993 results for Software product lines
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
The corporate world is becoming more and more competitive. This leads organisations to adapt to this reality by adopting more efficient processes, which result in lower costs as well as higher product quality. One of these processes consists in making proposals to clients, which necessarily include a cost estimation of the project. This estimation is the main focus of this project. In particular, one of the goals is to evaluate which estimation models best fit the Altran Portugal software factory, the organisation where the fieldwork of this thesis will be carried out. There is no broad agreement about which type of estimation model is most suitable for software projects. In contexts where plenty of objective information is available to be used as input to an estimation model, model-based methods usually yield better results than expert judgement. However, what happens more frequently is not having this volume and quality of information, which has a negative impact on the performance of model-based methods and favours the use of expert judgement. In practice, most organisations use expert judgement, making themselves dependent on the expert. A common problem is that the performance of the expert's estimation depends on their previous experience with similar projects. This means that when new types of projects arrive, the estimation will have unpredictable accuracy. Moreover, different experts will make different estimates, based on their individual experience. As a result, the company will not directly build up a continuously growing body of knowledge about how estimates should be carried out. Estimation models depend on the input information collected from previous projects, the size of the project database, and the resources available. Altran currently does not store the input information from previous projects in a systematic way; it has a small project database and a team of experts. Our work is targeted at companies that operate in similar contexts. We start by gathering information from the organisation in order to identify which estimation approaches can be applied given the organisation's context. A gap analysis is used to understand what type of information the company would have to collect so that other approaches become available. Based on our assessment, in our opinion, expert judgement is the most adequate approach for Altran Portugal in the current context. We analysed past development and evolution projects from Altran Portugal and assessed their estimates. This resulted in the identification of common estimation deviations, errors, and patterns, which led to the proposal of metrics to help estimators produce estimates that leverage quantitative and qualitative information from past projects in a convenient way. This dissertation aims to contribute to more realistic estimates by identifying shortcomings in the current estimation process and by supporting the self-improvement of the process, gathering as much relevant information as possible from each finished project.
Abstract:
Due to progress in embedded technologies, manufacturers are becoming able to pack their shop-floor manufacturing resources with ever more complex functionalities. This technological progression is radically changing the way production systems are designed and deployed, as well as monitored and controlled. The dissemination of smart devices inside production processes gives new visibility into the production system while enabling more efficient and effective management of operations. By turning the functionalities of current manufacturing resources into services based on a Service Oriented Architecture (SOA), so that they are exposed as services to the user, the resource/service pairing will push the visibility of the entire manufacturing enterprise to another level. It will enable global optimisation of the operations and processes of a production system while, at the same time, allowing it to accommodate operational spikes easily and with reduced impact on production. The present work implements a Cloud Manufacturing infrastructure for achieving this resource/service added value, i.e. to facilitate the creation of services that are compositions of currently available atomic services. In this context, manufacturing resource virtualisation (i.e. the formalisation of resource capabilities into services accessible inside and outside the enterprise) and semantic representation/description are the pillars for achieving resource service composition. In conclusion, the present work acts on the manufacturing resource layer, where physical resources and shop-floor capabilities are provided to the user as SaaS (Software as a Service) and/or IaaS (Infrastructure as a Service).
Abstract:
Magdeburg, Univ., Fak. für Informatik, Diss., 2015
Abstract:
Owing to the large number of transistors per mm² found in today's conventional GPUs, these devices have increasingly been used for general-purpose computing in recent years, since they offer higher performance for parallel computation. This project implements the sparse matrix-vector product on OpenCL. The first chapters review the theoretical background needed to understand the problem. We then cover the fundamentals of OpenCL and of the hardware on which the developed libraries will run. The following chapter continues with a description of the kernel code and its data flow. Finally, the software is evaluated through comparisons against the CPU.
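The abstract does not state the sparse storage format or the kernel's work distribution. Purely as a minimal illustrative sketch, assuming CSR (compressed sparse row) storage and a scalar one-work-item-per-row mapping (both assumptions, not necessarily the project's actual design), an OpenCL sparse matrix-vector kernel can look like this:

// Hypothetical sketch: y = A * x with A stored in CSR format.
// Each work-item computes one row of the result vector.
__kernel void spmv_csr(const int num_rows,
                       __global const int   *row_ptr,   // num_rows + 1 entries
                       __global const int   *col_idx,   // column index of each nonzero
                       __global const float *values,    // nonzero values of A
                       __global const float *x,         // dense input vector
                       __global float       *y)         // dense output vector
{
    int row = get_global_id(0);
    if (row < num_rows) {
        float sum = 0.0f;
        for (int j = row_ptr[row]; j < row_ptr[row + 1]; ++j)
            sum += values[j] * x[col_idx[j]];
        y[row] = sum;
    }
}

The scalar mapping is the simplest to write, but it leaves the reads of col_idx and values poorly coalesced on GPUs; variants that assign a group of work-items to each row usually compare more favourably against the CPU for matrices with many nonzeros per row.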
Abstract:
MATE (Monitoring, Analysis and Tuning Environment) is a project that originated in 2004 as Anna Sikora's doctoral thesis, with the purpose of investigating the performance improvement of parallel applications through dynamic modification. Our project is a step forward in terms of software quality and aims to give the MATE project a solid development base for future lines of work. To that end, the problem is addressed from three perspectives: the creation of a development methodology (and its application to the existing project), the deployment of a supporting development environment, and the development of new features to improve portability and usability, among other aspects.
Abstract:
Usability is critical for an interactive software system to be considered successful. Usability testing and evaluation during product development have gained wide acceptance as a strategy to improve product quality. Introducing usability perspectives early in a product is very important in order to provide clear visibility of quality aspects, not only to the developers but also to the testing users. However, usability evaluation and testing are not commonly treated as an essential element of the software development process. This paper therefore presents a proposal to introduce usability evaluation and testing into software development through the reuse of software artifacts. Additionally, it suggests introducing an auditor within the classification of actors for usability tests. It also proposes an improvement of the checklists used for heuristic evaluation, adding quantitative and qualitative aspects to them.
Abstract:
This paper investigates the link between brand performance and cultural primes in high-risk, innovation-based sectors. In the theory section, we propose that the level of cultural uncertainty avoidance embedded in a firm determines its marketing creativity by increasing the complexity and the broadness of a brand. It also determines the rate of the firm's product innovations. Marketing creativity and product innovation finally influence the firm's marketing performance. Empirically, we study trademarked promotion in the Software Security Industry (SSI). Our sample consists of 87 firms from 11 countries that were active in SSI in the period 1993-2000. We use data from the SSI-related trademarks registered by these firms, ending up with 2,911 SSI-related trademarks and a panel of 18,213 observations. We estimate a two-stage model in which we first predict the complexity and the broadness of a trademark, as a measure of marketing creativity, and the rate of product innovations. Among several control variables, our variable of theoretical interest is Hofstede's uncertainty avoidance cultural index. We then estimate trademark duration with a hazard model using the predicted complexity and broadness as well as the rate of product innovations, along with the same control variables. Our evidence confirms that cultural uncertainty avoidance affects the duration of trademarks through the firm's marketing creativity and product innovation.
Abstract:
The main goal of this work was to identify the most important factors affecting the efficiency of the release process. The topic was examined from the viewpoint of a release project manager. The literature review covers the central theories of the software process, service quality, and project management. The empirical material consisted of feedback from customers and from the sales and deployment organisations, together with expert interviews. The case product was operating-room management software resold by a large international company. The most important factors affecting the efficiency of the release process are managing the roadmap and the contents of release packages, keeping project schedules, honest and fast communication towards the sales channel and customers, and well-executed testing. The work reviews example strategies for improving in these areas.
Abstract:
Performance optimisation of a complex computer system requires understanding the system's runtime behaviour. As software grows in size and complexity, performance optimisation becomes an ever more important part of the product development process. With the use of more powerful processors, energy consumption and heat generation have also become increasingly serious problems, especially in small, portable devices. To limit these thermal and energy problems, performance scaling methods have been developed, which further increase system complexity and the need for performance optimisation. In this work, a visualisation and analysis tool was developed to make runtime behaviour easier to understand. In addition, a performance measure was developed that allows different scaling methods to be compared and evaluated independently of the execution environment, based either on an execution trace or on theoretical analysis. The tool presents a trace collected at run time in an easily understandable way. Using three-dimensional graphics, it shows, among other things, the processes, the processor load, the operation of the scaling methods, and the energy consumption. For a user-selected part of the execution view, the tool also produces numerical data containing several relevant performance values and statistics. The applicability of the tool was examined by analysing an execution trace obtained from a real device as well as a simulation of performance scaling. The effect of the scaling mechanism's parameters on the performance of the simulated device was analysed.
Abstract:
The nature of client-server architecture implies that some modules are delivered to customers. These publicly distributed commercial software components are at risk, because users (and, simultaneously, potential malefactors) have physical access to some components of the distributed system. The problem becomes even worse if interpreted programming languages are used to create client-side modules. The Java language, which was designed to be compiled into platform-independent byte-code, is no exception and runs an additional risk. Along with advantages such as verifying the code before execution (to ensure that the program does not perform illegal operations), Java has some disadvantages. At the byte-code stage a Java program still contains comments, line numbers, and other information that can be used for reverse engineering. This Master's thesis focuses on the protection of Java-based client-server applications. I present a mixture of methods to protect software from tortious acts. I then put all the theoretical assumptions into practice and examine their efficiency on examples of Java code. One of the criteria used to evaluate the system is that my product is used in the specialized area of interactive television.
Abstract:
The use of process simulation software has become more common in mapping paper industry processes, and such software has long been among Pöyry Engineering Oy's tools for process design. The goal of this work was defined as surveying the use of process simulation software at Finnish paper mills and assessing the future prospects of process simulation in forest industry engineering services, in order to develop the business. The theory part addresses, among other things, the following questions: what process simulation is, why simulation is done, and what the benefits and challenges of simulation are. It also presents the most common process simulation software in use, how a simulation project proceeds, and requirements for productising process simulation. In the experimental part, the use of process simulation software at Finnish paper mills was surveyed with a questionnaire, which was sent to all of the most important paper mills in Finland. The questionnaire examined, among other things, which programs are used, what has been simulated, what still needs to be simulated, and how necessary process simulation is considered to be. The results show that process simulation has been used at all of the Finnish paper mills that responded. Most simulations have concerned machine lines and stock and water systems. Simulation of energy flows is regarded as the most important future target. The long-term exploitation and maintenance of simulation models needs improvement, and acquiring simulation as a service is the most likely option for the mills. The conclusion is that the mills have a need for process simulation. According to the questionnaire results, the climate is favourable and simulation is seen as a necessary tool. In addition to a separate service product, the marketing of process simulation should be developed so that maintenance of the simulation model continues after the project as a local service. The marketing should be done already in the early phase of a project or during it. From the range of simulation programs, the engineering office should choose the simulation software that suits it best; in special cases, acquiring other programs should be considered in accordance with the customer's wishes.
Abstract:
The goal of the study was to determine the benefits that careful management of an intellectual property rights (IPR) portfolio creates for a company in the software industry. The research material was collected by interviewing people in different positions at three Finnish software product and service companies. The study shows that the IPR portfolios of software companies consist of trade secrets, copyright, trademarks, domain names, and a few patents. Interest in patents in the software industry has grown especially because of the protection they provide, which is stronger than copyright. At the moment, however, attitudes towards software patents in Europe are still in flux. If software patents are accepted, the strategic importance of the IPR portfolio will grow. In that case, portfolio management supports the company's goals, for example securing its own freedom to operate, by assisting in the application process, monitoring the market, and assessing the different ways in which the company's own IPR portfolio can be exploited.
Abstract:
Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes. In this thesis, only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn, and effort estimation accuracy has significantly improved after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but this newly proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design proceed, and thus 'grows up' along with the software project. Developing the effort estimation model is not possible without gathering and analysing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analysing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported, and analysed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results across several successive releases are analysed. It is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity. An example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
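The abstract does not reproduce the model's equations. Purely as a generic illustration of the model-based family it belongs to (not the thesis's own hierarchical model), parametric effort models typically take the form Effort = a * Size^b, where Size is a functional size measure such as function points and the coefficients a and b are calibrated from the organisation's history data. The dynamic element described in the abstract amounts to re-measuring Size at increasing levels of detail as specification and design proceed, so the estimate is refined over the life of the project.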
Abstract:
We investigated the effects of five allyl esters, two aromatic (allyl cinnamate and allyl 2-furoate) and three aliphatic (allyl hexanoate, allyl heptanoate, and allyl octanoate), in established insect cell lines derived from different species and tissues. We studied embryonic cells of the fruit fly Drosophila melanogaster (S2) (Diptera) and the beet armyworm Spodoptera exigua (Se4) (Lepidoptera), fat body cells of the Colorado potato beetle Leptinotarsa decemlineata (CPB) (Coleoptera), ovarian cells of the silkmoth Bombyx mori (Bm5), and midgut cells of the spruce budworm Choristoneura fumiferana (CF203) (Lepidoptera). Cytotoxicity was determined with the use of MTT [3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl tetrazolium bromide] and trypan blue. In addition, we tested the entomotoxic action of allyl cinnamate against the cotton leafworm Spodoptera littoralis. The median (50%) cytotoxic concentrations (EC50s) of the five allyl esters in the MTT bioassays ranged between 0.25 and 27 mM, with significant differences among allyl esters (P = 0.0012), cell lines (P < 0.0001), and the allyl ester-cell line interaction (P < 0.0001). Allyl cinnamate was the most active product, and CF203 the most sensitive cell line. In the trypan blue bioassays, cytotoxicity was produced rapidly and followed the same trend observed in the MTT bioassay. In first instars of S. littoralis, allyl cinnamate killed all larvae at 0.25% in the diet after 1 day, while this happened in third instars after 5 days. The LC50 in first instars was 0.08%. In addition, larval weight gain was reduced (P < 0.05) after 1 day of feeding on diet with 0.05%. In conclusion, the data provide evidence of significant but differential cytotoxicity among allyl esters in insect cells of different species and tissues. Midgut cells show high sensitivity, indicating the insect midgut as a primary target tissue. Allyl cinnamate caused rapid toxic effects in S. littoralis larvae at low concentrations, suggesting further potential for use in pest control.