908 resultados para Capability Maturity Model for Software


Relevância:

100.00% 100.00%

Publicador:

Resumo:

The DNA replication polymerases δ and ϵ have an inherent proofreading mechanism in the form of a 3'→5' exonuclease. Upon recognition of errant deoxynucleotide incorporation into DNA, the nascent primer terminus is partitioned to the exonuclease active site, where the incorrectly paired nucleotide is excised before polymerization resumes. The goal of this project was to identify the cellular and molecular consequences of an exonuclease deficiency. The proofreading capability of model-system MEFs with EXOII mutations was abolished without altering polymerase function.

It was hypothesized that the 3'→5' exonucleases of polymerases δ and ϵ are critical for the prevention of replication stress and important for sensitization to nucleoside analogs. To test this hypothesis, two aims were formulated: determine the effect of the exonuclease active-site mutation on replication-related molecular signaling, and identify the molecular consequences of an exonuclease deficiency when replication is challenged with nucleoside analogs.

Cell cycle studies determined that larger populations of exonuclease-deficient cells are in S-phase. There was an increase in levels of replication proteins, cell population growth, and DNA synthesis capacity without alteration in cell cycle progression. These findings led to studies of proteins involved in checkpoint activation and DNA damage sensing. Finally, collective modifications at the level of DNA replication likely affect the strand integrity of DNA at the chromosomal level.

Gemcitabine, a DNA-directed nucleoside analog, is a substrate of polymerases δ and ϵ and exploits replication to become incorporated into DNA. Though accumulation of gemcitabine triphosphate was similar in all cell types, incorporation into DNA and rates of DNA synthesis were increased in exonuclease-defective cells and were not consistent with clonogenic survival. This led to molecular signaling investigations, which demonstrated an increase in S-phase cells and activation of a DNA damage response upon gemcitabine treatment.

Collectively, these data indicate that the loss of exonuclease results in a replication stress response that is likely required to employ other repair mechanisms to remove unexcised mismatches introduced into DNA during replication. When challenged with nucleoside analogs, this ongoing stress response coupled with repair serves as a resistance mechanism to cell death.


This work was carried out within the framework of the EURECA (Enabling information re-Use by linking clinical REsearch and Care) and INTEGRATE (Integrative Cancer Research Through Innovative Biomedical Infrastructures) projects, in which the Biomedical Informatics Group of the UPM collaborates with other European universities and healthcare institutions. Both projects develop services and infrastructures whose main goal is to store clinical information from diverse sources (such as hospital electronic health records, clinical trials, or biomedical research articles) in a common, easily accessible, and queryable form, so as to support collaborative research across institutions. This is the core idea of the semantic interoperability on which both projects focus, and it is key to the correct operation of their software: data are exchanged through a shared, common, unambiguous representation model in which each clinical concept, term, or data item has a single form of representation. This enables knowledge inference and fits naturally in the context of medical research. Specifically, the tool developed in this work is also aimed at maximizing semantic interoperability: it loads clinical information in a standardized format into a common data model implemented in relational databases. The work was carried out between February 3rd and June 6th, 2014. A waterfall life cycle was followed to organize the project's tasks, so that no phase could start before the previous one had been completed, reviewed, and accepted, with the exception of the documentation task (the writing of this report), which proceeded in parallel with all the others.

ABSTRACT

The project was developed during the second semester of the 2013/2014 academic year, inside the EURECA and INTEGRATE European biomedical research projects, where the GIB (Biomedical Informatics Group) of the UPM works as a partner. Both projects aim to develop platforms and services with the main goal of storing clinical information (e.g. information from hospital electronic health records (EHRs), clinical trials or research articles) in a common, easy-to-access and queryable way, in order to support medical research. The whole software environment of these projects is based on the idea of semantic interoperability, which means the ability of computer systems to exchange data with unambiguous and shared meaning. This idea enables knowledge inference, which fits perfectly in the medical research context. The tool developed in this project is likewise semantic-interoperability-oriented: its purpose is to store standardized clinical information in a common data model, implemented in relational databases. The project was performed between February 3rd and June 6th, 2014. It followed the waterfall model of software development, in which progress flows steadily downwards through the phases and each phase starts only when the previous one has been completed and reviewed. The task of documenting the project's work is an exception; it was performed in parallel with the rest of the tasks.
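The loading step described above, taking clinical facts from heterogeneous sources and storing them under a single shared representation in a relational common data model, can be sketched minimally as follows. The `clinical_fact` table, the column names, and the local-term-to-shared-code map are all hypothetical illustrations; the real projects rely on HL7 structures and full terminology services rather than a hand-written dictionary.

```python
import sqlite3

# Hypothetical common data model: every clinical fact is stored with one
# shared concept code, so all partner institutions query the same schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE clinical_fact (
        patient_id   TEXT,
        concept_code TEXT,   -- shared terminology code (illustrative)
        value        TEXT
    )
""")

# Local source terms are mapped to one shared representation before loading.
LOCAL_TO_SHARED = {
    "tumor size":  "263605001",
    "tumour size": "263605001",   # spelling variant collapses to same code
}

def load_fact(patient_id, local_term, value):
    code = LOCAL_TO_SHARED[local_term.lower()]
    conn.execute("INSERT INTO clinical_fact VALUES (?, ?, ?)",
                 (patient_id, code, value))

load_fact("P001", "Tumor size", "2.3 cm")
load_fact("P002", "Tumour size", "1.1 cm")

rows = conn.execute(
    "SELECT COUNT(*) FROM clinical_fact WHERE concept_code = '263605001'"
).fetchone()
print(rows[0])  # both spelling variants are stored under one concept
```

The point of the sketch is only the normalization-on-load pattern: a query against a single concept code retrieves data that arrived under different local spellings.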


Information technology and its use in companies has long been a subject of debate, since organizations have had both great successes and great failures with it. Many of those failures are associated with the lack of an integral strategic vision of the use of IT in the organization, which has given rise to the area of knowledge called IT governance and to the publication of numerous frameworks and standards. Small and medium enterprises (SMEs) are a sensitive and economically very important sector, both in Uruguay and worldwide, yet they fail to apply these frameworks and standards effectively. For this reason, the final objective of this thesis is to build a framework that allows SMEs to govern and manage IT adequately and obtain the expected value from their IT investments. To reach this general objective, a field study was conducted to assess the state of IT governance and management in Uruguayan SMEs; from this study, the most significant factors preventing the correct application of good IT governance practices in these companies were analyzed. The results led to the construction of an IT governance framework focused on SMEs, the definition of a maturity model associated with the framework, and an implementation guide. In the proposed governance framework, which is compatible with the ISO/IEC 38500:2008 standard, the processes that show structural weaknesses due to the characteristics of SMEs have been strengthened, and those that for the same reasons are not applicable to this type of organization have been reduced or eliminated. Finally, the results were validated in a business environment through a case study. The results obtained, with a consolidated percentage improvement of 46% across the defined set of indicators, suggest that the application of the framework was successful. Since this is a single case study, the results should not be generalized; replicating the study in other companies is an opportunity for future work.

ABSTRACT

Information technology (IT) and its use in the enterprise context is a subject of debate because organizations have had numerous successes and failures with it. Most failed IT projects lack an integral strategic vision of the use of IT in the organization. This fact has given rise to the IT governance (ITG) area of knowledge, and a large number of standards and frameworks have been published around it. Small and medium enterprises (SMEs) are an important and sensitive sector in all economies around the world, particularly in Uruguay, a developing country in South America. Commonly, SMEs cannot apply ITG frameworks successfully because of their intrinsic complexity or because of a lack of knowledge and culture regarding this field, so the objective of this thesis is to build a framework that allows SMEs to govern and manage IT properly, helping enterprises obtain the expected value from their IT investments. First, a field study was conducted to assess the quality of current IT governance and management practices in Uruguayan SMEs. The results of the study provided a diagnosis of the most significant factors that prevent the proper application of good IT governance practices in these companies, and served as inputs to the definition of an IT governance framework focused on SMEs, a maturity model associated with it, and an implementation guide. The proposed framework is compatible with the ISO/IEC 38500 standard, always with an SME vision: sensitive and weak processes have been strengthened, and others have been eliminated because they have no application in this type of organization. Finally, the results were validated in a business environment through a case study. The results obtained, with a consolidated percentage improvement of 46% in the defined set of indicators, suggest that the implementation of the framework was successful. As a single case study, the results should not be generalized, and an opportunity for future work is to replicate the same study in other companies.
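One plausible reading of the "consolidated percentage improvement" reported above is the mean of the per-indicator percentage gains between the baseline and post-implementation measurements. The sketch below shows that aggregation on made-up indicator values; the thesis's actual indicators and aggregation rule are not specified here, so both are assumptions.

```python
# Hypothetical before/after indicator scores from a case study; the
# consolidated improvement is taken as the mean of per-indicator gains.
baseline = {"alignment": 2.0, "risk": 1.5, "value_delivery": 2.5}
after    = {"alignment": 3.0, "risk": 2.4, "value_delivery": 3.2}

def consolidated_improvement(before, post):
    """Mean of per-indicator relative gains, as a percentage."""
    gains = [(post[k] - before[k]) / before[k] for k in before]
    return 100.0 * sum(gains) / len(gains)

pct = consolidated_improvement(baseline, after)
print(round(pct, 1))  # -> 46.0 with these invented numbers
```

With these invented figures the gains are 50%, 60%, and 28%, whose mean is 46%; any real evaluation would of course use the thesis's own indicator set.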


Given the impact that social networks have on daily life in developed and developing countries, they have come to play a key role in social development. This consideration should focus not only on how people form networks, but also on the tools they use and on how organizations should engage with them to achieve a preferential position with the users who form those networks (whose number grows every day). This work explains the difference between social media and social networks, establishing a clear distinction between the two; it defines what social media are, what must be considered for their use to be successful, and how organizations perceive the competitive advantage they bring to their activities. Once social media are defined, we highlight the importance of considering these new media in company strategies. To do so, we look at the social media ecosystem in general and focus on the brand/company relationship with the user/customer. The inclusion of social media in company strategies, first independently and later in an integrated way, forces companies' business models to adapt over time. We describe the paradigm shift in business models caused by the introduction of social media, the elements and types of business models that can be adopted, and the adaptation of established models to the new ones. We then examine how companies include social media in their strategy through social media planning, starting from what a strategy is and how it should be evaluated. Once the context has been defined (what social media, social networks, business models, and strategies are, together with their characteristics), the functional blocks of social media are defined, with their parallels in company strategy, and certain success factors for their adoption are identified. Up to this point we look at the organization in isolation; within the market in which organizations operate, however, they must be assessed on the degree to which social media are embedded in their operations, so that their level of adoption can be compared with that of other organizations. With this objective, we develop a Social Media Maturity Model (SMMM or SM3) theoretically. How can this model be validated realistically? Based on the case study method, an analysis of different organizations indicates how closely the proposed maturity model matches reality.

ABSTRACT

Considering the impact that social networks have on daily life in developed and developing countries, they have come to play a key role in this social development. This consideration should focus not only on how people set up networks, but also on the tools they use and on how organizations should address these media to achieve a preferential position with the users who form the networks (considering their increasing number every day). This work explains the difference between social media and social networking, establishing a clear distinction between them; it defines what social media are, what must be considered for their use to be successful, and how organizations perceive the competitive edge they bring to their activities. Once these are defined, we highlight the importance of considering these new media in company strategies. For this, we look at the social media ecosystem in general and focus on the brand-company relationship with the user/client. The inclusion of social media in company strategies, first independently and later in an integrated way, makes the business models of companies evolve over time. The paradigm shift in business models caused by the introduction of social media is described, together with the elements and types of business models that can be adopted and the adaptation of established models to the new ones. After that, it is shown how companies include social media in their strategy through social media planning, building on what a strategy is and how it should be evaluated. Once the context is defined (what social media, social networking, business models, and strategies are, together with their features), the functional blocks of social media are defined, with their parallels in company strategy, and specific success factors are indicated. So far, we have looked at the organization individually; within the market in which they operate, however, organizations must be evaluated on the degree of development of social media in their operations, so that they can be compared with other organizations with regard to their degree of implementation. With this goal, we develop a maturity model for social media (Social Media Maturity Model, SMMM or SM3) theoretically. How can the model be validated realistically? Based on the case study method, the analysis of different organizations indicates how closely the proposed maturity model matches actual practice.
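An assessment like the SM3 can be sketched as a scoring function over the functional blocks of social media. Everything in the sketch below is an assumption for illustration: the block names follow Kietzmann's honeycomb (the thesis may define its own blocks), the 0-5 scale and level names echo CMM-style conventions, and the weakest-block rule (the lowest-rated block caps the organization's level) is just one common aggregation choice.

```python
# Hypothetical functional blocks, following Kietzmann's honeycomb.
BLOCKS = ["identity", "conversations", "sharing", "presence",
          "relationships", "reputation", "groups"]

# Illustrative CMM-style level names for a 0-5 rating scale.
LEVEL_NAMES = {0: "absent", 1: "initial", 2: "managed",
               3: "defined", 4: "measured", 5: "optimized"}

def maturity_level(scores):
    """Weakest-block convention: the lowest block rating caps the level."""
    missing = set(BLOCKS) - set(scores)
    if missing:
        raise ValueError(f"unrated blocks: {sorted(missing)}")
    return min(scores[b] for b in BLOCKS)

ratings = {b: 3 for b in BLOCKS}
ratings["reputation"] = 1        # one weak block drags the level down
level = maturity_level(ratings)
print(level, LEVEL_NAMES[level])  # -> 1 initial
```

The weakest-block rule makes gaps visible: an organization strong in six blocks but weak in one is rated by its weakness, which is how staged maturity models typically force improvement priorities.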


This work falls within the INTEGRATE and EURECA projects, whose objective is the development of a semantic interoperability layer that enables the integration of clinical data and research, providing a common platform that can be integrated in different clinical institutions and that facilitates the exchange of information among them. In this way, the improvement of clinical practice is promoted through cooperation between research institutions with common objectives. The projects make use of existing clinical standards and vocabularies, such as HL7 and SNOMED, adapting them to the particular needs of the data handled in INTEGRATE and EURECA. Clinical data are represented so that each concept used is unique, avoiding ambiguities and supporting the idea of a common platform. The student has been part of a team in the Biomedical Informatics Group of the UPM, which is in turn one of the partners of the European projects named above. The tool developed aims to homogenize the information stored in the projects' databases using the normalization mechanisms provided by the SNOMED CT medical vocabulary. The normalized databases will be the ones used for queries through services provided in the interoperability layer, since they contain more precise and complete information than the non-normalized databases. The work was carried out between September 12th, 2014, when the training and information-gathering stage began, and January 5th, 2015, when the writing of the report was finished. The life cycle used was the waterfall model, in which tasks do not begin until the immediately preceding stage has been finished and validated. However, not all tasks followed this model, since the writing of the report was carried out in parallel with the rest. The total number of hours devoted to this final-year project is 324. The tasks performed and the time devoted to each are detailed below:

- Training. Gathering and studying the information needed to implement the tool [30 hours].
- Requirements specification. Documenting the requirements the tool must meet [20 hours].
- Design. Making the design decisions for the tool [35 hours].
- Implementation. Developing the tool's code [80 hours].
- Testing. Validating the tool, both independently and integrated in the INTEGRATE and EURECA projects [70 hours].
- Debugging. Correcting errors and introducing improvements [45 hours].
- Report. Writing the final report [44 hours].

ABSTRACT

This project belongs to the semantic interoperability layer developed in the European projects INTEGRATE and EURECA, which aims to provide a platform to promote the interchange of medical information from clinical trials to clinical institutions, so that research institutions may cooperate to enhance clinical practice. Different health standards and clinical terminologies have been used in both the INTEGRATE and EURECA projects, e.g. HL7 or SNOMED CT, adapted to the projects' data requirements. Clinical data are represented by unique concepts, avoiding ambiguity problems. The student has been working in the Biomedical Informatics Group of the UPM, a partner of the INTEGRATE and EURECA projects. The tool developed performs homogenization tasks over the information stored in the databases of the projects, through the normalized representation provided by the SNOMED CT terminology. Queries are executed against the normalized version of the databases, since the information retrieved is more precise and complete than in the non-normalized databases. The project was performed from September 12th, 2014, when the initiation stage began, to January 5th, 2015, when the final report was finished. The waterfall model of software development was followed: a phase may not start before the previous one has finished and been validated, except for the writing of the final report, which was carried out in parallel with the other phases. The tasks performed and the time devoted to each one are detailed as follows:

- Training. Gathering the necessary information to develop the tool [30 hours].
- Software requirements specification. Requirements the tool must fulfil [20 hours].
- Design. Decisions on the design of the tool [35 hours].
- Implementation. Tool development [80 hours].
- Testing. Tool evaluation within the framework of the INTEGRATE and EURECA projects [70 hours].
- Debugging. Correcting errors and improving efficiency [45 hours].
- Documenting. Final report elaboration [44 hours].
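The homogenization pass described above, rewriting free-text terms in an existing database into their normalized SNOMED CT representation, can be sketched as follows. The `diagnosis` table and its columns are hypothetical, and the small synonym map stands in for a real SNOMED CT lookup service; 22298006 is the SNOMED CT concept commonly cited for myocardial infarction, used here purely as an example.

```python
import sqlite3

# Toy synonym map standing in for a real SNOMED CT terminology service.
SNOMED_MAP = {
    "heart attack": "22298006",
    "myocardial infarction": "22298006",
    "mi": "22298006",
}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE diagnosis (patient_id TEXT, term TEXT, code TEXT)")
conn.executemany("INSERT INTO diagnosis (patient_id, term) VALUES (?, ?)",
                 [("P1", "Heart attack"), ("P2", "MI"),
                  ("P3", "Myocardial infarction")])

def normalize():
    """Fill the code column by mapping each free-text term to its concept."""
    rows = conn.execute("SELECT patient_id, term FROM diagnosis").fetchall()
    for pid, term in rows:
        code = SNOMED_MAP.get(term.lower())
        if code is not None:
            conn.execute("UPDATE diagnosis SET code = ? WHERE patient_id = ?",
                         (code, pid))

normalize()
codes = {row[0] for row in conn.execute("SELECT code FROM diagnosis")}
print(codes)  # all three textual variants share one SNOMED CT concept
```

After the pass, a query for the single concept code retrieves all three patients, which is exactly why the normalized databases answer interoperability-layer queries more completely than the raw ones.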


This thesis focuses on solving combinatorial optimization problems using the technological options currently offered by information and communication technologies and by operations research. Combinatorial optimization problems are generally solved with linear programming and metaheuristics. Applying these techniques demands a heavy computational load, and the algorithms must be designed with two concerns in mind: effectiveness in finding good solutions to the problem, and adequate use of the available computing resources. Linear programming and metaheuristics are generic resolution techniques that can be applied to different problems, starting from a common base that is particularized for each concrete problem. In software development, frameworks serve the same purpose: a project starts with the general work already available and can change or extend that generic base behavior to build the concrete system, which reduces development time and improves the project's chances of success. Two development frameworks have been built in this thesis. The ILP framework allows linear programming problems to be modeled and solved independently of the linear programming solver used. The LME framework solves combinatorial optimization problems with metaheuristics. Traditionally, applications for solving combinatorial optimization problems have been desktop applications that manage all the input information and solve the problem locally with the available hardware. Recently, a new paradigm for deploying and using applications has appeared that allows specialized computing resources to be shared over the Internet: cloud computing, with its software-as-a-service (SaaS) model. In this thesis, a SaaS platform for solving combinatorial optimization problems has been built. It is deployed on architectures composed of multi-core processors and graphics cards, and it provides resolution algorithms based on the linear programming and metaheuristics frameworks. The whole infrastructure is independent of the combinatorial optimization problem to be solved, and three problems, selected for their practical importance, have been fully integrated into the SaaS platform. The first problem addressed in the thesis is the vehicle routing problem (VRP), which consists of computing the least-cost routes for a fleet of vehicles that delivers goods to all customers. Starting from the most classical version of the problem, studies were carried out in two directions. On the one hand, the speed-up obtained by solving the problem on graphics cards was quantified. On the other hand, the impact on execution speed and solution quality was studied when linear programming is introduced into the ant colony optimization (ACO) metaheuristic to optimize the individual route of each vehicle. This problem was developed with the ILP and LME frameworks and is available on the SaaS platform. The second problem addressed is the fleet assignment problem (FAP), which consists of creating the least-cost routes for the vehicle fleet of a passenger transport company. A new problem model was defined that encompasses characteristics of problems presented in the literature and adds new ones, making it possible to model the requirements of today's passenger transport companies. This new model solves, in an integrated way, the problems of defining trip timetables, assigning vehicle types, and creating vehicle rotations. A linear programming model was created for the problem, which was solved both by linear programming and by ant colony optimization (ACO). This problem was developed with the ILP and LME frameworks and is available on the SaaS platform. The last problem addressed is the tactical workforce planning problem (TWFP), which consists of defining the least-cost staff configuration to cover a variable workload. A problem model that is very flexible in the definition of contracts was defined, allowing it to be used in diverse productive sectors. A linear programming model was defined to represent the problem, and a series of use cases was defined to show the versatility of the problem model and to simulate the decision-making process of configuring a workforce, economically quantifying each decision taken. This problem was developed with the ILP framework and is available on the SaaS platform.

ABSTRACT

The thesis focuses on solving combinatorial optimization problems using the current technological options offered by information and communication technologies and by operations research. Combinatorial optimization problems are generally solved by linear programming and metaheuristics. Applying these techniques requires a high computational load, and the algorithms must be designed both to find good solutions to the problem and to make proper use of the available computing resources. Linear programming and metaheuristics are generic resolution techniques that can be applied to different problems, starting from a common base that is particularized for each specific problem. In the field of software development, frameworks fulfill this function: a project starts with the overall work already available, with the option to change or extend the generic base behavior to build the concrete system, thus reducing development time and expanding the project's chances of success. In this thesis, two development frameworks have been designed and implemented. The ILP framework allows modeling and solving linear programming problems, regardless of the linear programming solver used. The LME framework is designed for solving combinatorial optimization problems using metaheuristics. Traditionally, applications for solving combinatorial optimization problems have been desktop applications that manage all the input information and solve the problem locally, using the available hardware resources. Recently, a new deployment paradigm has appeared that allows hardware and software resources to be shared over the Internet: cloud computing, which presents the software-as-a-service (SaaS) model. In this thesis, a SaaS platform has been built for solving combinatorial optimization problems. It is deployed on architectures composed of multi-core processors and graphics cards, and it provides algorithms based on the metaheuristics and linear programming frameworks. The SaaS infrastructure is independent of the combinatorial optimization problem to solve, and three problems, selected for their practical importance, are fully integrated into the SaaS platform. The first problem discussed in the thesis is the vehicle routing problem (VRP), whose goal is to compute the least-cost routes for a fleet of vehicles that distributes goods to all customers. The VRP was studied in two directions. On one hand, the increase in execution speed when the problem is solved on graphics cards was quantified. On the other hand, the impact on execution speed and solution quality was studied when the problem is solved by the ant colony optimization (ACO) metaheuristic and linear programming is introduced to optimize the individual routes of each vehicle. This problem was developed with the ILP and LME frameworks and is available on the SaaS platform. The second problem addressed is the fleet assignment problem (FAP), whose goal is to create least-cost routes for the fleet of a passenger transport company. A new problem model was defined that includes features of problems presented in the literature and adds new ones, allowing the business requirements of today's transport companies to be modeled. This new integrated model solves the problems of defining trip timetables, assigning vehicle types, and creating vehicle rotations. The problem was solved by linear programming and by ACO, was developed with the ILP and LME frameworks, and is available on the SaaS platform. The last problem discussed is the tactical workforce planning problem (TWFP), which consists of defining the least-cost staff configuration to cover a given variable workload. A problem model that is very flexible in the definition of contracts was defined, allowing its use in various productive sectors. A linear programming model was defined to represent the problem, and a series of use cases was defined to show the versatility of the problem model and to simulate the decision-making process of setting up a workforce, economically quantifying every decision made. This problem was developed with the ILP framework and is available on the SaaS platform.
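To make the VRP concrete: the thesis solves it with an ACO metaheuristic combined with linear programming on GPUs, but the structure of the problem itself, routes that start and end at a depot and must respect vehicle capacity, can be illustrated with a much simpler greedy nearest-neighbor construction. The toy instance below (coordinates, demands, capacity) is invented, and this baseline is in no way the thesis's algorithm.

```python
import math

# Toy capacitated VRP instance: index 0 is the depot.
coords = [(0, 0), (0, 4), (3, 0), (3, 4), (6, 2)]
demand = [0, 2, 2, 3, 3]
CAPACITY = 5

def dist(a, b):
    return math.dist(coords[a], coords[b])

def nearest_neighbor_routes():
    """Greedily build routes: always visit the closest feasible customer."""
    unserved = set(range(1, len(coords)))
    routes = []
    while unserved:
        route, load, here = [], 0, 0   # each route starts at the depot
        while True:
            feasible = [c for c in unserved if load + demand[c] <= CAPACITY]
            if not feasible:
                break                   # vehicle full: return to depot
            nxt = min(feasible, key=lambda c: dist(here, c))
            route.append(nxt)
            load += demand[nxt]
            unserved.discard(nxt)
            here = nxt
        routes.append(route)
    return routes

routes = nearest_neighbor_routes()
total = sum(demand[c] for r in routes for c in r)
print(routes, total)   # every customer served, capacity never exceeded
```

A metaheuristic such as ACO explores many such constructions stochastically and, as studied in the thesis, each vehicle's individual route can then be re-optimized exactly with linear programming.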


The objective of this work is to analyze the impact of Integrated Management Systems (SGI) on organizational performance from the perspective of the Triple Bottom Line (TBL), verifying whether their implementation helps a company become more sustainable. The multi-method approach used is divided into three parts. The first comprises a systematic literature review based on a bibliometric approach. The database chosen for selecting the articles that make up the sample was ISI Web of Knowledge (Web of Science). The analyses conducted point to gaps in the literature to be researched in order to relate the integration of management systems as a means for organizations to become more sustainable, thereby supporting the elaboration of a theoretical model and of the research hypotheses. The partial results obtained highlight the scarcity of studies in this area, especially studies addressing the social dimension of the Triple Bottom Line. Gaps were also identified in the literature regarding the analysis of the impact of adopting these normative approaches on organizational performance. The second stage of the methodology consists of multiple case studies in companies from different sectors that have implemented management systems in an integrated way. The results obtained show that certification helps in the development of sustainable actions, resulting in positive economic, environmental and social impacts. In this stage, the model and the hypotheses raised in the bibliometric approach were tested. The third stage of the methodology consists of statistical analyses of secondary data extracted from the magazine Exame "Maiores e Melhores". The companies' data for 2014 were processed with the MINITAB 17® software. Using Mood's median test, the samples were tested and showed statistically significant differences in company performance across sectors.
In general, companies with an SGI show better economic performance than the others. With the same database, using structural equation modelling and the Smart PLS 2.0 software, a path diagram was created relating the SGI construct to performance variables (Indebtedness, Profitability, Equity, Growth and Return). The structural equation model tested showed strength for the relationships between SGI and Indebtedness, Profitability, Equity and Growth. The different methodologies presented helped answer the hypothesis and affirm, based on the sample of this work, that an SGI leads companies to better economic, environmental and social performance (based on the TBL).
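Mood's median test, which the study runs in MINITAB, pools all samples, splits each sample at the grand median, and applies a chi-square test to the resulting above/below counts. A minimal standard-library sketch follows; the two performance samples are made-up illustrative figures, not data from the study.

```python
# Mood's median test: do the samples share a common median?
from statistics import median

def mood_median_test(*samples):
    pooled = [x for s in samples for x in s]
    grand = median(pooled)
    # Contingency table: counts above / not above the grand median per sample.
    above = [sum(1 for x in s if x > grand) for s in samples]
    below = [len(s) - a for s, a in zip(samples, above)]
    n, tot_above, chi2 = len(pooled), sum(above), 0.0
    for a, b in zip(above, below):
        m = a + b                       # sample size
        for obs, tot in ((a, tot_above), (b, n - tot_above)):
            exp = m * tot / n           # expected count under independence
            if exp:
                chi2 += (obs - exp) ** 2 / exp
    return chi2                         # compare with chi-square, df = k - 1

# Hypothetical profitability figures for companies with / without an SGI.
with_sgi = [12.1, 14.3, 15.0, 13.8, 16.2, 14.9]
without = [9.8, 11.2, 10.5, 12.0, 9.9, 10.8]
stat = mood_median_test(with_sgi, without)
significant = stat > 3.841              # 5% critical value for df = 1
```

In practice `scipy.stats.median_test` does the same computation and also returns a p-value; the version above is kept dependency-free.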

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This chapter reports on a framework that has been successfully used to analyze the e-business capabilities of an organization with a view to developing its e-capability maturity levels. This should be the first stage of any systems development project. The framework has been used widely within start-up companies and well-established companies, both large and small, and has been deployed in the service and manufacturing sectors. It has been applied by practitioners and consultants to help improve e-business capability levels, and by academics for teaching and research purposes at graduate and undergraduate levels. This chapter provides an account of the unique e-business planning and analysis framework (E-PAF) and demonstrates how it works via an abridged version of a case study (selected from the hundreds that have been produced). This includes a brief account of the three techniques that are integrated to form the analysis framework: quality function deployment (QFD) (Akao, 1972), the balanced scorecard (BSC) (Kaplan & Norton, 1992), and value chain analysis (VCA) (Porter, 1985). The case study extract is based on an online community and dating agency service, identified as VirtualCom, which was produced through a consulting assignment with the founding directors of that company and has not been published previously. It has been chosen because it gives a concise, comprehensive example from an industry that is relatively easy to relate to.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The paper analyses the reengineering concept as it comes from the software engineering and management fields. We combine the two viewpoints and apply them to the problem of reengineering a distance study system in general, and a unit of learning in particular. We propose a framework for reengineering a unit of learning, based on the general model of software reengineering, and present a case study in which we describe how one topic of a distance study course was reengineered, considering the triple consistency principle and the requirements of computer science. The proposed framework contributes to increasing the quality, effectiveness and systematization of delivering distance studies.

Relevância:

100.00% 100.00%

Publicador:

Resumo:

One of the reasons for using variability in the software product line (SPL) approach (see Apel et al., 2006; Figueiredo et al., 2008; Kastner et al., 2007; Mezini & Ostermann, 2004) is to delay a design decision (Svahnberg et al., 2005). Instead of deciding in advance what system to develop, with the SPL approach a set of components and a reference architecture are specified and implemented (during domain engineering, see Czarnecki & Eisenecker, 2000), out of which individual systems are composed at a later stage (during application engineering, see Czarnecki & Eisenecker, 2000). By postponing the design decisions in such a manner, it is possible to better fit the resultant system to its intended environment, for instance, to allow the selection of the system interaction mode to be made after the customers have purchased particular hardware, such as a PDA vs. a laptop. Such variability is expressed through variation points, which are locations in a software-based system where choices are available for defining a specific instance of a system (Svahnberg et al., 2005). Until recently it had sufficed to postpone committing to a specific system instance until just before system runtime. In recent years, however, the use and expectations of software systems in human society have undergone significant changes. Today's software systems need to be always available, highly interactive, and able to continuously adapt according to varying environment conditions, user characteristics and the characteristics of other systems that interact with them. Such systems, called adaptive systems, are expected to be long-lived and able to undertake adaptations with little or no human intervention (Cheng et al., 2009). Therefore, variability now needs to be present also at system runtime, which leads to the emergence of a new type of system: adaptive systems with dynamic variability.
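A runtime variation point of the kind described above (interaction mode chosen after the hardware is known, and changeable while the system runs) can be sketched with a rebindable strategy slot. The `PdaMode`/`LaptopMode` names are illustrative, not taken from the chapter.

```python
from typing import Protocol

class InteractionMode(Protocol):
    def render(self, text: str) -> str: ...

class PdaMode:
    def render(self, text: str) -> str:
        return text[:20]                 # small screen: truncate output

class LaptopMode:
    def render(self, text: str) -> str:
        return text                      # full screen: show everything

class App:
    """The `mode` attribute is the variation point: it is bound at runtime
    and can be rebound later, giving dynamic variability."""
    def __init__(self, mode: InteractionMode):
        self.mode = mode
    def show(self, text: str) -> str:
        return self.mode.render(text)

app = App(LaptopMode())
out1 = app.show("A fairly long status message")
app.mode = PdaMode()                     # adapt without redeploying the system
out2 = app.show("A fairly long status message")
```

Binding the choice this late contrasts with classic SPL practice, where the same decision would be fixed during application engineering, before the system starts.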

Relevância:

100.00% 100.00%

Publicador:

Resumo:

The increasing use of model-driven software development has renewed emphasis on using domain-specific models during application development. More specifically, there has been emphasis on using domain-specific modeling languages (DSMLs) to capture user-specified requirements when creating applications. The current approach to realizing these applications is to translate DSML models into source code using several model-to-model and model-to-code transformations. This approach is still dependent on the underlying source code representation and only raises the level of abstraction during development. Experience has shown that developers will many times be required to manually modify the generated source code, which can be error-prone and time consuming.

An alternative to the aforementioned approach involves using an interpreted domain-specific modeling language (i-DSML) whose models can be directly executed using a Domain-Specific Virtual Machine (DSVM). Direct execution of i-DSML models requires a semantically rich platform that reduces the gap between the application models and the underlying services required to realize the application. One layer in this platform is the domain-specific middleware that is responsible for the management and delivery of services in the specific domain.

In this dissertation, we investigated the problem of designing the domain-specific middleware of the DSVM to facilitate the bifurcation of the semantics of the domain and the model of execution (MoE) while supporting runtime adaptation and validation. We approached our investigation by seeking solutions to the following sub-problems: (1) How can the domain-specific knowledge (DSK) semantics be separated from the MoE for a given domain? (2) How do we define a generic model of execution (GMoE) of the middleware so that it is adaptable and realizes DSK operations to support delivery of services? (3) How do we validate the realization of DSK operations at runtime?

Our research into the domain-specific middleware was done using an i-DSML for the user-centric communication domain, Communication Modeling Language (CML), and one for the microgrid energy management domain, Microgrid Modeling Language (MGridML). We have successfully developed a methodology to separate the DSK and GMoE of the middleware of a DSVM that supports specialization for a given domain and is able to perform adaptation and validation at runtime.
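The DSK/GMoE separation can be illustrated with a toy executor: the generic part knows only how to dispatch and validate operations, while the domain semantics are supplied as a pluggable table. The communication-domain operations below are invented for illustration and are not CML's actual operation set.

```python
class GMoE:
    """Generic model of execution: knows *how* to run and validate steps,
    not *what* they mean in any particular domain."""
    def __init__(self, dsk):
        self.dsk = dsk                    # domain-specific knowledge, plugged in

    def execute(self, op, *args):
        handler = self.dsk.get(op)
        if handler is None:
            raise ValueError(f"unknown domain operation: {op}")
        result = handler(*args)
        self.validate(op, result)         # runtime validation hook
        return result

    def validate(self, op, result):
        # Generic check; a real middleware would consult domain constraints.
        assert result is not None, f"{op} produced no result"

# Hypothetical DSK table for a user-centric communication domain (cf. CML).
comm_dsk = {
    "open_session": lambda a, b: {"from": a, "to": b, "media": []},
    "add_media": lambda s, m: {**s, "media": s["media"] + [m]},
}

vm = GMoE(comm_dsk)
session = vm.execute("open_session", "alice", "bob")
session = vm.execute("add_media", session, "video")
```

Swapping `comm_dsk` for a microgrid table (cf. MGridML) would specialize the same executor to another domain, which is the adaptability the dissertation pursues.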

Relevância:

100.00% 100.00%

Publicador:

Resumo:

Introduction: Computer-aided design (CAD) and computer-aided manufacture (CAM) have been developed to fabricate fixed dental restorations more accurately, faster and more cost-effectively than the conventional method. Two main methods exist in dental CAD/CAM technology: the subtractive and additive methods. While the fitting accuracy of both methods has been explored, no study has yet compared the fabricated restoration (CAM output) to its CAD in terms of accuracy. The aim of the present study was to compare the output of various dental CAM routes to a sole initial CAD and establish the accuracy of fabrication. The internal fit of the various CAM routes was also investigated. The null hypotheses tested were: 1) no significant differences are observed between the CAM output and the CAD, and 2) no significant differences are observed between the various CAM routes.

Methods: An aluminium master model of a standard premolar preparation was scanned with a contact dental scanner (Incise, Renishaw, UK). A single CAD was created on the scanned master model (InciseCAD software, V2.5.0.140, UK). Twenty copings were then fabricated by sending the single CAD to a multitude of CAM routes. The copings were grouped (n=5) as: laser-sintered CoCrMo (LS), 5-axis milled CoCrMo (M-CoCrMo), 3-axis milled zirconia (ZAx3) and 4-axis milled zirconia (ZAx4). All copings were micro-CT scanned (Phoenix X-Ray, Nanotom-S, Germany; power: 155 kV, current: 60 µA, 3600 projections) to produce 3-dimensional (3D) models. A novel methodology was created to superimpose the micro-CT scans with the CAD (GOM Inspect software, V7.5SR2, Germany) to indicate inaccuracies in manufacturing. The accuracy in terms of coping volume was explored. The distances from the surfaces of the micro-CT 3D models to the surfaces of the CAD model (CAD deviation) were investigated after creating surface colour deviation maps. Localised digital sections of the deviations (Occlusal, Axial and Cervical) and selected focussed areas were then quantitatively measured using software (GOM Inspect software, Germany). A novel methodology was also explored to digitally align (Rhino software, V5, USA) the micro-CT scans with the master model to investigate internal fit. Fifty digital cross sections of the aligned scans were created. Point-to-point distances were measured at 5 levels at each cross section. The five levels were: Vertical Marginal Fit (VF), Absolute Marginal Fit (AM), Axio-margin Fit (AMF), Axial Fit (AF) and Occlusal Fit (OF).

Results: The volume measurements were summarised as VM-CoCrMo (62.8 mm³) > VZAx3 (59.4 mm³) > VCAD (57 mm³) > VZAx4 (56.1 mm³) > VLS (52.5 mm³), and all were significantly different; CAD deviations were presented as areas with different colour. No significant differences were observed at the internal cervical aspect between all groups of copings. Significant differences (p < …) were observed as follows: … < M-CoCrMo at Internal Occlusal, Internal Axial and External Axial; ZAx3 > ZAx4 at External Occlusal and External Cervical; ZAx3 < ZAx4 at Internal Occlusal; and M-CoCrMo > ZAx4 at Internal Occlusal and Internal Axial. The mean values of AMF and AF were significantly different (p < …), with … M-CoCrMo and CAD > ZAx4. Only the VF of M-CoCrMo was comparable with the CAD internal fit. All VF and AM values were within the clinically acceptable fit (120 µm).

Conclusion: The investigated CAM methods reproduced the CAD accurately at the internal cervical aspect of the copings. However, localised deviations at the axial and occlusal aspects of the copings may suggest the need for modifications in these areas prior to fitting and veneering with porcelain. The CAM groups evaluated also showed different levels of internal fit, thus rejecting the null hypotheses. The novel non-destructive methodologies for CAD/CAM accuracy and internal-fit testing presented in this thesis may be a useful evaluation tool for similar applications.
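The core of the deviation measurements is a point-to-surface distance: for each measured (micro-CT) point, find its distance to the nearest reference (CAD) point and compare against the clinical limit. A minimal sketch follows; the coordinates are invented toy data, not measurements from the study, and a real pipeline would use a spatial index rather than a brute-force minimum.

```python
import math

def nearest_distance(p, reference):
    # Brute-force nearest-neighbour distance from point p to a point cloud.
    return min(math.dist(p, q) for q in reference)

# Toy data in millimetres: a sampled CAD surface and two micro-CT points.
cad_surface = [(x * 0.5, 0.0, 0.0) for x in range(10)]
scan_points = [(1.0, 0.08, 0.0), (2.5, 0.11, 0.0)]

deviations = [nearest_distance(p, cad_surface) for p in scan_points]
mean_dev = sum(deviations) / len(deviations)
within_limit = all(d <= 0.120 for d in deviations)   # 120 µm expressed in mm
```

Tools such as GOM Inspect compute essentially this quantity densely over the whole scan and render it as the colour deviation maps the abstract mentions.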

Relevância:

100.00% 100.00%

Publicador:

Resumo:

With the growing complexity of IT infrastructures and the pervasiveness of Internet of Things (IoT) scenarios, the need emerges for new computational models based on autonomous entities capable of accomplishing high-level goals by interacting with one another, supported by infrastructures such as Fog Computing, for proximity to the data sources, and Cloud Computing, to offer complex back-end analytic services able to deliver results to millions of users. These new scenarios lead to rethinking the way software is designed and developed, from an agile perspective. The activities of developer teams (Dev) should be tightly coupled with the activities of the teams supporting the Cloud (Ops), following new methodologies known today as DevOps. However, given the lack of adequate abstractions at the programming-language level, IoT developers are often led to follow bottom-up development approaches that are frequently inadequate for tackling the complexity of the applications in this field and the heterogeneity of the software components that make them up. Since the monolithic applications of the past appear hard to scale and manage in a multi-tenant Cloud environment, many consider it necessary to adopt a new architectural style, in which an application should be seen as a composition of micro-services, each dedicated to a specific application capability and each under the responsibility of a small team of developers, from problem analysis to deployment and management.

Since no unique, shared definition has yet been reached for microservices and for other concepts emerging from the IoT and the Cloud, nor for languages specialized for this field, the definition of custom metamodels, combined with automatic generation of the glue software towards the infrastructures, could help a development team raise the level of abstraction, encapsulating the implementation details in a company software factory. Thanks to software production systems based on Model Driven Software Development (MDSD), the currently missing top-down approach can be recovered, making it possible to focus attention on the business logic of the applications. The thesis shows an example of this possible approach, starting from the idea that an IoT application is first of all a distributed software system in which the interaction between active components (modelled as actors) plays a fundamental role.
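The actor idea at the heart of the approach above can be sketched minimally: each actor owns a mailbox and a thread, and components interact only by sending messages. The `Aggregator` and the sensor readings are illustrative names and data, not taken from the thesis.

```python
import queue, threading

class Actor:
    """Minimal actor: a mailbox drained by a private thread."""
    def __init__(self):
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _run(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:              # poison pill: stop the actor
                break
            self.receive(msg)

    def receive(self, msg):
        raise NotImplementedError

class Aggregator(Actor):
    """Collects sensor readings; signals when a batch is complete."""
    def __init__(self, batch_size=3):
        self.batch_size = batch_size
        self.readings = []
        self.done = threading.Event()
        super().__init__()               # start the thread last

    def receive(self, msg):
        self.readings.append(msg)
        if len(self.readings) == self.batch_size:
            self.done.set()

agg = Aggregator()
for value in (21.5, 21.7, 21.6):         # a "sensor" sending temperatures
    agg.send(value)
agg.done.wait(timeout=2)
average = sum(agg.readings) / len(agg.readings)
```

Because state is confined to each actor and crossed only by messages, the same design maps naturally onto distributed Fog/Cloud deployments, which is what makes the model attractive for IoT systems.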

Relevância:

100.00% 100.00%

Publicador:

Resumo:

This keynote presentation will report some of our research work and experience on the development and applications of relevant methods, models, systems and simulation techniques in support of different types and various levels of decision making for business, management and engineering. In particular, the following topics will be covered:

- Modelling, multi-agent-based simulation and analysis of the allocation management of carbon dioxide emission permits in China (Nanfeng Liu & Shuliang Li)
- Agent-based simulation of the dynamic evolution of enterprise carbon assets (Yin Zeng & Shuliang Li)
- A framework & system for extracting and representing project knowledge contexts using topic models and dynamic knowledge maps: a big data perspective (Jin Xu, Zheng Li, Shuliang Li & Yanyan Zhang)
- Open innovation: intelligent model, social media & complex adaptive system simulation (Shuliang Li & Jim Zheng Li)
- A framework, model and software prototype for modelling and simulation of deshopping behaviour and how companies respond (Shawkat Rahman & Shuliang Li)
- Integrating multiple agents, simulation, knowledge bases and fuzzy logic for international marketing decision making (Shuliang Li & Jim Zheng Li)
- A Web-based hybrid intelligent system for combined conventional, digital, mobile, social media and mobile marketing strategy formulation (Shuliang Li & Jim Zheng Li)
- A hybrid intelligent model for Web & social media dynamics, and evolutionary and adaptive branding (Shuliang Li)
- A hybrid paradigm for modelling, simulation and analysis of brand virality in social media (Shuliang Li & Jim Zheng Li)
- Network configuration management: attack paradigms and architectures for computer network survivability (Tero Karvinen & Shuliang Li)