736 results for Open Business Model
Abstract:
[ES] The aim of this final-year project (TFG) is to obtain a business model for an IT consultancy for SMEs, focused on the tourism sector, using the customer development methodology together with the well-known canvas method created by Alexander Osterwalder. To this end, surveys were conducted with forty businesses on the island of Gran Canaria. The process consisted of subjecting the working canvas to successive modifications in order to obtain verified hypotheses and build a catalogue of services. The project comprised three stages. In the first stage, twenty-two surveys were conducted, and the results led to modifications in six of the nine blocks that make up the canvas. In the second stage, sixteen surveys were conducted, confirming the changes made in the previous stage and refining aspects of several blocks of the business model. Finally, in the third stage, fourteen surveys were conducted. The results did not significantly alter the business model, so the process was considered complete. The outcome is a validated and verified business model, together with a catalogue of defined services.
Abstract:
This doctoral work gains deeper insight into the dynamics of knowledge flows within and across clusters, unfolding their features, directions and strategic implications. Alliances, networks and personnel mobility are acknowledged as the three main channels of inter-firm knowledge flows, thus offering three heterogeneous measures for analyzing the phenomenon. The interplay between the three channels and the richness of available research methods have allowed for the elaboration of three different papers and perspectives. The common empirical setting is the IT cluster in Bangalore, chosen for its distinctive features as a high-tech cluster and for its steady yearly double-digit growth around the service-based business model. The first paper deploys both a firm-level and a tie-level analysis, exploring the cases of 4 domestic companies and 2 MNCs active in the cluster, according to a cluster-based perspective. The distinction between business-domain knowledge and technical knowledge emerges from the qualitative evidence and is further confirmed by quantitative analyses at tie level. At firm level, the degree of specialization seems to influence the kind of knowledge shared, while at tie level both the frequency of interaction and the governance mode prove to determine differences in the distribution of knowledge flows. The second paper zooms out and considers inter-firm networks; focusing in particular on the role of the cluster boundary, internal and external networks are analyzed in terms of their size, long-term orientation and degree of exploration. The research method is purely qualitative and allows for the observation of the evolving strategic role of the internal network: from exploitation-based to exploration-based. Moreover, a causal pattern is emphasized, linking the evolution and features of the external network to those of the internal network. The final paper addresses the softer, more micro-level side of knowledge flows: personnel mobility.
A social capital perspective is developed here, which considers both employees' acquisition and employees' loss as building inter-firm ties, thus enhancing the company's overall social capital. Negative binomial regression analyses at dyad level test the significant impact of cluster affiliation (cluster firms vs non-cluster firms), industry affiliation (IT firms vs non-IT firms) and foreign affiliation (MNCs vs domestic firms) in shaping the uneven distribution of personnel mobility, and thus of knowledge flows, among companies.
Abstract:
The scope of this project is to study the effectiveness of building information modelling (BIM) in performing life cycle assessment of a building. For the purposes of the study, "Revit", a BIM software package, is used together with Tally, an LCA tool integrated into Revit. The project is divided into six chapters. The first chapter consists of a theoretical introduction to building information modelling and its connection to life cycle assessment. The second chapter describes the characteristics of building information modelling (BIM); in addition, a comparison is made with the traditional architectural, engineering and construction business model and the benefits of shifting to BIM. The third chapter reviews the best-known BIM software available on the market. Chapter four describes life cycle assessment (LCA) in general and then specifically for the purposes of the case study used in the following chapter; the tools available to perform an LCA are also reviewed. Chapter five presents the case study, which consists of a model in a BIM software package (Revit) and the LCA performed by Tally. The last chapter discusses the results obtained, the limitations, and possible future improvements in performing life cycle assessment (LCA) on a BIM model.
Abstract:
This research studies the digital market of Italian start-ups, assessing the steps taken by management across the various stages of the company life cycle. After a theoretical study of the Italian startup ecosystem and of how the approach chosen by the manager within the company changes on the path to success, an empirical study was carried out using questionnaires and interviews. The aim is to understand which figures in a company team should be followed, and with what importance. How essential is the figure of the person who organizes, mediates, and holds responsibility for the work being finished on schedule?
Abstract:
The hERG voltage-gated potassium channel mediates the cardiac I(Kr) current, which is crucial for the duration of the cardiac action potential. Undesired block of the channel by certain drugs may prolong the QT interval and increase the risk of malignant ventricular arrhythmias. Although the molecular determinants of hERG block have been intensively studied, not much is known about its stereoselectivity. Levo-(S)-bupivacaine was the first drug reported to have a higher affinity to block hERG than its enantiomer. This study strives to understand the principles underlying the stereoselectivity of bupivacaine block with the help of mutagenesis analyses and molecular modeling simulations. Electrophysiological measurements of mutated hERG channels allowed for the identification of residues involved in bupivacaine binding and stereoselectivity. Docking and molecular mechanics simulations for both enantiomers of bupivacaine and terfenadine (a non-stereoselective blocker) were performed inside an open-state model of the hERG channel. The predicted binding modes enabled a clear depiction of ligand-protein interactions. Estimated binding affinities for both enantiomers were consistent with electrophysiological measurements. A similar computational procedure was applied to bupivacaine enantiomers towards two mutated hERG channels (Tyr652Ala and Phe656Ala). This study confirmed, at the molecular level, that bupivacaine stereoselectively binds the hERG channel. These results help to lay the foundation for structural guidelines to optimize the cardiotoxic profile of drug candidates in silico.
Abstract:
High concentrations of fluoride naturally occurring in the ground water of the Arusha region of Tanzania cause dental, skeletal and non-skeletal fluorosis in up to 90% of the region's population [1]. Symptoms of this incurable but completely preventable disease include brittle, discolored teeth, malformed bones, and stiff and swollen joints. The consumption of high-fluoride water has also been proven to cause headaches and insomnia [2] and to adversely affect the development of children's intelligence [3, 4]. Although this array of symptoms may significantly impact a society's development and its citizens' ability to work and enjoy a reasonable quality of life, little is offered in the Arusha region in the form of solutions for the poor, those hardest hit by the problem. Multiple defluoridation technologies do exist, yet none are successfully reaching the Tanzanian public. This report takes a closer look at the efforts of one local organization, the Defluoridation Technology Project (DTP), to address the region's fluorosis problem through the production and dissemination of bone char defluoridation filters, an appropriate-technology solution that is proven to work. The goal of this research is to improve the sustainability of DTP's operations and help them reach a wider range of clients so that they may reduce the occurrence of fluorosis more effectively. This was done first through laboratory testing of current products. Results of this testing show a wide range in uptake capacity across batches of bone char, emphasizing the need to modify kiln design in order to produce a more consistent, higher-quality product. The issue of filter dissemination was addressed through the development of a multi-level, customer-funded business model promoting the availability of filters to Tanzanians of all socioeconomic levels.
Central to this model is the recommendation to focus on community managed, institutional sized filters in order to make fluoride free water available to lower income clients and to increase Tanzanian involvement at the management level.
Abstract:
Sensor networks have been an active research area in the past decade due to the variety of their applications. Many research studies have been conducted to solve the problems underlying the middleware services of sensor networks, such as self-deployment, self-localization, and synchronization. With these middleware services in place, sensor networks have grown into a mature technology used as a detection and surveillance paradigm in many real-world applications. The individual sensors are small, so they can be deployed in areas with limited space and make unobstructed measurements in locations that traditional centralized systems would have trouble reaching. However, sensor networks have a few physical limitations that can prevent sensors from performing at their full potential. Individual sensors have a limited power supply, and the wireless band can become very cluttered when multiple sensors try to transmit at the same time. Furthermore, individual sensors have a limited communication range, so the network may not have a 1-hop communication topology, and routing can be a problem in many cases. Carefully designed algorithms can alleviate these physical limitations and allow sensor networks to be utilized to their full potential. Graphical models are an intuitive choice for designing sensor network algorithms. This thesis focuses on a classic application of sensor networks, the detection and tracking of targets. It develops feasible inference techniques for sensor networks using statistical graphical-model inference, binary sensor detection, event isolation and dynamic clustering. The main strategy is to use only binary data for rough global inferences, and then to dynamically form small-scale clusters around the target for detailed computations. This framework is then extended to network topology manipulation, so that it can be applied to tracking in different network topology settings.
Finally, the system was tested in both simulation and real-world environments. The simulations were performed on various network topologies, from regularly distributed to randomly distributed networks. The results show that the algorithm performs well in randomly distributed networks and hence requires minimal deployment effort. The experiments were carried out in both corridor and open-space settings. An in-home fall detection system was simulated with real-world settings: it was set up with 30 Bumblebee radars and 30 ultrasonic sensors driven by TI EZ430-RF2500 boards, scanning a typical 800 sqft apartment. The Bumblebee radars were calibrated to detect the fall of a human body, and the two-tier tracking algorithm was used on the ultrasonic sensors to track the location of elderly residents.
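The two-tier strategy (a coarse global estimate from binary detections only, then a small dynamic cluster around the target for refined computation) can be illustrated with a toy sketch; the grid layout, sensing radius and inverse-distance weighting below are assumptions made for the example, not the parameters used in the thesis.

```python
import numpy as np

# Tier 1: rough global estimate from binary detections only.
# 30 sensors on a regular grid over a 100 x 100 area (illustrative layout).
xs, ys = np.meshgrid(np.linspace(5, 95, 6), np.linspace(10, 90, 5))
sensors = np.column_stack([xs.ravel(), ys.ravel()])
target = np.array([40.0, 60.0])                 # unknown to the network
dist = np.linalg.norm(sensors - target, axis=1)

fired = dist < 25.0                             # assumed sensing radius
rough = sensors[fired].mean(axis=0)             # centroid of firing sensors

# Tier 2: dynamic cluster of the k sensors nearest the rough estimate,
# refined with an inverse-distance weighted centroid of their positions.
k = 5
cluster = np.argsort(np.linalg.norm(sensors - rough, axis=1))[:k]
rng = np.random.default_rng(1)
ranges = dist[cluster] + rng.normal(0, 0.5, k)  # simulated noisy ranges
weights = 1.0 / np.maximum(ranges, 1e-6)
refined = (sensors[cluster] * weights[:, None]).sum(axis=0) / weights.sum()
print("rough:", rough, "refined:", refined)
```

Only the binary `fired` vector needs to be communicated network-wide; the detailed range readings stay within the small cluster, which is what keeps bandwidth and power consumption low.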
Abstract:
BACKGROUND: Patients undergoing laparoscopic Roux-en-Y gastric bypass (LRYGB) often have substantial comorbidities, which must be taken into account to appropriately assess expected postoperative outcomes. The Charlson/Deyo and Elixhauser indices are widely used comorbidity measures, both of which also have revised algorithms based on enhanced ICD-9-CM coding. It is currently unclear which of the existing comorbidity measures best predicts early postoperative outcomes following LRYGB. METHODS: Using the Nationwide Inpatient Sample, patients 18 years or older undergoing LRYGB for obesity between 2001 and 2008 were identified. Comorbidities were assessed according to the original and enhanced Charlson/Deyo and Elixhauser indices. Using multivariate logistic regression, the following early postoperative outcomes were assessed: overall postoperative complications, length of hospital stay, and conversion to open surgery. Model performance for the four comorbidity indices was assessed and compared using C-statistics and the Akaike information criterion (AIC). RESULTS: A total of 70,287 patients were included. Mean age was 43.1 years (SD, 10.8), 81.6 % were female and 60.3 % were White. Both the original and enhanced Elixhauser indices modestly outperformed the Charlson/Deyo index in predicting the surgical outcomes. All four models had similar C-statistics, but the original Elixhauser index was associated with the smallest AIC for all of the surgical outcomes. CONCLUSIONS: The original Elixhauser index is the best predictor of early postoperative outcomes in our cohort of patients undergoing LRYGB. However, differences between the Charlson/Deyo and Elixhauser indices are modest, and each of these indices provides clinically relevant insight for predicting early postoperative outcomes in this high-risk patient population.
Abstract:
We study the labor market effects of realignment in fixed bilateral exchange rates, such as China's peg to the US dollar. We employ the open economy model of de Melo and Robinson to identify the core parameters of the real, trade side of the economy driving the unemployment effects of bilateral exchange rate realignment. A small open economy version of the model is explored analytically, and a large multi-country version numerically. Analytics in the small open economy model show that the unemployment effects of adjusting a bilateral peg hinge on the fractions exported to and imported from the trading partner. A larger fraction exported to, and a smaller fraction imported from, the trading partner make it more likely that revaluation of a trading partner's currency has beneficial effects. Numerics in the large economy model show that Chinese revaluation can generate both positive and negative unemployment effects depending on the underlying parameter values. Adverse unemployment effects can go along with an improving trade balance.
Abstract:
This paper shows that countries characterized by a financial accelerator mechanism may reverse the usual finding of the literature: flexible exchange rate regimes do a worse job of insulating open economies from external shocks. I obtain this result with a calibrated small open economy model that endogenizes foreign interest rates by linking them to the banking sector's foreign currency leverage. This relationship renders exchange rate policy more important compared to the usual exogeneity assumption. I find empirical support for this prediction using the Local Projections method. Finally, a second-order approximation to the model finds larger welfare losses under flexible regimes.
Abstract:
Health Information Exchange (HIE) will play a key part in our nation's effort to improve healthcare. The evidence of HIEs' transformational role in healthcare delivery systems is quite limited. The lack of such evidence led us to explore what exists in the healthcare industry that may provide evidence of the effectiveness and efficiency of HIEs. The objective of the study was to find out how many fully functional HIEs use any measurements or metrics to gauge the impact of HIE on quality improvement (QI) and on return on investment (ROI). A web-based survey was used to determine the number of operational HIEs using metrics for QI and ROI. Our study highlights the fact that only 50 percent of the HIEs who responded use or plan to use metrics. However, 95 percent of respondents believed HIEs improve quality of care, while only 56 percent believed HIE showed a positive ROI. Although operational HIEs present numerous opportunities to demonstrate the business model for improving health care quality, evidence documenting the impact of HIEs is lacking.
Abstract:
This article presents a series of reflections developed by the teaching team on the basis of an analysis carried out within the framework of the La Cátedra Investiga program, with the aim of thinking critically about the teaching supervision process carried out in the Social Work degree program. Teaching supervision is a fundamental learning instance which, starting from the students' own practice, seeks to articulate theory and practice, to consolidate the professional self, and to reveal and question the complexity of professional intervention today. Accordingly, an operational model of teaching supervision is presented.
Abstract:
In recent years, a variety of systems have been developed that export the workflows used to analyze data and make them part of published articles. We argue that the workflows published in current approaches depend on the specific codes used for execution, the specific workflow system used, and the specific workflow catalogs where they are published. In this paper, we describe a new approach that addresses these shortcomings and makes workflows more reusable through: 1) the use of abstract workflows to complement executable workflows, so that they remain reusable when the execution environment differs; 2) the publication of both abstract and executable workflows using standards such as the Open Provenance Model, so that they can be imported by other workflow systems; 3) the publication of workflows as Linked Data, resulting in open, web-accessible workflow repositories. We illustrate this approach using a complex workflow that we re-created from an influential publication describing the generation of 'drugomes'.
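The third point, publishing workflows as Linked Data, can be illustrated with a minimal sketch that serializes the mapping from abstract steps to executable components as Turtle triples; the namespace, step names and property names (`ex:hasStep`, `ex:implementedBy`) are invented placeholders, not actual Open Provenance Model vocabulary.

```python
# Minimal Linked Data sketch: expose a workflow's abstract steps and
# their executable implementations as dereferenceable triples (Turtle).
# All URIs and property names below are hypothetical placeholders.

BASE = "http://example.org/workflow/"   # assumed namespace

steps = [
    # (abstract step, concrete code implementing it) -- hypothetical names
    ("retrieve_structures", "structure_fetcher"),
    ("predict_bindings", "drugome_matcher"),
]

def to_turtle(workflow_id, steps):
    lines = [f"@prefix ex: <{BASE}vocab#> .", ""]
    for abstract, executable in steps:
        lines.append(f"<{BASE}{workflow_id}> ex:hasStep <{BASE}{abstract}> .")
        lines.append(f"<{BASE}{abstract}> ex:implementedBy <{BASE}{executable}> .")
    return "\n".join(lines)

print(to_turtle("drugome_analysis", steps))
```

Keeping the abstract step and its implementation as separate resources is what lets another workflow system swap in a different executable while reusing the published abstract structure.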
Abstract:
Compared to the size of the microfinance market, the number of Microfinance Institutions (MFIs) that are professionally run like commercial banks is still small, and MFIs listed on public stock exchanges are scarcer still. This document focuses on four listed MFIs and reviews their business models and funding sources. It also analyses the market price evolution of the listed shares and investigates whether investors are assigning a premium to the MFIs compared with their respective market indices. Keywords: microfinance institutions, micro-credits, financial institutions, equity, stock exchange.
Abstract:
Technological progress has profoundly changed the way personal data are collected, accessed and used. Those data make possible an unprecedented customization of advertising, which, in turn, is the business model adopted by many of the most successful Internet companies. Yet measuring the value being generated is still a complex task. This paper presents a review of the literature on the subject. It finds that the economic analysis of personal information has so far been conducted from a qualitative perspective, mainly linked to privacy issues. A better understanding of a quantitative approach to this topic is urgently needed.