939 results for process model collection


Relevance: 40.00%

Abstract:

Software engineering researchers are challenged to provide increasingly powerful levels of abstraction to address the rising complexity inherent in software solutions. One development paradigm that places models as abstractions at the forefront of the development process is Model-Driven Software Development (MDSD). MDSD considers models as first-class artifacts, extending engineers' capability to use concepts from the problem domain of discourse to specify appropriate solutions. A key component of MDSD is domain-specific modeling languages (DSMLs), which are languages with focused expressiveness targeting a specific taxonomy of problems. The de facto approach is to first transform DSML models into an intermediate artifact in a high-level language (HLL), e.g., Java or C++, and then execute the resulting code.

Our research group has developed a class of DSMLs, referred to as interpreted DSMLs (i-DSMLs), whose models are directly interpreted by a specialized execution engine with semantics based on model changes at runtime. This execution engine uses a layered architecture and is referred to as a domain-specific virtual machine (DSVM). As the domain-specific model being executed descends the layers of the DSVM, the semantic gap between the user-defined model and the services provided by the underlying infrastructure is closed. The focus of this research is the synthesis engine, the layer of the DSVM that transforms i-DSML models into executable scripts for the next lower layer to process.

The appeal of an i-DSML is constrained because its unique semantics are contained within the DSVM. Existing DSVMs for i-DSMLs exhibit tight coupling between the implicit model of execution and the semantics of the domain, making it difficult to develop DSVMs for new i-DSMLs without a significant investment in resources.

At the onset of this research, only one i-DSML had been created using the aforementioned approach, for the user-centric communication domain. This i-DSML is the Communication Modeling Language (CML) and its DSVM is the Communication Virtual Machine (CVM). A major problem with the CVM's synthesis engine is that the domain-specific knowledge (DSK) and the model of execution (MoE) are tightly interwoven; consequently, subsequent DSVMs would need to be developed from inception, with no reuse of expertise.

This dissertation investigates how to decouple the DSK from the MoE and subsequently produce a generic model of execution (GMoE) from the remaining application logic. This GMoE can be reused to instantiate synthesis engines for DSVMs in other domains. The generalized approach to developing the model synthesis component of i-DSML interpreters utilizes a reusable framework loosely coupled to the DSK in the form of swappable framework extensions.

This approach involves first creating an i-DSML and its DSVM for a second domain, demand-side smart grid (microgrid) energy management, and designing the synthesis engine so that the DSK and MoE are easily decoupled. To validate the utility of the approach, synthesis engines are instantiated using the GMoE and the DSKs of the two aforementioned domains, and an empirical study is performed to support the claim of reduced development effort.
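The decoupling described above can be pictured as a plug-in architecture: a generic synthesis engine reacts to model changes and defers every domain-specific decision to a swappable extension. The sketch below is purely illustrative; the class names GenericSynthesisEngine and DomainSpecificKnowledge and their method signatures are assumptions, not the CVM's actual interfaces.

```python
from abc import ABC, abstractmethod

class DomainSpecificKnowledge(ABC):
    """Swappable framework extension holding the domain-specific knowledge (DSK)."""

    @abstractmethod
    def analyze_model_delta(self, old_model, new_model):
        """Return the domain-level operations implied by a model change."""

    @abstractmethod
    def to_control_script(self, operations):
        """Map domain-level operations onto a script for the next lower DSVM layer."""


class GenericSynthesisEngine:
    """Generic model of execution (GMoE): reacts to model changes at runtime and
    produces executable scripts, independently of any particular domain."""

    def __init__(self, dsk: DomainSpecificKnowledge):
        self.dsk = dsk
        self.current_model = None

    def on_model_change(self, new_model):
        operations = self.dsk.analyze_model_delta(self.current_model, new_model)
        script = self.dsk.to_control_script(operations)
        self.current_model = new_model
        return script  # handed to the lower DSVM layer for execution

# The same engine could then be instantiated per domain, e.g.
# GenericSynthesisEngine(CommunicationDSK()) or GenericSynthesisEngine(MicrogridDSK()),
# where the DSK classes are hypothetical framework extensions.
```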

Relevance: 40.00%

Abstract:

Melanoma is one of the most aggressive types of cancer. It originates from the transformation of melanocytes present at the epidermal/dermal junction of the human skin. It is commonly accepted that melanomagenesis is influenced by the interaction of environmental and genetic factors, as well as by tumor-host interactions. DNA photoproducts induced by UV radiation are, in normal cells, repaired by the nucleotide excision repair (NER) pathway. The prominent role of NER in cancer resistance is well exemplified by patients with Xeroderma Pigmentosum (XP). This disease results from mutations in components of the NER pathway, such as the XPA and XPC proteins. In humans, NER pathway disruption leads to the development of skin cancers, including melanoma. Similar to humans afflicted with XP, Xpa- and Xpc-deficient mice show high sensitivity to UV light, leading to the development of skin cancers, with the exception of melanoma. The Endothelin 3 (Edn3) signaling pathway is essential for the proliferation, survival and migration of melanocyte precursor cells. Excessive production of Edn3 leads to the accumulation of large numbers of melanocytes in the mouse skin, where they are not normally found. In humans, the Edn3 signaling pathway has also been implicated in melanoma progression and metastatic potential. The goal of this study was the development of the first UV-induced melanoma mouse model dependent on the over-expression of Edn3 in the skin. The UV-induced melanoma mouse model reported here is distinguishable from all previously published models by two features: melanocytes are not transformed a priori, and melanomagenesis arises only upon neonatal UV exposure. In this model, melanomagenesis depends on the presence of Edn3 in the skin. Disruption of the NER pathway due to the lack of Xpa or Xpc proteins was not essential for melanomagenesis; however, it enhanced melanoma penetrance and decreased melanoma latency after a single neonatal erythemal UV dose. Exposure to a second dose of UV at six weeks of age did not change the time of appearance or the penetrance of melanomas in this mouse model. Thus, a combination of neonatal UV exposure with excessive Edn3 in the tumor microenvironment is sufficient for melanomagenesis in mice; furthermore, NER deficiency exacerbates this process.

Relevance: 40.00%

Abstract:

In essay 1 we develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant, and specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise factor component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying; furthermore, the covariance of the informational component and the noise component is no longer restricted to be zero. The resultant measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and price changes caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure developed in the first essay. The resultant seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
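As a rough numerical illustration of the informativeness measure defined in essay 2 (the estimation of the time-varying components themselves is not shown; the function and numbers below are assumptions for illustration, not the dissertation's estimator), the ratio can be computed once the two variance components and their covariance are available:

```python
import numpy as np

def price_informativeness(var_info, var_noise, cov):
    """Informational share of total return variance, allowing a non-zero
    covariance between the informational and noise components:
    total_var_t = var_info_t + var_noise_t + 2 * cov_t."""
    var_info, var_noise, cov = map(np.asarray, (var_info, var_noise, cov))
    total_var = var_info + var_noise + 2.0 * cov
    return var_info / total_var

# Hypothetical per-interval estimates of the components:
ratio = price_informativeness(var_info=[0.8, 1.1], var_noise=[0.3, 0.4], cov=[0.05, -0.02])
```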

Relevance: 40.00%

Abstract:

Innovation is a strategic necessity for the survival of today's organizations. The wide recognition of innovation as a competitive necessity, particularly in dynamic market environments, makes it an evergreen domain for research. This dissertation deals with innovation in small Information Technology (IT) firms in India. The IT industry in India has been a phenomenal success story of the last three decades, and today it faces a crucial phase in its history, characterized by the need for fundamental changes in strategies driven by innovation. This study, while motivated by the dynamics of changing times, addresses the research gap on small-firm innovation in Indian IT.

The study addresses three main objectives: (a) the drivers of innovation in small IT firms in India, (b) the impact of innovation on firm performance, and (c) the variation in the extent of innovation adoption in small firms. Product and process innovation were identified as the two most contextually relevant types of innovation for small IT firms. The antecedents of innovation were identified as Intellectual Capital, Creative Capability, Top Management Support, Organization Learning Capability, Customer Involvement, External Networking and Employee Involvement.

The survey method was adopted for data collection, and the unit of study was the firm. Surveys were conducted in 2014 across five South Indian cities. A small firm was defined as one with 10-499 employees. Responses from 205 firms were chosen for analysis, and rigorous statistical analysis was performed to generate meaningful insights. The set of drivers of product innovation (Intellectual Capital, Creative Capability, Top Management Support, Customer Involvement, External Networking, and Employee Involvement) was different from that of process innovation (Creative Capability, Organization Learning Capability, External Networking, and Employee Involvement). Both product and process innovation had a strong impact on firm performance, and firms that adopted a combination of product and process innovation had the highest levels of firm performance. Product innovation and process innovation fully mediated the relationship between all seven antecedents and firm performance.

The results of this study have several important theoretical and practical implications. To the best of the researcher's knowledge, this is the first time that an empirical study of firm-level innovation of this kind has been undertaken in India. A measurement model for product and process innovation was developed, and the drivers of innovation were established statistically. Customer Involvement, External Networking and Employee Involvement are elements of Open Innovation; all three had a strong association with product innovation, and the latter two had a strong association with process innovation. The results showed that the proclivity for Open Innovation is healthy in the Indian context. Practical implications are outlined regarding how firms can organize themselves for innovation, the human talent required for innovation, and the right culture for innovation and open innovation. While some specific examples of possible future studies have been recommended, the researcher believes that the study provides numerous opportunities to further this line of enquiry.

Relevance: 40.00%

Abstract:

The problem of selecting suppliers/partners is a crucial and important part of the decision-making process for companies that intend to perform competitively in their area of activity. The selection of a supplier/partner is a time- and resource-consuming task that involves data collection and a careful analysis of the factors that can positively or negatively influence the choice. Nevertheless, it is a critical process that significantly affects the operational performance of each company. In this work, through the literature review, five broad supplier selection criteria were identified: Quality, Financial, Synergies, Cost, and Production System. Within these criteria, five sub-criteria were also included. Thereafter, a survey was prepared and companies were contacted in order to indicate which factors carry more weight in their decisions when choosing suppliers. After interpreting the results and processing the data, a linear weighting model was adopted to reflect the importance of each factor. The model has a hierarchical structure and can be applied with the Analytic Hierarchy Process (AHP) method or the Simple Multi-Attribute Rating Technique (SMART). The result of the research undertaken by the authors is a reference model that supports decision making in the supplier/partner selection process.
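In its simplest SMART-like form, such a linear weighting model reduces to a weighted sum of normalized criterion scores. The sketch below is only illustrative: the criterion names follow the ones identified in this work, but the weights and supplier scores are invented for the example.

```python
# Illustrative SMART-style linear weighting; weights and scores are hypothetical.
criteria_weights = {          # should sum to 1.0
    "Quality": 0.30,
    "Financial": 0.15,
    "Synergies": 0.15,
    "Cost": 0.25,
    "Production System": 0.15,
}

def supplier_score(scores):
    """Aggregate normalized criterion scores (0-1) into a single value."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

supplier_a = {"Quality": 0.9, "Financial": 0.6, "Synergies": 0.5, "Cost": 0.7, "Production System": 0.8}
supplier_b = {"Quality": 0.7, "Financial": 0.8, "Synergies": 0.6, "Cost": 0.9, "Production System": 0.6}
ranking = sorted({"A": supplier_a, "B": supplier_b}.items(),
                 key=lambda kv: supplier_score(kv[1]), reverse=True)
```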

Relevance: 40.00%

Abstract:

Business Process Management (BPM) is able to organize and frame a company, focusing on the improvement or assurance of performance in order to gain competitive advantage. Although it is believed that BPM improves various aspects of organizational performance, there has been a lack of empirical evidence for this. The purpose of the present study is to develop a model showing the impact of business process management on organizational performance. To accomplish that, the theoretical basis required to identify the elements that constitute BPM and the measures that can evaluate the success of BPM on organizational performance is built through a systematic literature review (SLR). Then, a research model is proposed according to the SLR results. Empirical data will be collected from a survey of large and mid-sized industrial and service companies headquartered in Brazil. A quantitative analysis will be performed using structural equation modeling (SEM) to show whether the direct effects between BPM and organizational performance can be considered statistically significant. Finally, these results and their managerial and scientific implications are discussed. Keywords: Business process management (BPM). Organizational performance. Firm performance. Business models. Structural equation modeling. Systematic literature review.
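As a sketch of how the proposed structural model might eventually be estimated once the survey data are available (the indicator names bpm1..perf3 and the choice of the semopy package are assumptions, not details taken from the study):

```python
import pandas as pd
import semopy  # one possible SEM library for Python, using lavaan-style syntax

# Hypothetical specification: BPM as a latent construct measured by items
# bpm1-bpm3, organizational performance measured by perf1-perf3, and a single
# structural path from BPM to performance.
MODEL_DESC = """
BPM =~ bpm1 + bpm2 + bpm3
Performance =~ perf1 + perf2 + perf3
Performance ~ BPM
"""

def fit_bpm_model(survey_data: pd.DataFrame):
    model = semopy.Model(MODEL_DESC)
    model.fit(survey_data)      # maximum-likelihood estimation by default
    return model.inspect()      # parameter estimates, standard errors, p-values
```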

Relevance: 40.00%

Abstract:

The purpose of the study was to explore how a public IT-services transferor organization, comprised of autonomous entities, can effectively develop and organize its data center cost recovery mechanisms in a fair manner. The lack of a well-defined model for charges and a cost recovery scheme could cause various problems; for example, one entity may end up subsidizing the costs of another. Transfer pricing is in the best interest of each autonomous entity in a CCA. While transfer pricing plays a pivotal role in the price setting of services and intangible assets, transaction cost economics (TCE) focuses on the arrangement at the boundary between entities. TCE is concerned with the cost, autonomy, and cooperation issues of an organization; the theory addresses the factors that influence intra-firm transaction costs and attempts to expose the problems involved in determining the charges or prices of transactions. This study was carried out as a single case study in a public organization. The organization intended to transfer the IT services of its affiliated public entities and was in the process of establishing a municipal joint data center. Nine semi-structured interviews, including two pilot interviews, were conducted with experts and managers of the case company and its affiliated entities. The purpose of these interviews was to explore the charging and pricing issues of intra-firm transactions. In order to process and summarize the findings, this study employed qualitative techniques with multiple methods of data collection. By reviewing TCE theory and a sample of the transfer pricing literature, the study created an IT services pricing framework as a conceptual tool for illustrating the structure of cost transfers. Antecedents and consequences of the transfer price, based on TCE, were developed, and an explanatory fair charging model was eventually developed and suggested. The findings of the study suggested that a chargeback system was an inappropriate scheme for an organization with affiliated autonomous entities. The main contribution of the study was the application of transfer pricing (TP) methodologies in the public sphere, without consideration of tax issues.

Relevance: 40.00%

Abstract:

This research explores the business model (BM) evolution process of entrepreneurial companies and investigates the relationship between BM evolution and firm performance. Recently, it has been increasingly recognised that the innovative design (and re-design) of BMs is crucial to the performance of entrepreneurial firms, as BMs can be associated with superior value creation and competitive advantage. However, there has been limited theoretical and empirical evidence on the micro-mechanisms behind the BM evolution process and the entrepreneurial outcomes of BM evolution. This research seeks to fill this gap by opening up the ‘black box’ of the BM evolution process, exploring the micro-patterns that facilitate the continuous shaping, changing, and renewing of BMs, and examining how BM evolution creates and captures value in a dynamic manner. Drawing together the BM and strategic entrepreneurship literature, this research seeks to understand: (1) how and why companies introduce BM innovations and imitations; (2) how BM innovations and imitations interplay as patterns in the BM evolution process; and (3) how BM evolution patterns affect firm performance. This research adopts a longitudinal multiple case study design that focuses on the emerging phenomenon of BM evolution. Twelve entrepreneurial firms in the Chinese Online Group Buying (OGB) industry were selected for their continuous and intensive development of BMs and their varying success rates in this highly competitive market. Two rounds of data collection were carried out between 2013 and 2014, generating 31 interviews with founders/co-founders and a total of 5,034 pages of data. Following a three-stage research framework, the data analysis begins by mapping the BM evolution process of the twelve companies and classifying the changes in the BMs into innovations and imitations. The second stage focuses on the BM level, addressing BM evolution as a dynamic process by exploring how BM innovations and imitations unfold and interplay over time. The final stage focuses on the firm level, providing theoretical explanations of the effects of BM evolution patterns on firm performance. This research provides new insights into the nature of BM evolution by elaborating on the missing link between BM dynamics and firm performance. The findings identify four patterns of BM evolution that have different effects on a firm’s short- and long-term performance. This research contributes to the BM literature by presenting what the BM evolution process actually looks like. Moreover, it takes a step towards a process theory of the interplay between BM innovations and imitations, which addresses the role of companies’ actions and, more importantly, their reactions to competitors. Insights are also given into how entrepreneurial companies achieve and sustain value creation and capture by successfully combining BM evolution patterns. Finally, the findings on BM evolution contribute to the strategic entrepreneurship literature by increasing the understanding of how companies compete in a more dynamic and complex environment. It reveals that the achievement of superior firm performance is not a simple question of whether to innovate or imitate, but rather one of integrating innovation and imitation strategies over time. This study concludes with a discussion of the findings and their implications for theory and practice.

Relevance: 40.00%

Abstract:

In order to reduce serious health incidents, individuals at high risk need to be identified as early as possible so that effective intervention and preventive care can be provided. This requires regular and efficient assessments of risk within the communities that are the first point of contact for individuals. Clinical Decision Support Systems (CDSSs) have been developed to help with the task of risk assessment; however, such systems and their underpinning classification models are tailored towards those with clinical expertise, while the communities where regular risk assessments are required lack such expertise. This paper presents the continuation of the GRiST research team's efforts to disseminate clinical expertise to communities. Based on our earlier published findings, this paper introduces the framework and skeleton for a data collection and risk classification model that evaluates data redundancy in real time, detects the risk-informative data, and guides risk assessors towards collecting those data. By doing so, it enables non-experts within communities to conduct reliable mental health risk triage.
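One way to picture the real-time redundancy check described above is a loop that, after each answer, re-scores the remaining questions and skips those whose expected contribution to the risk estimate has become negligible. This is a toy sketch under that assumption, not GRiST's actual classification model:

```python
# Illustrative only: adaptive data collection driven by an informativeness score.
def collect_risk_data(questions, informativeness, ask, redundancy_threshold=0.05):
    """questions: iterable of question ids;
    informativeness(question, answers_so_far) -> expected contribution in [0, 1];
    ask(question) -> the assessor's answer for that question."""
    answers = {}
    remaining = set(questions)
    while remaining:
        # Re-score remaining questions in light of what has already been collected.
        scores = {q: informativeness(q, answers) for q in remaining}
        best = max(scores, key=scores.get)
        if scores[best] < redundancy_threshold:
            break  # everything left is redundant for the current risk estimate
        answers[best] = ask(best)
        remaining.discard(best)
    return answers
```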

Relevance: 40.00%

Abstract:

This work deals with the development of calibration procedures and control systems to improve the performance and efficiency of modern spark-ignition turbocharged engines. The algorithms developed are used to optimize and manage the spark advance and the air-to-fuel ratio in order to control knock and the exhaust gas temperature at the turbine inlet. The work falls within the activity that the research group started in previous years with the industrial partner Ferrari S.p.A. The first chapter deals with the development of a control-oriented engine simulator based on a neural network approach, with which the main combustion indexes can be simulated. The second chapter deals with the development of a procedure to calibrate the spark advance and the air-to-fuel ratio offline so that the engine runs under knock-limited conditions and with the maximum admissible exhaust gas temperature at the turbine inlet. This procedure is then converted into a model-based control system and validated with a software-in-the-loop approach using the engine simulator developed in the first chapter. Finally, it is implemented on rapid control prototyping hardware to manage the combustion in steady-state and transient operating conditions at the test bench. The third chapter deals with the study of an innovative and cheap sensor for in-cylinder pressure measurement: a piezoelectric washer that can be installed between the spark plug and the engine head. The signal generated by this kind of sensor is studied, and a specific algorithm is developed to adjust the value of the knock index in real time. Finally, with the engine simulator developed in the first chapter, it is demonstrated that the innovative sensor can be coupled with the control system described in the second chapter and that the resulting performance can match that achievable with standard in-cylinder pressure sensors.
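A minimal sketch of how a control-oriented, neural-network engine simulator could be trained and then queried during offline calibration is given below. The feature layout (operating point plus spark advance as the last input, knock index as the first output) and the use of scikit-learn are assumptions for illustration, not the thesis's implementation:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_engine_simulator(X_operating_points, y_combustion_indexes):
    """X: operating conditions with spark advance as the last column;
    y: combustion indexes, with the knock index assumed to be the first column."""
    model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    model.fit(X_operating_points, y_combustion_indexes)
    return model

def knock_limited_spark_advance(model, base_point, sa_grid, knock_limit):
    """Sweep increasing spark advances and keep the last one whose predicted
    knock index stays below the admissible limit."""
    best = sa_grid[0]
    for sa in sa_grid:
        x = np.append(base_point, sa).reshape(1, -1)
        knock_index = np.atleast_2d(model.predict(x))[0, 0]
        if knock_index > knock_limit:
            break  # knock-limited: stop before this operating point
        best = sa
    return best
```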

Relevance: 40.00%

Abstract:

In the steelmaking industry, galvanizing is a treatment applied to protect steel from corrosion. The air knife effect (AKE) occurs when nozzles emit a stream of air onto the surfaces of a steel strip to remove excess zinc from it. In our work we formalized the problem of controlling the AKE and, with the R&D department of Marcegaglia SpA, implemented a deep learning (DL) model able to drive the AKE, which we call the controller. It takes as input a tuple of the physical conditions of the process line (t, h, s) together with the target value of the zinc coating (c), and generates the expected tuple (pres, dist) needed to drive the mechanical nozzles towards the target coating c. We designed the structure of the network according to the requirements, and we collected and explored the data set of historical data from the smart factory. We designed the loss function as the sum of three components: the error between the coating produced by the network's outputs and the target value we want to reach, and two weighted minimization components for pressure and distance. In our solution we also construct a second module, named the coating net, to predict the zinc coating resulting from the AKE when given conditions are applied to the production line. Its structure combines a linear component and a deep nonlinear "residual" component learned from empirical observations. The predictions made by the coating net are used as ground truth in the loss function of the controller. By tuning the weights of the different components of the loss function, it is possible to train models with slightly different optimization purposes. In our tests we compared different regularization strategies with the standard one under conditions of optimal estimation for both; the overall accuracy is within ±3 g/m² of the target for all of them. Lastly, we analyzed how the controller re-modeled the current solutions with the new logic: the sub-optimal values of pres and dist can be optimized by 50% and 20%, respectively.
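A minimal PyTorch sketch of the three-term loss described above follows. The weights and the exact form of the pressure and distance terms are assumptions (here they simply penalize large actuation values), and coating_net is assumed to be a frozen, pre-trained module predicting the zinc coating from the line conditions and (pres, dist):

```python
import torch

def controller_loss(pred_pres, pred_dist, line_conditions, target_coating,
                    coating_net, w_coating=1.0, w_pres=0.1, w_dist=0.1):
    """Sum of a coating-error term and two weighted terms on pressure and distance."""
    inputs = torch.cat([line_conditions, pred_pres, pred_dist], dim=1)
    predicted_coating = coating_net(inputs)        # coating net acts as a proxy ground truth
    coating_term = torch.mean((predicted_coating - target_coating) ** 2)
    pres_term = torch.mean(pred_pres ** 2)         # one plausible "minimization" term
    dist_term = torch.mean(pred_dist ** 2)
    return w_coating * coating_term + w_pres * pres_term + w_dist * dist_term
```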

Relevance: 40.00%

Abstract:

This report describes the realization of a system in which an object detection model is implemented with the aim of detecting the presence of people in images. Such a system could be used for several applications: for example, it could be carried on board an aircraft or a drone. In this case, the system is designed so that it can be mounted on light/medium-weight helicopters, helping the operator to find people in emergency situations. In the first chapter, the use of helicopters for civil protection is analysed and applications similar to this case study are listed. The second chapter describes the choice of the hardware devices used to implement a prototype of a system to collect, analyse and display images. First, the PC needed to process the images was chosen, based on the characteristics of the algorithms required to run the analysis. Next, a camera compatible with the PC was selected. Finally, the battery pack was chosen taking into account the power consumption of the devices. The third chapter illustrates the algorithms used for image analysis. The fourth briefly analyses some of the regulatory requirements that must be taken into account when carrying all the devices on board. The fifth chapter describes the design and modelling, with the SolidWorks CAD software, of the devices and of a prototype case that will house them. The sixth chapter discusses additive manufacturing, since the case was printed using this technology. The seventh chapter analyses part of the tests that must be carried out on the equipment to certify it, and reports some simulations. The eighth chapter shows the results obtained once the object detection model was loaded onto the image-analysis hardware. The ninth chapter discusses conclusions and future applications.
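Since the report does not specify which detector was adopted, the sketch below only illustrates the kind of inference step involved, using a COCO-pretrained Faster R-CNN from torchvision (version 0.13+ assumed) filtered to the "person" class:

```python
import torch
from torchvision.io import read_image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import convert_image_dtype

def detect_people(image_path, score_threshold=0.5):
    """Return bounding boxes (x1, y1, x2, y2) of detected people in an image."""
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()
    image = convert_image_dtype(read_image(image_path), torch.float)
    with torch.no_grad():
        prediction = model([image])[0]   # dict with 'boxes', 'labels', 'scores'
    keep = (prediction["labels"] == 1) & (prediction["scores"] >= score_threshold)  # COCO label 1 = person
    return prediction["boxes"][keep]
```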