988 results for parallel implementation


Relevance: 20.00%

Publisher:

Abstract:

In a market where companies of similar size and resources compete, it is challenging to gain any advantage over the others. To stay afloat, a company needs the capability to perform with fewer resources and yet provide better service; the development of efficient processes that cut costs and improve performance is therefore crucial. As a business expands, its processes become more complicated, and large amounts of data need to be managed and made available on request. Companies use different tools to store and manage data and thereby support production and transactions. In the modern business world, the most widely used tool for this purpose is the Enterprise Resource Planning (ERP) system. The focus of this research is how competitive advantage can be achieved by implementing a proprietary ERP system: a system created in-house and tailor-made to match and align with the company's business needs and processes. The market is full of ERP software, but choosing the right package is a major challenge. Identifying the key features of processes and data management that need improvement, choosing the right ERP, implementing it, and following up is a long and expensive journey for a company. Some companies prefer to invest in a ready-made package bought from a vendor and adjust it to their own business needs, while others create their own system with in-house IT capabilities. Using a case company, the author identifies and analyzes why the organization in question decided to develop a proprietary ERP system, how the system was implemented, and whether it has been successful. The main conclusion and recommendation of this research is that companies should know their core capabilities and constraints before choosing and implementing an ERP system. Knowledge of the factors that affect the outcome of a system change is important for making the right decisions at the strategic level and implementing them at the operational level. The project in the case company has lasted longer than anticipated; however, projects based on ready-made vendor products have also been reported to be delayed and to run over budget. Overall, the implementation of the proprietary ERP in the case company has been successful, both in terms of business performance figures and in the usability of the system as experienced by employees. For future research, a study that statistically compares the ROI of the two approaches, buying a ready-made product versus creating one's own ERP, would be beneficial.

Relevance: 20.00%

Publisher:

Abstract:

Parallel-connected photovoltaic inverters are required in large solar plants where it is not economically or technically reasonable to use a single inverter. Currently, parallel inverters require individual isolating transformers to cut the path of the circulating current. In this doctoral dissertation, the problem is approached by attempting to minimize the generated circulating current. The circulating current is a function of the common-mode voltages generated by the parallel inverters and can be minimized by synchronizing the inverters. The synchronization has previously been achieved with a communication link; however, in photovoltaic systems the inverters may be located far apart from each other, so a control that is free of communication is desired. It is shown in this dissertation that information on the circulating current can also be obtained from a common-mode voltage measurement. A control method based on a short-time switching-frequency transition is developed and tested in an actual photovoltaic environment of two parallel inverters connected to two 5 kW solar arrays. Controls based on the measurement of the circulating current and of the common-mode voltage are developed and tested. A communication-free method of controlling the circulating current between parallel-connected inverters is thus developed and verified.
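A minimal numerical sketch of the mechanism discussed above: in a simplified model (assumed here, not taken from the dissertation), the circulating current between two parallel two-level inverters is driven by the difference of their common-mode voltages acting on a lumped zero-sequence loop inductance. All parameter values, the PWM scheme, and the carrier phase shift are illustrative assumptions.

```python
import numpy as np
from scipy.signal import sawtooth

# Simplified, assumed model: two ideal two-level PWM inverters share a
# lumped zero-sequence loop inductance; the circulating current is driven
# by the difference of their common-mode voltages. All values illustrative.
Vdc, f_grid, f_sw, L_loop = 700.0, 50.0, 10e3, 2e-3
t = np.arange(0.0, 0.02, 1e-6)             # one 50 Hz fundamental period

def common_mode_voltage(carrier_phase):
    """Common-mode voltage = mean of the three switched pole voltages of a
    sinusoidal-PWM two-level inverter (idealized)."""
    carrier = sawtooth(2 * np.pi * f_sw * t + carrier_phase, width=0.5)  # triangular carrier
    refs = [0.8 * np.sin(2 * np.pi * f_grid * t - k * 2 * np.pi / 3) for k in range(3)]
    poles = [np.where(ref > carrier, Vdc / 2, -Vdc / 2) for ref in refs]
    return np.mean(poles, axis=0)

v_cm_a = common_mode_voltage(0.0)          # inverter A
v_cm_b = common_mode_voltage(np.pi / 2)    # inverter B, carrier not synchronized

# Circulating current ~ (1 / L_loop) * integral of (v_cm_a - v_cm_b) dt
i_circ = np.cumsum(v_cm_a - v_cm_b) * (t[1] - t[0]) / L_loop
print(f"peak circulating current in this model: {np.abs(i_circ).max():.1f} A")
```

In this simplified model, synchronizing the carriers (setting the phase shift to zero) makes the two common-mode voltages identical, so their difference and the modelled circulating current vanish, which is the effect the proposed control aims for.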

Relevance: 20.00%

Publisher:

Abstract:

The dissertation proposes two control strategies, covering trajectory planning and vibration suppression, for a kinematically redundant serial-parallel robot machine, with the aim of attaining satisfactory machining performance. For a given prescribed trajectory of the robot's end-effector in Cartesian space, a set of trajectories in the robot's joint space is generated based on the best stiffness performance of the robot along the prescribed trajectory. To construct the required system-wide analytical stiffness model for the serial-parallel robot machine, a variant of the virtual joint method (VJM) is proposed in the dissertation. The modified method is an evolution of Gosselin's lumped model that can account for the deformations of a flexible link in more directions. The effectiveness of this VJM variant is validated by comparing the computed stiffness results of a flexible link with those of a matrix structural analysis (MSA) method. The comparison shows that the numerical results from both methods for an individual flexible beam are almost identical, which, in some sense, provides mutual validation. The most prominent advantage of the presented VJM variant over the MSA method is that it can be applied to a flexible structural system with complicated kinematics formed by flexible serial links and joints. Moreover, by combining the VJM variant with the virtual work principle, a system-wide analytical stiffness model can easily be obtained for mechanisms with both serial and parallel kinematics. In the dissertation, a system-wide stiffness model of a kinematically redundant serial-parallel robot machine is constructed by integrating the VJM variant and the virtual work principle, and numerical results of its stiffness performance are reported. For a kinematically redundant robot, to generate a set of feasible joint trajectories for a prescribed trajectory of its end-effector, the system-wide stiffness performance is taken as the constraint in the joint-trajectory planning. For a prescribed location of the end-effector, the robot permits an infinite number of inverse solutions, which consequently yield an infinite variety of stiffness performance. Therefore, a differential evolution (DE) algorithm, in which the positions of the redundant joints are taken as input variables, is employed to search for the best stiffness performance of the robot. Numerical results of the generated joint trajectories are given for a kinematically redundant serial-parallel robot machine, the IWR (Intersector Welding/Cutting Robot), for a particular prescribed trajectory of its end-effector. The numerical results show that the joint trajectories generated on the basis of the stiffness optimization are acceptably smooth and therefore feasible to realize in the control system. The results imply that the stiffness performance of the robot machine varies smoothly with the kinematic configuration in the neighbourhood of its best stiffness performance. To suppress the vibration of the robot machine caused by the varying cutting force during the machining process, the dissertation proposes a feedforward control strategy constructed from the derived inverse dynamics model of the target system. The effectiveness of applying such feedforward control to vibration suppression has been validated on a parallel manipulator in a software environment.
The experimental study of such feedforward control is also included in the dissertation. The difficulty of modelling the actual system, caused by unknown components in its dynamics, is noted. As a solution, a back-propagation (BP) neural network is proposed for identifying the unknown components of the dynamics model of the target system. To train such a BP neural network, a modified Levenberg-Marquardt algorithm that can utilize an experimental input-output data set of the entire dynamic system is introduced. The BP neural network and the modified Levenberg-Marquardt algorithm are validated in three different applications of the method: approximation of a sinusoidal output, parameter estimation of a second-order system, and estimation of a friction model of a parallel manipulator.
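To make the joint-trajectory planning step described above concrete, here is a minimal sketch of a DE search over redundant joint positions. The stiffness model is a made-up smooth stand-in (in the dissertation it would come from the VJM-based system-wide model), and the joint limits are assumed; only the search pattern is illustrated.

```python
import numpy as np
from scipy.optimize import differential_evolution

def stiffness_matrix(q_r):
    """Toy stand-in for the Cartesian stiffness matrix K(q_r) at a prescribed
    end-effector pose; an invented smooth function of two redundant joints,
    used only to show the DE search pattern."""
    q1, q2 = q_r
    base = np.diag([2.0e6, 1.5e6, 1.0e6])   # N/m, illustrative
    scale = 1.0 + 0.3 * np.cos(q1) + 0.2 * np.sin(q2) - 0.1 * np.cos(q1 - q2)
    return base * scale

def cost(q_r):
    # Maximize the smallest eigenvalue of K (stiffness in the worst direction)
    # by minimizing its negative.
    return -np.linalg.eigvalsh(stiffness_matrix(q_r)).min()

bounds = [(-np.pi / 2, np.pi / 2)] * 2       # redundant joint limits (assumed)
result = differential_evolution(cost, bounds, seed=1, tol=1e-8)
print("best redundant joint positions [rad]:", np.round(result.x, 3))
print("worst-direction stiffness [N/m]:", -result.fun)
```

Similarly, the identification of unknown dynamics components can be illustrated by fitting a small network to input-output data with Levenberg-Marquardt. The sketch below uses a single-hidden-layer tanh network and SciPy's standard Levenberg-Marquardt solver rather than the modified algorithm of the dissertation, and the friction-like training data are synthetic.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data standing in for measured joint velocity -> unmodelled friction torque.
rng = np.random.default_rng(0)
v = np.linspace(-2.0, 2.0, 200)
tau = 0.8 * np.tanh(10 * v) + 0.3 * v + 0.05 * rng.standard_normal(v.size)

H = 6  # hidden neurons of a single-hidden-layer tanh network

def net(p, x):
    """Evaluate the network for the flattened parameter vector p."""
    w1, b1, w2, b2 = p[:H], p[H:2 * H], p[2 * H:3 * H], p[3 * H]
    return np.tanh(np.outer(x, w1) + b1) @ w2 + b2

def residuals(p):
    return net(p, v) - tau

p0 = 0.1 * rng.standard_normal(3 * H + 1)
fit = least_squares(residuals, p0, method="lm")   # standard Levenberg-Marquardt
print("residual norm after fitting:", np.linalg.norm(fit.fun))
```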

Relevance: 20.00%

Publisher:

Abstract:

In recent years, the phenomenon called crowdsourcing has been acknowledged as an innovative form of value creation that must be taken seriously. Crowdsourcing can be defined as the act of outsourcing tasks, originally performed inside an organization or assigned externally in the form of a business relationship, to an undefined, large and heterogeneous mass of potential actors. This thesis constructs a framework for the successful implementation of crowdsourcing initiatives. Firms that rely entirely on their own research and ideas cannot compete with the innovative capacity of crowd-powered firms. Crowdsourcing has become one of the key capabilities of businesses because of the innovative potential it adds to the firm's existing internal resources. By utilizing crowdsourcing, the business gains access to an enormous pool of competence and knowledge. However, various risks remain, such as uncertainty about the structure of the crowd and the loss of internal know-how. The Crowdsourcing Success Framework introduces a step-by-step model for implementing crowdsourcing in the everyday operations of the business, starting from the decision to utilize crowdsourcing and continuing through planning, organizing, and execution. Finally, the thesis presents the success factors of a crowdsourcing initiative.

Relevance: 20.00%

Publisher:

Abstract:

The main objective of the study was to form a strategic process model and a project management tool to support future IFRS change implementation projects. These results were designed on the basis of the theoretical framework of Total Quality Management and on the facts collected during an empirical case study of the IAS 17 change. The use of a process-oriented approach to implementing IFRS standard changes after the initial IFRS adoption is justified with the following arguments: 1) well-designed process tools lead to an optimized use of resources, and 2) process stages and related tasks make it easy to ensure an efficient way of working and managing the project and to include all necessary stakeholders in the change process. The research follows a qualitative approach and the analysis is descriptive. The first part of the study is a literature review, and the latter part was conducted as a case study. The data were collected in the case company through interviews and observation. The main outcomes are a process model for the IFRS standard change process and a checklist-style management tool for upcoming IFRS standard change projects. The process flow follows the main cornerstones of the IASB's standard-setting process, and the management tool is divided into stages accordingly.

Relevance: 20.00%

Publisher:

Abstract:

We have developed a software package called pp-Blast that uses the publicly available Blast package and PVM (parallel virtual machine) to partition a multi-sequence query across a set of nodes with replicated or shared databases. Benchmark tests show that pp-Blast running on a cluster of 14 PCs outperformed conventional Blast running on large servers. In addition, using pp-Blast and the cluster, we were able to map all human cDNAs onto the draft of the human genome in less than 6 days. We propose that the cost/benefit ratio of pp-Blast makes it appropriate for large-scale sequence analysis. The source code and configuration files for pp-Blast are available at http://www.ludwig.org.br/biocomp/tools/pp-blast.
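The core idea of partitioning a multi-sequence query is easy to sketch. The example below splits a FASTA query into chunks and searches them concurrently, but it uses local worker processes and the modern NCBI BLAST+ `blastn` command rather than PVM and the original Blast package described above; the file names, database name, and chunk count are assumptions for illustration.

```python
import subprocess
from pathlib import Path
from concurrent.futures import ProcessPoolExecutor

QUERY, DB, CHUNKS = Path("queries.fasta"), "human_genome_db", 14   # assumed names

def split_fasta(path, n):
    """Distribute the records of a multi-sequence FASTA file round-robin into n chunk files."""
    records, current = [], []
    for line in path.read_text().splitlines():
        if line.startswith(">") and current:
            records.append("\n".join(current))
            current = []
        current.append(line)
    if current:
        records.append("\n".join(current))
    chunks = []
    for i in range(n):
        chunk = path.with_suffix(f".part{i}.fasta")
        chunk.write_text("\n".join(records[i::n]) + "\n")
        chunks.append(chunk)
    return chunks

def run_blast(chunk):
    """Run one query chunk against the database with the BLAST+ command-line tool."""
    out = chunk.with_suffix(".tsv")
    subprocess.run(["blastn", "-query", str(chunk), "-db", DB,
                    "-out", str(out), "-outfmt", "6"], check=True)
    return out

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=CHUNKS) as pool:
        outputs = list(pool.map(run_blast, split_fasta(QUERY, CHUNKS)))
    Path("all_hits.tsv").write_text("".join(p.read_text() for p in outputs))
```

In pp-Blast itself, the chunks are dispatched over PVM to cluster nodes holding replicated or shared copies of the database rather than to local processes, but the partition-then-merge pattern is the same.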

Relevance: 20.00%

Publisher:

Abstract:

This thesis presents a novel design paradigm, called Virtual Runtime Application Partitions (VRAP), to judiciously utilize on-chip resources. As the dark silicon era approaches, in which power considerations will allow only a fraction of the chip to be powered on, judicious resource management will become a key consideration in future designs. Most work on resource management treats only the physical components (i.e., computation, communication, and memory blocks) as resources and manipulates the component-to-application mapping to optimize various parameters (e.g., energy efficiency). To further enhance the optimization potential, we propose to manipulate, in addition to the physical resources, abstract resources (i.e., the voltage/frequency operating point, the fault-tolerance strength, the degree of parallelism, and the configuration architecture). The proposed framework (i.e., VRAP) encapsulates methods, algorithms, and hardware blocks to provide each application with the abstract resources tailored to its needs. To test the efficacy of this concept, we have developed three distinct self-adaptive environments: (i) the Private Operating Environment (POE), (ii) the Private Reliability Environment (PRE), and (iii) the Private Configuration Environment (PCE), which collectively ensure that each application meets its deadlines using minimal platform resources. In this work, several novel architectural enhancements, algorithms, and policies are presented to realize the virtual runtime application partitions efficiently. Considering future design trends, we have chosen Coarse-Grained Reconfigurable Architectures (CGRAs) and Networks-on-Chip (NoCs) to test the feasibility of our approach; specifically, we have chosen the Dynamically Reconfigurable Resource Array (DRRA) and McNoC as the representative CGRA and NoC platforms. The proposed techniques are compared and evaluated using a variety of quantitative experiments. Synthesis and simulation results demonstrate that VRAP significantly enhances energy and power efficiency compared to the state of the art.
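As a toy illustration of one of the abstract resources named above, the sketch below chooses, for a single application, the lowest-power voltage/frequency operating point that still meets its deadline. The operating-point table, cycle count, and deadline are invented for illustration and are not taken from the DRRA or McNoC platforms.

```python
# Illustrative operating points: (voltage [V], frequency [MHz], power [mW]).
OPERATING_POINTS = [
    (0.8, 200, 40),
    (0.9, 400, 95),
    (1.0, 600, 180),
    (1.1, 800, 310),
]

def pick_operating_point(cycles, deadline_ms):
    """Return the lowest-power (V, f, P) point whose execution time meets the deadline."""
    feasible = [(p, v, f) for (v, f, p) in OPERATING_POINTS
                if cycles / (f * 1e3) <= deadline_ms]   # f MHz -> f*1e3 cycles per ms
    if not feasible:
        raise RuntimeError("no operating point meets the deadline")
    p, v, f = min(feasible)                             # cheapest feasible point
    return v, f, p

# Example: an application needing 3.0 Mcycles with a 10 ms deadline
print(pick_operating_point(cycles=3_000_000, deadline_ms=10))   # -> (0.9, 400, 95)
```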

Relevance: 20.00%

Publisher:

Abstract:

Environmental threats are growing and have become global issues. People around the world try to face these issues in two ways: remediating environments that are already affected and protecting those that are not yet affected. This thesis describes the design, implementation, and evaluation of an online water-quality monitoring system in Lake Saimaa, Finland. The water quality in Lake Saimaa needs to be monitored in order to provide the responsible bodies with valuable information that allows them to act quickly to prevent any negative impact on the lake's environment. The objectives were to design a suitable system, implement it in Lake Saimaa, and then evaluate the applicability and reliability of such systems for this environment. The needs for the system were first identified; then the design, the necessary modifications, and the construction of the system took place. The system was subsequently tested at two locations in Lake Saimaa near the city of Mikkeli, and finally the whole system was evaluated. The main results were that applying online water-quality monitoring systems in Lake Saimaa offers many advantages, such as reducing the required manpower, time, and running costs. However, the unreliability of the exact measured values of some parameters remains a drawback of such systems; it could be addressed by using more advanced equipment with more sophisticated features designed specifically for monitoring at the predefined location.

Relevance: 20.00%

Publisher:

Abstract:

The main objective of the study is to evaluate the impact of the Lean Innovation management philosophy on the creativity potential of a large multinational enterprise. The theory of Lean Innovation indicates that a modern company in any industry can successfully combine a waste-reducing approach with the promotion of innovative potential by cultivating, or at least preserving, creativity. The theoretical part of the work covers the main factors and the pros and cons of Lean thinking and innovation management separately, along with a general overview of new product development. While the modern international market is becoming more accessible to entrepreneurial initiatives, small enterprises, and start-ups, large international corporations are better positioned to adopt the Lean Innovation approach in both operations and product development because of their extended resources and capabilities; moreover, a multinational enterprise is a likely pioneer of Lean Innovation implementation. The empirical part of the thesis examines the case of a large European enterprise, operating in many markets around the globe, that is currently adjusting and implementing innovation management in its product development, having already optimized its operational processes through Lean thinking. The goal of the work is to understand what difficulties and consequences a large international firm faces when applying Lean Innovation to improve its own performance, and whether these lessons can be generalized.

Relevance: 20.00%

Publisher:

Abstract:

Feature extraction is the part of pattern recognition in which sensor data are transformed into a form more suitable for the machine to interpret. The purpose of this step is also to reduce the amount of information passed to the later stages of the system while preserving the information essential for discriminating the data into different classes. For instance, in image analysis the raw image intensities are vulnerable to various environmental effects, such as lighting changes, and feature extraction can be used to detect features that are invariant to certain types of illumination change. Finally, classification makes decisions based on the previously transformed data. The main focus of this thesis is on developing new methods for embedded feature extraction based on local non-parametric image descriptors; feature analysis is also carried out for the selected image features. Low-level Local Binary Pattern (LBP) based features play the main role in the analysis. In the embedded domain, a pattern recognition system must usually meet strict performance constraints, such as high speed, compact size, and low power consumption. The characteristics of the final system can be seen as a trade-off between these metrics, which is largely determined by the decisions made during the implementation phase. The implementation alternatives for LBP-based feature extraction are explored in the embedded domain in the context of focal-plane vision processors. In particular, the thesis demonstrates LBP extraction with the MIPA4k massively parallel focal-plane processor IC. Higher-level processing is also incorporated through a framework for implementing a single-chip face recognition system. Furthermore, a new method for determining optical flow based on LBPs, designed particularly for the embedded domain, is presented. Inspired by some of the principles observed in the feature analysis of Local Binary Patterns, an extension to the well-known non-parametric rank transform is proposed, and its performance is evaluated in face recognition experiments with a standard dataset. Finally, an a priori model in which the LBPs are seen as combinations of n-tuples is also presented.
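For reference, here is a plain software sketch of the basic 8-neighbour LBP operator that the thesis builds on (a generic CPU implementation, not the MIPA4k focal-plane version): each pixel is compared with its 3x3 neighbours and the comparison bits are packed into an 8-bit code, whose 256-bin histogram then serves as the feature vector. The test image is random and purely illustrative.

```python
import numpy as np

def lbp_3x3(img):
    """Basic 8-neighbour LBP: compare each pixel with its 3x3 neighbours and
    pack the comparison results into an 8-bit code per pixel."""
    img = np.asarray(img, dtype=np.int32)
    center = img[1:-1, 1:-1]
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),      # clockwise neighbour order,
               (1, 1), (1, 0), (1, -1), (0, -1)]        # each offset gets one bit
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:img.shape[0] - 1 + dy, 1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

# The feature vector is typically the 256-bin histogram of the codes.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64))
histogram, _ = np.histogram(lbp_3x3(image), bins=256, range=(0, 256))
print(histogram[:8])
```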

Relevance: 20.00%

Publisher:

Abstract:

The objective of this study was to increase understanding of the link between the identification of required HR competences and the alignment of competence management with business strategy in a global Finnish company employing over 8,000 people and about 100 HR professionals. This aim was approached by analyzing data collected in focus group interviews using a grounded theory method and, in parallel, reviewing the literature on strategic human resource management, competence-based strategic management, strategy, and foresight. The literature on competence management in different contexts lacks in-depth discussion of the foresight process, and individuals are often forgotten in strategic frameworks. However, corporate foresight helps in detecting emerging opportunities for innovation and in implementing strategy. The empirical findings indicate a lack of strategic leadership and of alignment between HR and the business. Accordingly, the most important HR competence areas identified were increasing business understanding and enabling change. As a result, the study provides a holistic model for competence foresight, which positions HR professionals as strategic change agents in the role of organizational futurists at the heart of the company: facilitating competence foresight and competence development at the individual as well as the organizational level, resulting in an agile organization with increased business understanding, sensitive sensors, and adaptive actions to enable change.

Relevance: 20.00%

Publisher:

Abstract:

Myocardial ischemia, as well as the induction agents used in anesthesia, may cause corrected QT interval (QTc) prolongation. The objective of this randomized, double-blind trial was to determine the effects of high- vs conventional-dose bolus rocuronium on QTc duration and the incidence of dysrhythmias following anesthesia induction and intubation. Fifty patients about to undergo coronary artery surgery were randomly allocated to receive conventional-dose (0.6 mg/kg, group C, n=25) or high-dose (1.2 mg/kg, group H, n=25) rocuronium after induction with etomidate and fentanyl. QTc, heart rate, and mean arterial pressure were recorded before induction (T0), after induction (T1), after rocuronium (just before laryngoscopy; T2), 2 min after intubation (T3), and 5 min after intubation (T4). The occurrence of dysrhythmias was recorded. In both groups, QTc was significantly longer at T3 than at baseline [475 vs 429 ms in group C (P=0.001), and 459 vs 434 ms in group H (P=0.005)]. The incidence of dysrhythmias in group C (28%) and in group H (24%) was similar. The QTc after high-dose rocuronium was not significantly longer than after conventional-dose rocuronium in patients about to undergo coronary artery surgery who were induced with etomidate and fentanyl. In both groups, compared with baseline, QTc was most prolonged at 2 min after intubation, suggesting that QTc prolongation may be due to the nociceptive stimulus of intubation.
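For context on the QTc values reported above: the corrected QT interval normalizes the measured QT for heart rate. The abstract does not state which correction formula was used; the sketch below applies Bazett's formula, the most common choice, purely as an illustration with invented example values.

```python
import math

def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett's correction: QTc = QT / sqrt(RR), with QT in ms and the RR interval
    in seconds (RR = 60 / heart rate)."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / math.sqrt(rr_s)

# Example: a measured QT of 400 ms at a heart rate of 75 bpm
print(round(qtc_bazett(400, 75)))   # -> 447 ms
```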

Relevance: 20.00%

Publisher:

Abstract:

Neoadjuvant chemotherapy has practical and theoretical advantages over adjuvant chemotherapy strategy in breast cancer (BC) management. Moreover, metronomic delivery has a more favorable toxicity profile. The present study examined the feasibility of neoadjuvant metronomic chemotherapy in two cohorts [HER2+ (TraQme) and HER2− (TAME)] of locally advanced BC. Twenty patients were prospectively enrolled (TraQme, n=9; TAME, n=11). Both cohorts received weekly paclitaxel at 100 mg/m2 during 8 weeks followed by weekly doxorubicin at 24 mg/m2 for 9 weeks in combination with oral cyclophosphamide at 100 mg/day (fixed dose). The HER2+ cohort received weekly trastuzumab. The study was interrupted because of safety issues. Thirty-six percent of patients in the TAME cohort and all patients from the TraQme cohort had stage III BC. Of note, 33% from the TraQme cohort and 66% from the TAME cohort displayed hormone receptor positivity in tumor tissue. The pathological complete response rates were 55% and 18% among patients enrolled in the TraQme and TAME cohorts, respectively. Patients in the TraQme cohort had more advanced BC stages at diagnosis, higher-grade pathological classification, and more tumors lacking hormone receptor expression, compared to the TAME cohort. The toxicity profile was also different. Two patients in the TraQme cohort developed pneumonitis, and in the TAME cohort we observed more hematological toxicity and hand-foot syndrome. The neoadjuvant metronomic chemotherapy regimen evaluated in this trial was highly effective in achieving a tumor response, especially in the HER2+ cohort. Pneumonitis was a serious, unexpected adverse event observed in this group. Further larger and randomized trials are warranted to evaluate the association between metronomic chemotherapy and trastuzumab treatment.