940 results for practical epistemology analysis


Relevance:

30.00%

Publisher:

Abstract:

The concept of open innovation has recently gained widespread attention and is particularly relevant now, as many firms endeavouring to implement open innovation face distinct sets of challenges in managing it. Prior research on open innovation has focused on the internal processes of implementation and on the organizational changes, already under way or still required, that companies must make in order to succeed in the global open innovation market. Despite the intensive research on open innovation, the question of what influences its adoption by companies in different contexts has received little attention. To fill this gap, this thesis contributes to the discussion on the factors influencing open innovation by bringing in the perspective of environmental impacts, i.e. gathering data on possible sources of external influence, classifying them, and testing their systemic impact through a conceptual system dynamics simulation model. The insights from data collection and conceptualization in modelling are used to answer the question of how the external environment affects the adoption of open innovation. The thesis research is presented through five research papers reflecting a method-triangulation study (conducted initially as a case study, later as a quantitative analysis, and finally as a system dynamics simulation). This multitude of methods was used to collect the possible external influence factors and to assess their impact (on a positive/negative scale rather than numerically). The results obtained throughout the thesis research bring valuable insights into the understanding of open innovation influencing factors within a firm's operating environment, point out the balance required in the system for successful open innovation performance, and reveal the existence of a tipping point in open innovation success when it is driven by market dynamics and structures. Practical implications for how firms and policy-makers can leverage the environment to their potential benefit are offered in the conclusions.
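The tipping-point behaviour mentioned above can be illustrated with a minimal stock-and-flow simulation of the kind used in system dynamics. The sketch below is purely illustrative: the adoption stock, the imitation and abandonment loops, and all coefficients are hypothetical stand-ins, not the model developed in the thesis.

```python
# Purely illustrative stock-and-flow sketch (hypothetical parameters, not the
# thesis's model): adoption of open innovation with a reinforcing imitation
# loop and a balancing abandonment loop. Above a critical level of external
# pressure, the reinforcing loop takes over -- a tipping point.

def simulate(external_pressure, years=50, dt=0.1):
    adopters = 0.05  # initial fraction of firms practising open innovation
    for _ in range(int(years / dt)):
        reinforcement = 0.9 * adopters**2 * (1.0 - adopters)  # imitation loop
        abandonment = 0.2 * adopters                          # balancing loop
        adopters += (external_pressure + reinforcement - abandonment) * dt
        adopters = min(1.0, max(0.0, adopters))
    return adopters

for pressure in (0.001, 0.005, 0.02):
    print(f"pressure {pressure:.3f} -> adoption after 50 years: {simulate(pressure):.2f}")
```

With these toy coefficients, low external pressure leaves adoption marginal, while pressure above the critical level pushes the stock past the threshold where the reinforcing loop becomes self-sustaining.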

Relevance:

30.00%

Publisher:

Abstract:

In any decision making under uncertainty, the goal is usually to minimize the expected cost, and this minimization is typically carried out by optimization. For simple models, the optimization can easily be done using deterministic methods. However, many practical models contain complex and varying parameters that cannot easily be taken into account using the usual deterministic optimization methods. It is therefore important to look for other methods that can give insight into such models. The Markov chain Monte Carlo (MCMC) method is one practical method that can be used for the optimization of stochastic models under uncertainty. It is based on simulation and provides a general methodology that can be applied to nonlinear and non-Gaussian state models. MCMC is important for practical applications because it is a unified estimation procedure that simultaneously estimates both parameters and state variables: it computes the distribution of the state variables and parameters given the data measurements. MCMC is also faster in terms of computing time when compared with other optimization methods. This thesis discusses the use of MCMC methods for the optimization of stochastic models under uncertainty. It begins with a short discussion of Bayesian inference, MCMC, and stochastic optimization methods. An example is then given of how MCMC can be applied to maximize production at minimum cost in a chemical reaction process. The method is observed to perform well in optimizing the given cost function, with very high certainty.
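To make the sampling step underlying such an approach concrete, the following is a minimal random-walk Metropolis sketch: draw samples from a posterior over an uncertain parameter, then pick the control that minimizes the Monte Carlo estimate of expected cost. The Gaussian posterior and quadratic cost are hypothetical stand-ins, not the chemical reaction model from the thesis.

```python
import math
import random

# Minimal random-walk Metropolis sketch (hypothetical target, not the
# thesis's reactor model): sample a parameter theta from an unnormalized
# posterior, then estimate the expected cost under parameter uncertainty.

def log_posterior(theta):
    # Hypothetical Gaussian posterior centred at 2.0 with std 0.5.
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

def cost(theta, control):
    # Hypothetical cost depending on the uncertain parameter and a control.
    return (control - theta) ** 2 + 0.1 * control

def metropolis(n_samples=20000, step=0.4):
    theta = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = theta + random.gauss(0.0, step)
        # Accept with probability min(1, posterior ratio); symmetric proposal.
        if math.log(random.random()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal
        samples.append(theta)
    return samples[5000:]  # discard burn-in

samples = metropolis()
# Choose the control minimizing the Monte Carlo estimate of expected cost.
controls = [i * 0.05 for i in range(80)]
best = min(controls, key=lambda c: sum(cost(t, c) for t in samples) / len(samples))
print(f"control minimizing expected cost: {best:.2f}")
```

Because the expected cost averages over the whole posterior rather than a single point estimate, the chosen control hedges against the parameter uncertainty, which is exactly what distinguishes this approach from deterministic optimization.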

Relevance:

30.00%

Publisher:

Abstract:

Over the past decade, organizations worldwide have begun to widely adopt agile software development practices, which offer greater flexibility in the face of frequently changing business requirements, better cost effectiveness through the minimization of waste, faster time-to-market, and closer collaboration between business and IT. At the same time, IT services are increasingly being outsourced to third parties, giving organizations the ability to focus on their core capabilities and to take advantage of better demand scalability, access to specialized skills, and cost benefits. An output-based pricing model, in which customers pay directly for the functionality delivered rather than the effort spent, is quickly becoming a new trend in IT outsourcing: it transfers risk away from the customer while offering much better incentives for the supplier to optimize processes and improve efficiency, producing a true win-win outcome. Despite the widespread adoption of both agile practices and output-based outsourcing, there is little formal research on how the two can be effectively combined in practice. Moreover, little practical guidance exists on how companies can measure the performance of agile projects delivered in an output-based outsourced environment. This research attempted to shed light on this issue by developing a practical project monitoring framework which may be readily applied by organizations to monitor the performance of agile projects in an output-based outsourcing context, thus taking advantage of the combined benefits of such an arrangement. Adapted from the action research approach, this research was divided into two cycles, each consisting of Identification, Analysis, Verification, and Conclusion phases. During Cycle 1, a list of six Key Performance Indicators (KPIs) was proposed and accepted by the professionals in the studied multinational organization; these formed the core of the proposed framework and answered the first research sub-question of what needs to be measured. In Cycle 2, a more in-depth analysis was provided for each of the suggested KPIs, including the techniques for capturing, calculating, and evaluating the information provided by each one. In the course of Cycle 2, the second research sub-question was answered, clarifying how the data for each KPI needed to be measured, interpreted, and acted upon. Consequently, after two incremental research cycles, the primary research question was answered, describing the practical framework that may be used for monitoring the performance of agile IT projects delivered in an output-based outsourcing context. This framework was evaluated by professionals within the context of the studied organization and received positive feedback across all four evaluation criteria set forth in this research: the low overhead of data collection, the high value of the provided information, the ease of understanding of the metric dashboard, and the high generalizability of the proposed framework.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: To analyse the most critical areas in Obstetrics and to suggest measures to reduce or avoid the situations most often involved in medico-legal disputes. METHODS: Obstetrics cases submitted to the Medico-legal Council from the creation of the National Institute of Legal Medicine and Forensic Sciences in 2001 until 2011 were evaluated. A comprehensive characterization was carried out, including the determination of absolute/relative frequencies, testing for a linear trend over the years, and the association between parameters. RESULTS: The analysis showed no significant overall linear trend. The most common reasons for disputes were perinatal asphyxia (50%), traumatic injuries of the newborn (24%), maternal sequelae (19%), and issues related to prenatal diagnosis and/or obstetric ultrasound (5.4%). Perinatal asphyxia showed no significant linear trend (p=0.58) and was usually related to perinatal death or permanent neurologic sequelae in newborn children. Traumatic injury of the newborn, mostly related to instrumented delivery, shoulder dystocia, or vaginal delivery in breech presentation, showed a significantly increasing linear trend (p<0.001), especially for instrumented deliveries. Delay or absence of cesarean section was the clinical procedure questioned in a significantly higher number of cases of perinatal asphyxia (68.7%) and of traumatic lesions of the newborn due to instrumented delivery (20.5%). CONCLUSION: It is important to improve and correct theoretical and practical daily clinical performance in these highlighted areas, in order to reduce or even avoid situations that could end in medico-legal litigation.
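A linear-trend test of the kind reported above can be run, for example, as a simple regression of annual case counts on year. The sketch below uses invented counts purely for illustration, not the study's data, and a plain least-squares trend test rather than whatever specific test the authors applied.

```python
from scipy.stats import linregress

# Hypothetical annual case counts (NOT the study's data): test whether the
# number of obstetric dispute cases shows a linear trend over 2001-2011.
years = list(range(2001, 2012))
cases = [4, 6, 5, 7, 6, 8, 7, 9, 8, 10, 9]

result = linregress(years, cases)
print(f"slope = {result.slope:.2f} cases/year, p = {result.pvalue:.3f}")
# A p-value below 0.05 would indicate a significant linear trend.
```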

Relevance:

30.00%

Publisher:

Abstract:

The ocelot (Leopardus pardalis) is included in the list of wild felid species protected by CITES and is part of conservation strategies that necessarily involve assisted reproduction techniques, which require practical, minimally invasive, and highly reproducible methods for studying the animals' reproductive physiology. The objective of this study was to compare and validate two commercial assays, the ImmuChem Double Antibody Corticosterone 125I RIA (ICN Biomedicals, Costa Mesa, CA, USA) and the Coat-a-Count Cortisol 125I RIA (DPC, Los Angeles, CA, USA), for the assessment of fecal glucocorticoid metabolites in ocelots submitted to an ACTH (adrenocorticotropic hormone) challenge. Fecal samples were collected from five ocelots kept at the Brazilian Center of Neotropical Felines, Associação Mata Ciliar, São Paulo, Brazil; one of the animals was chosen as a negative control. The experiment was conducted over a period of 9 days. On day 0, a total dose of 100 IU ACTH was administered intramuscularly. Immediately after collection, the samples were stored at −20 °C in labeled plastic bags. The hormone metabolites were subsequently extracted and assayed using the two commercial kits. A preliminary trial with the DPC kit was performed to determine the best extraction method for the hormone metabolites. Data were analyzed with the SAS program for Windows V8 and are reported as means ± SEM. The Schwarzenberger extraction method performed slightly better than the Wasser extraction method (103,334.56 ± 19,010.37 ng/g versus 59,223.61 ± 12,725.36 ng/g of wet feces, respectively; P=0.0657). The ICN kit detected the increase in glucocorticoid metabolite concentrations more reliably. Metabolite concentrations (ng/g wet feces) on day 0 and day 1 were 66,956.28 ± 36,786.93 and 92,991.19 ± 28,555.63 for the DPC kit, and 205,483.32 ± 83,811.32 and 814,578.75 ± 292,150.47 for the ICN kit, respectively. The limit of detection was 7.7 ng/mL at 100% B/Bo for the ICN kit (25 ng/mL at 88% B/Bo) and 0.2 µg/dL at 90.95% B/Bo for the DPC kit (1 µg/dL at 81.27% B/Bo). In conclusion, the Schwarzenberger extraction method and the ICN kit were confirmed to be superior for extracting and measuring fecal glucocorticoid metabolites in ocelot fecal samples.

Relevance:

30.00%

Publisher:

Abstract:

The theoretical part of the study concentrated on finding frameworks for optimizing the number of stock keeping units (SKUs) needed in the manufacturing industry. The goal was to find ways for a company to acquire an optimal collection of the stock keeping units needed for manufacturing the required amount of end products. The research follows the constructive research approach, leaning towards practical problem solving. In the empirical part of the study, a recipe search tool was developed for an existing database used in the target company. The purpose of the tool was to find all the recipes meeting the EUPS performance standard and to rank them using the data available in the database. The ranking of the recipes was formed from a combination of the performance measures and the price of each recipe. In addition, the tool identified what kind of paper SKUs were needed to manufacture the best performing recipes. The tool developed during this process meets the requirements: it makes searching for all the recipes meeting the EUPS standard much easier and faster. Furthermore, many possibilities for future development of the tool were discovered while writing the thesis.
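The filter-and-rank step such a tool performs could look something like the sketch below. The recipe fields, the EUPS compliance flag, and the scoring weights are all hypothetical, since the thesis does not spell out the database schema or the exact scoring formula.

```python
# Hypothetical sketch of a filter-and-rank step over recipe records. Field
# names, the compliance flag, and the weights are illustrative assumptions.

recipes = [
    {"name": "R-101", "meets_eups": True,  "performance": 0.92, "price": 410.0},
    {"name": "R-102", "meets_eups": False, "performance": 0.95, "price": 380.0},
    {"name": "R-103", "meets_eups": True,  "performance": 0.88, "price": 350.0},
]

def score(recipe, w_perf=0.7, w_price=0.3, max_price=500.0):
    # Higher performance and lower price both improve the ranking.
    return w_perf * recipe["performance"] + w_price * (1.0 - recipe["price"] / max_price)

compliant = [r for r in recipes if r["meets_eups"]]  # keep EUPS-compliant only
for r in sorted(compliant, key=score, reverse=True):
    print(f'{r["name"]}: score {score(r):.3f}')
```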

Relevance:

30.00%

Publisher:

Abstract:

The goal of this thesis is to study the knowledge retention mechanisms used in the case company when individual experts leave, to analyze the reasons behind the choice of mechanisms, and to examine how the success of the knowledge retention process depends on that choice. The theoretical part discusses the origins of knowledge retention processes in the literature, the existing knowledge retention mechanisms, and practical issues in their implementation. The empirical part of the study is designed as employee interviews followed by a discussion of the findings. The empirical findings indicate the following reasons for the choice of knowledge retention mechanisms: the type of knowledge to be retained, the specialty of the departing expert, and the time and distance constraints of the particular case. The following factors influenced the success of a retention process: the choice of knowledge retention mechanisms, the use of a combination of mechanisms, and the creation of knowledge retention plans. The results may be useful for those interested in the factors influencing knowledge retention processes when experts depart.

Relevance:

30.00%

Publisher:

Abstract:

This article discusses three possible ways to derive time domain boundary integral representations for elastodynamics. The discussion points out difficulties that may be encountered when using these formulations in practical applications, offers recommendations for selecting the most convenient integral representation for elastodynamic problems, and opens the possibility of deriving simplified schemes. The proper way to take into account initial conditions applied to the body is another topic of interest presented here. The article illustrates the main differences between the discussed boundary integral representation expressions, their singularities, and possible numerical problems. The correct way to use collocation points outside the analyzed domain is carefully described. Some applications are shown at the end of the paper in order to demonstrate the capabilities of the technique when properly used.
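For orientation, a typical time-domain displacement boundary integral representation for elastodynamics has the schematic form below, written with Riemann (time) convolutions and omitting the body-force and initial-condition domain terms, which are precisely the terms whose treatment the article discusses. This is the generic textbook form, not a reproduction of the article's derivations.

```latex
% Schematic time-domain boundary integral representation for elastodynamics.
% u_k: displacement, t_k: traction; u*_{ik}, t*_{ik}: fundamental solutions;
% * denotes Riemann (time) convolution; c_{ik} is the free (jump) term.
c_{ik}(\xi)\, u_k(\xi, t)
  = \int_{\Gamma} u^{*}_{ik}(x, t; \xi) * t_k(x, t)\, d\Gamma(x)
  - \int_{\Gamma} t^{*}_{ik}(x, t; \xi) * u_k(x, t)\, d\Gamma(x)
```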

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a one-dimensional, semi-empirical dynamic model for the simulation and analysis of a calcium looping process for post-combustion CO2 capture. Reducing greenhouse gas emissions from fossil fuel power production requires rapid action, including the development of efficient carbon capture and sequestration technologies. The development of new carbon capture technologies can be expedited by using modelling tools: techno-economic evaluation of new capture processes can be done quickly and cost-effectively with computational models before expensive pilot plants are built. Post-combustion calcium looping is a developing carbon capture process which utilizes fluidized bed technology with lime as the sorbent. The main objective of this work was to analyse the technological feasibility of the calcium looping process at different scales with a computational model. A one-dimensional dynamic model was applied to the calcium looping process, simulating the behaviour of the interconnected circulating fluidized bed reactors. The model couples fundamental mass and energy balance solvers with semi-empirical models describing solid behaviour in a circulating fluidized bed and the chemical reactions occurring in the calcium loop. In addition, fluidized bed combustion, heat transfer, and core-wall layer effects were modelled. The calcium looping model framework was successfully applied to a 30 kWth laboratory scale unit and a 1.7 MWth pilot scale unit, and was used to design a conceptual 250 MWth industrial scale unit. Valuable information was gathered on the behaviour of the small laboratory device. In addition, the interconnected behaviour of the pilot plant reactors and the effect of solid fluidization on the thermal and carbon dioxide balances of the system were analysed. The scale-up study provided practical information on the thermal design of an industrial sized unit, the selection of particle size, and operability in different load scenarios.
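The sorbent chemistry at the heart of the process is the reversible lime carbonation reaction; the carbonator captures CO2 exothermically and the calciner regenerates the sorbent endothermically. The reaction enthalpy quoted is the commonly cited textbook value at reference conditions, not a figure from the thesis.

```latex
% Carbonator (CO2 capture, exothermic) and calciner (sorbent regeneration,
% endothermic) operate on the reversible reaction:
\mathrm{CaO(s)} + \mathrm{CO_2(g)} \;\rightleftharpoons\; \mathrm{CaCO_3(s)},
\qquad \Delta H^{\circ}_{298} \approx -178\ \mathrm{kJ/mol}
```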

Relevance:

30.00%

Publisher:

Abstract:

Communications play a key role in modern smart grids: the new functionalities that make the grids 'smart' require the communication network to function properly. Data transmission between the intelligent electronic devices (IEDs) in the rectifier and the customer-end inverters (CEIs) used for power conversion is also required in the smart grid concept of the low-voltage direct current (LVDC) distribution network. Smart grid applications, such as smart metering, demand side management (DSM), and grid protection applied with communications, are all installed in the LVDC system. Thus, besides a remote connection to the databases of the grid operators, a local communication network in the LVDC network is needed. One solution for implementing the communication medium in power distribution grids is power line communication (PLC): power cables are already present in distribution grids, and hence they may be applied as a communication channel for distribution-level data. This doctoral thesis proposes an IP-based high-frequency (HF) band PLC data transmission concept for the LVDC network. A general method to implement the Ethernet-based PLC concept between the public distribution rectifier and the customer-end inverters in the LVDC grid is introduced. Low-voltage cables are studied as the communication channel in the frequency band of 100 kHz–30 MHz. The communication channel characteristics and the noise in the channel are described. All individual components in the channel are presented in detail, and a channel model, comprising models for each channel component, is developed and verified by measurements. The channel noise is also studied by measurements. Theoretical signal-to-noise ratio (SNR) and channel capacity analyses and practical data transmission tests are carried out to evaluate the applicability of the PLC concept against the requirements set by the smart grid applications in the LVDC system. The main results concerning the applicability of the PLC concept and its limitations are presented, and suggestions for future research are proposed.
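The channel capacity analysis mentioned above rests on the standard Shannon result. For a frequency-selective channel such as a power-line link, the capacity is commonly evaluated with the general formula below (a textbook expression, not the thesis's specific figures).

```latex
% Shannon capacity of a frequency-selective channel over band B:
% H(f): channel transfer function, S(f): transmit PSD, N(f): noise PSD.
C = \int_{B} \log_2\!\left( 1 + \frac{|H(f)|^{2}\, S(f)}{N(f)} \right) df
```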

Relevance:

30.00%

Publisher:

Abstract:

Workshop at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

30.00%

Publisher:

Abstract:

Interest in working capital management has increased among practitioners and researchers because the financial crisis of 2008 caused a deterioration of the general financial situation, and the importance of managing working capital effectively grew dramatically during the crisis. On the one hand, companies highlighted the importance of working capital management as part of short-term financial management for overcoming funding difficulties. On the other hand, academia highlighted the need to analyze working capital management from a wider perspective, namely that of the value chain; previously, academic articles mostly discussed working capital management from a company-centered perspective. The objective of this thesis was to put working capital management in this wider, more academic perspective and to present case studies of industry value chains, which are instrumental to the theoretical contributions, with the practical contributions complementing the theoretical contributions and conclusions. The principal assumption of this thesis is that self-financing of value chains can be established through effective working capital management. The thesis therefore introduces the financial value chain analysis method, which is employed in the empirical studies. The effectiveness of the value chains' working capital management is studied through the cycle time of working capital. The financial value chain analysis method employed in this study is designed for examining value chain level phenomena: it provides a holistic picture of the value chain through financial figures and extends value chain analysis to the industry level. Working capital management is studied by means of the cash conversion cycle, which measures the length of time (in days) that a company has funds tied up in working capital, starting from the payment of purchases to the supplier and ending when remittance of sales is received from the customers. The working capital management practices of the automotive, pulp and paper, and information and communication technology (ICT) industries were studied in this research project. Additionally, the Finnish pharmaceutical industry was studied to obtain a deeper understanding of working capital management in the value chain. The results indicate that the cycle time of working capital is constant in the value chain context over time. The cash conversion cycles of the automotive, pulp and paper, and ICT industries are on average 70, 60, and 40 days, respectively; the difference is mainly a consequence of different inventory cycle times. The financial crisis of 2008 affected the working capital management of the industries similarly: both the cycle time of accounts receivable and that of accounts payable increased between 2008 and 2009. The results suggest that the companies of the automotive, pulp and paper, and ICT value chains were not able to self-finance, nor do the results indicate any improvement in the value chains' position with regard to working capital management. The findings suggest that companies operating in the Finnish pharmaceutical industry are interested in developing their own working capital management, but collaboration with value chain partners is not considered interesting. Competition no longer occurs between individual companies but between value chains; therefore, the financial value chain analysis method introduced in this thesis has the potential to support value chains in improving their competitiveness.
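The measure used throughout is the standard cash conversion cycle, which in its usual component form reads as follows (the standard accounting definition, not notation specific to the thesis).

```latex
% Cash conversion cycle: days inventory outstanding plus days sales
% outstanding minus days payables outstanding.
\mathrm{CCC} = \mathrm{DIO} + \mathrm{DSO} - \mathrm{DPO}
```

As a purely hypothetical split, a chain with DIO = 55, DSO = 45, and DPO = 30 days would have a CCC of 70 days, matching only the total reported above for the automotive chain: about ten weeks pass, on average, between paying suppliers and collecting from customers.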

Relevance:

30.00%

Publisher:

Abstract:

Effective control and limiting of carbon dioxide (CO₂) emissions in energy production are major challenges for science today. Current research activities include the development of new low-cost carbon capture technologies; among the proposed concepts, chemical looping combustion (CLC) and chemical looping with oxygen uncoupling (CLOU) have attracted significant attention because they allow the intrinsic separation of pure CO₂ from a hydrocarbon fuel combustion process with a comparatively small energy penalty. Both CLC and CLOU utilize well-established fluidized bed technology, but several technical challenges need to be overcome in order to commercialize the processes. The development of proper modelling and simulation tools is therefore essential for the design, optimization, and scale-up of chemical looping-based combustion systems. The main objective of this work was to analyze the technological feasibility of CLC and CLOU processes at different scales using a computational modelling approach. A one-dimensional fluidized bed model frame was constructed and applied to simulations of CLC and CLOU systems consisting of interconnected fluidized bed reactors. The model is based on the conservation of mass and energy, and semi-empirical correlations are used to describe the hydrodynamics, chemical reactions, and heat transfer in the reactors. Another objective was to evaluate the viability of chemical looping-based energy production, and a flow sheet model representing a CLC-integrated steam power plant was developed. The 1D model frame was successfully validated against the operation of a 150 kWth laboratory-sized CLC unit fed by methane. By following certain scale-up criteria, a conceptual design for a CLC reactor system at a pre-commercial scale of 100 MWth was created, after which the validated model was used to predict the performance of the system. As a result, further understanding was acquired of the parameters affecting the operation of a large-scale CLC process, which will be useful for practical design work in the future. The integration of the reactor system with a steam turbine cycle for power production was studied, resulting in a suggested plant layout comprising a CLC boiler system, a simple heat recovery setup, and an integrated steam cycle with a three pressure level steam turbine. Possible operational regions of a CLOU reactor system fed by bituminous coal were determined via mass, energy, and exergy balance analysis. Finally, the 1D fluidized bed model was modified to suit CLOU, and the performance of a hypothetical 500 MWth CLOU fuel reactor was evaluated through extensive case simulations.
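The chemistry underlying the two concepts can be sketched as follows. Methane is used as the fuel in the CLC scheme, matching the validation unit described above, while the copper-based carrier shown for CLOU is the standard textbook example rather than a carrier specified in the thesis.

```latex
% CLC with a generic metal-oxide carrier MeO: the fuel reactor reduces the
% carrier while producing a CO2/H2O stream; the air reactor re-oxidizes it.
% CLOU: the carrier releases gas-phase O2 (copper as the textbook example).
\mathrm{CH_4} + 4\,\mathrm{MeO} \rightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O} + 4\,\mathrm{Me},
\qquad
\mathrm{Me} + \tfrac{1}{2}\,\mathrm{O_2} \rightarrow \mathrm{MeO},
\qquad
4\,\mathrm{CuO} \rightleftharpoons 2\,\mathrm{Cu_2O} + \mathrm{O_2}
```

Because water condenses out of the fuel reactor exhaust, the remaining stream is nearly pure CO₂, which is what makes the separation intrinsic to the process.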

Relevance:

30.00%

Publisher:

Abstract:

The absolute nodal coordinate formulation was originally developed for the analysis of structures undergoing large rotations and deformations. This dissertation proposes several enhancements to finite beam and plate elements based on the absolute nodal coordinate formulation. The main scientific contribution of this thesis lies in the development of elements based on the absolute nodal coordinate formulation that do not suffer from the commonly known numerical locking phenomena. These elements can be used in the future in a number of practical applications, for example, the analysis of biomechanical soft tissues. The study presents several higher-order Euler–Bernoulli beam elements, a simple method to alleviate Poisson's and transverse shear locking in gradient deficient plate elements, and a nearly locking-free gradient deficient plate element. The gradient deficient plate elements based on the absolute nodal coordinate formulation developed in this dissertation exhibit most of the common numerical locking phenomena encountered in a continuum mechanics based formulation of the elastic energy. Thus, with these fairly straightforwardly formulated elements, which comprise only position and transverse direction gradient degrees of freedom, the pathologies of and remedies for the numerical locking phenomena can be presented in a clear and understandable manner. The analysis of the Euler–Bernoulli beam elements developed in this study shows that choosing higher gradient degrees of freedom as nodal degrees of freedom leads to a smoother strain field, which improves the rate of convergence.
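In the absolute nodal coordinate formulation, an element's global position field is interpolated directly from absolute nodal positions and position-gradient vectors, which is why large rotations need no separate rotational coordinates. For a two-node beam element the generic textbook form is shown below; this is the standard ANCF interpolation, not one of the new elements proposed in the dissertation.

```latex
% ANCF interpolation: r is the global position of a material point, S the
% shape function matrix, and e the vector of nodal coordinates consisting
% of absolute positions and position-gradient vectors at the two nodes.
\mathbf{r}(x, t) = \mathbf{S}(x)\, \mathbf{e}(t),
\qquad
\mathbf{e} = \begin{bmatrix}
  \mathbf{r}_1^{\mathsf{T}} &
  \left(\tfrac{\partial \mathbf{r}_1}{\partial x}\right)^{\mathsf{T}} &
  \mathbf{r}_2^{\mathsf{T}} &
  \left(\tfrac{\partial \mathbf{r}_2}{\partial x}\right)^{\mathsf{T}}
\end{bmatrix}^{\mathsf{T}}
```

A "gradient deficient" element, in this vocabulary, is one that carries only a subset of the full gradient triad as nodal degrees of freedom, for example the transverse direction gradient alone.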

Relevance:

30.00%

Publisher:

Abstract:

Almost every problem of design, planning, and management in technical and organizational systems involves several conflicting goals or interests, and multicriteria decision models now represent a rapidly developing area of operations research. When solving practical optimization problems, it is necessary to take into account various kinds of uncertainty arising from lack of data, the inadequacy of mathematical models with respect to real processes, calculation errors, and so on. In practice, this uncertainty often leads to undesirable outcomes in which the solutions are very sensitive to any changes in the input parameters; investment management is one example. Stability analysis of multicriteria discrete optimization problems investigates how the solutions found behave in response to changes in the initial data (the input parameters). This thesis is devoted to stability analysis in the problem of selecting investment project portfolios, which are optimized by considering different types of risk and the efficiency of the investment projects. The stability analysis is carried out with two approaches: qualitative and quantitative. The qualitative approach describes the behaviour of solutions under small perturbations of the initial data: stability is defined in terms of the existence of a neighbourhood in the initial data space within which every perturbed problem remains stable with respect to the set of efficient solutions of the initial problem. The quantitative approach studies numerical measures such as the stability radius, which gives the limits of the perturbations of the input parameters that do not lead to changes in the set of efficient solutions. In the present thesis several results were obtained, including attainable bounds for the stability radii of Pareto-optimal and lexicographically optimal portfolios of the investment problem under Savage's criterion, Wald's criterion, and the criterion of extreme optimism. In addition, special classes of the problem for which the stability radii are expressed by explicit formulae were identified. The investigations were carried out using different combinations of the Chebyshev, Manhattan, and Hölder metrics, which made it possible to measure perturbations of the input parameters in different ways.
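For reference, the decision criteria named above and one common definition of the stability radius take the following forms. Here \(f(x, s)\) is the cost of portfolio \(x\) under scenario \(s\), and \(X^{*}(A)\) is the set of efficient portfolios for input data \(A\); this is generic notation, not the thesis's exact formalism.

```latex
% Wald's maximin criterion (pessimism) and Savage's minimax regret:
x_W \in \arg\min_{x} \max_{s} f(x, s),
\qquad
x_S \in \arg\min_{x} \max_{s} \Bigl( f(x, s) - \min_{y} f(y, s) \Bigr)
% One common definition of the stability radius: the largest perturbation
% level (perturbations B measured in a chosen metric) under which no new
% efficient solutions appear.
\rho(A) = \sup \bigl\{ \varepsilon > 0 : X^{*}(A + B) \subseteq X^{*}(A)
          \ \text{for all } \|B\| < \varepsilon \bigr\}
```

The choice of norm for \(B\) (Chebyshev, Manhattan, or a general Hölder metric) is exactly what the combinations studied in the thesis vary.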