956 results for Process Development
Abstract:
Our research asked the following main questions: how do the characteristics of professional service firms allow them to innovate successfully, combining exploitation with exploration through internal and external factors of innovation, and how do these ambidextrous organisations perceive these factors; and how do successful innovators in professional service firms use corporate entrepreneurship models in their new service development processes? To shed light on innovation in professional knowledge-intensive business service firms (PKIBS), we conducted a qualitative analysis of ten globally acting law firms providing business legal services. We analyse the internal and external factors of innovation that are critical for PKIBS' innovation, and we suggest how these firms become ambidextrous in a changing environment. Our findings show that such firms exhibit a particular type of ambidexterity owing to their specific characteristics. As PKIBS are highly dependent on their human capital, their governance structure, and the high expectations of their clients, their ambidexterity is structural and contextual at the same time. In addition, we suggest three types of corporate entrepreneurship model that international PKIBS use to enhance innovation in turbulent environments. We looked at how law firms going through turbulent environments used corporate entrepreneurship activities as part of their strategies to become more innovative. Using a visual mapping methodology, we identified three types of innovation pattern in the law firms. We suggest that corporate entrepreneurship models depend on the successful application of three main elements: who participates in corporate entrepreneurship initiatives; which formal processes enhance these initiatives; and which policies are applied to this type of behaviour.
Abstract:
Based on the candidature of a region in the Swiss Alps as a World Natural Heritage Site (WHS), this article outlines the negotiation process as reflected in the local media. Discussions of the World Heritage issue over a time span of 4 years revealed how the region concerned was discursively constructed and that discursive constructions implied specific views of nature. By elaborating on these conflicting views of nature, we intend to reflect on the implicit meanings that influenced and structured the debate about the WHS and more generally the issues of sustainable regional development. The results show a broadening of the debate from a rather fragmented toward a more inclusive view of nature, which relates to basic assumptions of the global discourse on sustainable development. Additionally, a view of nature as inherited from past generations extended the WHS discussion and thus gave a new dimension to the concept of sustainability.
Abstract:
Model-based calibration has gained popularity in recent years as a method to optimize increasingly complex engine systems. However, virtually all model-based techniques are applied to steady-state calibration; transient calibration is by and large an emerging technology. An important piece of any transient calibration process is the ability to constrain the optimizer to treat the problem as a dynamic one and not as a quasi-static process. The optimized air-handling parameters corresponding to any instant of time must be achievable in a transient sense; this in turn depends on the trajectory of the same parameters over previous time instants. In this work, dynamic constraint models are proposed to translate commanded air-handling parameters into those actually achieved. These models enable the optimization to be realistic in a transient sense. The air-handling system has been treated as a linear second-order system with PD control, whose parameters have been extracted from real transient data. This model has been shown to be the best choice relative to a list of appropriate candidates such as neural networks and first-order models. The selected second-order model was used in conjunction with transient emission models to predict emissions over the FTP cycle. It has been shown that emission predictions based on air-handling parameters predicted by the dynamic constraint model do not differ significantly from corresponding emissions based on measured air-handling parameters.
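The commanded-to-achieved translation described here, a linear second-order system, can be sketched generically. The natural frequency `wn` and damping ratio `zeta` below are illustrative placeholders, not the values identified from the engine data:

```python
import numpy as np

def second_order_response(commanded, dt, wn=2.0, zeta=0.7):
    """Translate a commanded trajectory into an 'achieved' one by
    filtering it through the second-order linear system
    x'' + 2*zeta*wn*x' + wn**2 * x = wn**2 * u.
    In a real calibration workflow wn and zeta would be identified
    from measured transient data; the defaults are placeholders."""
    x = float(commanded[0])  # assume the system starts at the initial command
    v = 0.0                  # initial rate of change
    achieved = np.empty(len(commanded))
    for i, u in enumerate(commanded):
        a = wn**2 * (u - x) - 2.0 * zeta * wn * v  # second-order dynamics
        v += a * dt                                # explicit Euler integration
        x += v * dt
        achieved[i] = x
    return achieved

# A step in a commanded air-handling parameter is achieved only gradually:
t = np.arange(0.0, 5.0, 0.01)
cmd = np.where(t < 1.0, 1.0, 1.5)  # e.g. commanded boost pressure, step at t = 1 s
ach = second_order_response(cmd, dt=0.01)
```

An optimizer constrained by such a model cannot demand instantaneous parameter changes, which is exactly what keeps the search from treating the cycle as quasi-static.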
Abstract:
This is the first part of a study investigating a model-based transient calibration process for diesel engines. The motivation is to populate the hundreds of calibratable parameters in a methodical and optimal manner by using model-based optimization in conjunction with the manual process, so that, relative to the manual process used by itself, a significant improvement in transient emissions and fuel consumption and a sizable reduction in calibration time and test-cell requirements are achieved. Empirical transient modelling and optimization are addressed in the second part of this work, while the data required for model training and generalization are the focus of the current work. Transient and steady-state data from a turbocharged multi-cylinder diesel engine have been examined from a model-training perspective. A single-cylinder engine with external air-handling has been used to expand the steady-state data to encompass the transient parameter space. Based on comparative model performance and on differences in the non-parametric space, primarily driven by a high engine difference between exhaust and intake manifold pressures (ΔP) during transients, it is recommended that transient emission models be trained with transient training data. It has been shown that electronic control module (ECM) estimates of transient charge flow and the exhaust gas recirculation (EGR) fraction cannot be accurate at the high engine ΔP frequently encountered during transient operation, and that such estimates do not account for cylinder-to-cylinder variation. The effects of high engine ΔP must therefore be incorporated empirically by using transient data generated from a spectrum of transient calibrations. Specific recommendations are made on how to choose such calibrations, how much data to acquire, and how to specify transient segments for data acquisition. Methods to process transient data to account for transport delays and sensor lags have been developed.
The processed data have then been visualized using statistical means to understand transient emission formation. Two modes of transient opacity formation have been observed and described: the first is driven by high engine ΔP and low fresh-air flow rates, the second by high engine ΔP and high EGR flow rates. The EGR fraction is inaccurately estimated in both modes, and uneven EGR distribution has been shown to be present but unaccounted for by the ECM. The two modes and the associated phenomena are essential to understanding why transient emission models are calibration-dependent and, furthermore, how to choose training data that will result in good model generalization.
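The abstract does not specify how transport delays and sensor lags were handled; a common generic approach is to advance the measured signal by the identified delay and invert a first-order sensor lag, sketched below. The delay and time constant are assumed to be identified per sensor and are not taken from the study:

```python
import numpy as np

def align_and_compensate(measured, dt, delay_s, tau_s):
    """Undo a pure transport delay and a first-order sensor lag.
    The sensor is modelled as the true signal delayed by delay_s and
    low-pass filtered with time constant tau_s; inverting the lag uses
    y_true ~= y_meas + tau * dy_meas/dt."""
    n = int(round(delay_s / dt))
    shifted = np.roll(measured, -n)      # advance the signal to remove the delay
    if n > 0:
        shifted[-n:] = shifted[-n - 1]   # hold the last valid sample at the end
    dydt = np.gradient(shifted, dt)      # central-difference derivative
    return shifted + tau_s * dydt        # first-order lag inversion
```

Differentiating a noisy emissions signal amplifies noise, so in practice a smoothing step would precede the lag inversion; it is omitted here for brevity.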
Abstract:
This is the second part of a study investigating a model-based transient calibration process for diesel engines. The first part addressed the data requirements and the data processing required for empirical transient emission and torque models; the current work focuses on modelling and optimization. The unexpected result of this investigation is that, when trained on transient data, simple regression models perform better than more powerful methods such as neural networks or localized regression. This result has been attributed to extrapolation over data that have estimated rather than measured transient air-handling parameters. The challenges of detecting and preventing extrapolation using statistical methods that work well with steady-state data are explained. The concept of constraining the distribution of statistical leverage relative to the distribution of the starting solution, to prevent extrapolation during the optimization process, has been proposed and demonstrated. Separate from the issue of extrapolation is that of preventing the search from being quasi-static. Second-order linear dynamic constraint models have been proposed to prevent the search from returning solutions that would be feasible if each point were run at steady state but are unrealistic in a transient sense. Dynamic constraint models translate commanded parameters into the actually achieved parameters that then feed into the transient emission and torque models. Combined model inaccuracies have been used to adjust the optimized solutions. To frame the optimization problem within reasonable dimensionality, the coefficients of commanded surfaces that approximate engine tables are adjusted during search iterations, each of which involves simulating the entire transient cycle. The resulting strategy differs from the corresponding manual calibration strategy, yields lower emissions and fuel consumption, and is intended to improve rather than replace the manual calibration process.
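The leverage-based extrapolation check mentioned above can be illustrated generically: the statistical leverage of a candidate point with respect to the training design matrix grows rapidly outside the training data, so candidates whose leverage greatly exceeds the mean training leverage p/n are extrapolating. The threshold factor below is a hypothetical choice, not the study's criterion:

```python
import numpy as np

def leverage(X_train, X_query):
    """Statistical leverage of query points with respect to the training
    design matrix: h = diag(Xq @ inv(Xt'Xt) @ Xq')."""
    XtX_inv = np.linalg.inv(X_train.T @ X_train)
    return np.einsum('ij,jk,ik->i', X_query, XtX_inv, X_query)

def flags_extrapolation(X_train, X_query, factor=3.0):
    """Flag candidate points whose leverage exceeds a multiple of the
    mean training leverage p/n (factor is a tunable threshold)."""
    n, p = X_train.shape
    return leverage(X_train, X_query) > factor * p / n

# Training design: intercept plus two regressors sampled on [0, 1].
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0.0, 1.0, size=(50, 2))])
flags = flags_extrapolation(X, np.array([[1.0, 0.5, 0.5],   # interior point
                                         [1.0, 5.0, 5.0]])) # far outside
```

A search step that produces flagged points would be rejected or shrunk back toward the starting solution, which is the spirit of constraining the leverage distribution during optimization.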
Abstract:
The development of novel implants in orthopaedic trauma surgery is based on limited datasets from cadaver trials or artificial bone models. A method has been developed whereby implants can be constructed in an evidence-based manner, founded on a large anatomical database consisting of more than 2,000 datasets of bones extracted from CT scans. The aim of this study was the development and clinical application of an anatomically pre-contoured plate, based on the anatomical database, for the treatment of distal fibular fractures. Forty-eight Caucasian and Asian bone models (left and right) from the database were used for the preliminary optimization process and validation of the fibula plate. The implant was constructed to fit bilaterally in a lateral position on the fibula. A biomechanical comparison of the designed implant with the current gold standard in the treatment of distal fibular fractures (the locking 1/3 tubular plate) was then conducted. Finally, a clinical surveillance study was performed to evaluate the grade of implant fit achieved. The results showed that, with a virtual anatomical database, it was possible to design a fibula plate with an optimized fit for a large proportion of the population. Biomechanical testing showed the novel fibula plate to be superior to 1/3 tubular plates in 4-point bending tests. The clinical application showed a very high degree of primary implant fit; only in a small minority of cases was further intra-operative implant bending necessary. The goal of developing an implant for the treatment of distal fibular fractures based on the evidence of a large anatomical database could therefore be attained: biomechanical testing showed good results regarding stability, and the clinical application confirmed the high grade of anatomical fit.
Abstract:
OBJECTIVE: During postnatal development, mammalian articular cartilage acts as a surface growth plate for the underlying epiphyseal bone. Concomitantly, it undergoes a fundamental process of structural reorganization from an immature isotropic to a mature (adult) anisotropic architecture. However, the mechanism underlying this structural transformation is unknown. It could involve either an internal remodelling process, or complete resorption followed by tissue neoformation. The aim of this study was to establish which of these two alternative tissue reorganization mechanisms is physiologically operative. We also wished to pinpoint the articular cartilage source of the stem cells for clonal expansion and the zonal location of the chondrocyte pool with high proliferative activity. METHODS: The New Zealand white rabbit served as our animal model. The analysis was confined to the high-weight-bearing (central) areas of the medial and lateral femoral condyles. After birth, the articular cartilage layer was evaluated morphologically at monthly intervals from the first to the eighth postnatal month, when this species attains skeletal maturity. The overall height of the articular cartilage layer at each juncture was measured. The growth performance of the articular cartilage layer was assessed by calcein labelling, which permitted an estimation of the daily growth rate of the epiphyseal bone and its monthly length-gain. The slowly proliferating stem-cell pool was identified immunohistochemically (after labelling with bromodeoxyuridine), and the rapidly proliferating chondrocyte population by autoradiography (after labelling with (3)H-thymidine). RESULTS: The growth activity of the articular cartilage layer was highest 1 month after birth. It declined precipitously between the first and third months, and ceased between the third and fourth months, when the animal enters puberty. 
The structural maturation of the articular cartilage layer followed a corresponding temporal trend. During the first 3 months, when the articular cartilage layer is undergoing structural reorganization, the net length-gain in the epiphyseal bone exceeded the height of the articular cartilage layer. This finding indicates that the postnatal reorganization of articular cartilage from an immature isotropic to a mature anisotropic structure is not achieved by a process of internal remodelling, but by the resorption and neoformation of all zones except the most superficial (stem-cell) one. The superficial zone was found to consist of slowly dividing stem cells with bidirectional mitotic activity. In the horizontal direction, this zone furnishes new stem cells that replenish the pool and effect a lateral expansion of the articular cartilage layer. In the vertical direction, the superficial zone supplies the rapidly dividing, transit-amplifying daughter-cell pool that feeds the transitional and upper radial zones during the postnatal growth phase of the articular cartilage layer. CONCLUSIONS: During postnatal development, mammalian articular cartilage fulfils a dual function, viz., it acts not only as an articulating layer but also as a surface growth plate. In the lapine model, this growth activity ceases at puberty (3-4 months of age), whereas that of the true (metaphyseal) growth plate continues until the time of skeletal maturity (8 months). Hence, the two structures are regulated independently. The structural maturation of the articular cartilage layer coincides temporally with the cessation of its growth activity - for the radial expansion and remodelling of the epiphyseal bone - and with sexual maturation. That articular cartilage is physiologically reorganized by a process of tissue resorption and neoformation, rather than by one of internal remodelling, has important implications for the functional engineering and repair of articular cartilage tissue.
Abstract:
This paper provides insight into the development of a process model for the necessary expansion of an automatic miniload warehouse. The model is based on a literature review and covers four phases of a warehouse expansion: the preparatory phase, the current-state analysis, the design phase, and the decision-making phase. In addition to the literature review, the presented model is based on a reliable data set and can be applied with reasonable effort to support an informed decision on the warehouse layout. The model is addressed to users who are typically employees of the logistics department, and is oriented toward improving daily business organization in combination with warehouse expansion planning.
Abstract:
Simulation techniques are almost indispensable in the analysis of complex systems. Material-flow and related information-flow processes in logistics often possess such complexity. Further problems arise as the processes change over time and pose a Big Data problem as well. To cope with these issues, adaptive simulations are used more and more frequently. This paper presents a few relevant advanced simulation models and introduces a novel model structure that unifies the modelling of geometrical relations and time processes. In this way, the process structure and its geometric relations can be handled in a well-understandable and transparent manner. The capabilities and applicability of the model are also presented via a demonstrational example.
Abstract:
Current models of embryological development focus on intracellular processes such as gene expression and protein networks, rather than on the complex relationship between subcellular processes and the collective cellular organization these processes support. We have explored this collective behavior in the context of neocortical development, by modeling the expansion of a small number of progenitor cells into a laminated cortex with layer- and cell-type-specific projections. The developmental process is steered by a formal language analogous to genomic instructions, and takes place in a physically realistic three-dimensional environment. A common genome inserted into the individual cells controls their individual behaviors, and thereby gives rise to collective developmental sequences in a biologically plausible manner. The simulation begins with a single progenitor cell containing the artificial genome. This progenitor then gives rise, through a lineage of offspring, to distinct populations of neuronal precursors that migrate to form the cortical laminae. The precursors differentiate by extending dendrites and axons, which reproduce the experimentally determined branching patterns of a number of different neuronal cell types observed in the cat visual cortex. This result is the first comprehensive demonstration of the principles of self-construction by which the cortical architecture develops. In addition, our model makes several testable predictions concerning cell migration and branching mechanisms.
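As a toy illustration only (the abstract does not give the authors' formal language), genome-like rewrite rules steering a cell lineage might look like the following, where each symbol is a cell type and each rule rewrites a dividing cell into its daughters:

```python
# Hypothetical rules, not the paper's genome: "P" is a progenitor that
# divides asymmetrically into a progenitor and a neuronal precursor "N";
# precursors persist (in the full model they would migrate and differentiate).
RULES = {
    "P": ["P", "N"],
    "N": ["N"],
}

def develop(lineage, steps):
    """Apply the division rules to every cell for a number of steps."""
    for _ in range(steps):
        lineage = [daughter for cell in lineage for daughter in RULES[cell]]
    return lineage

tissue = develop(["P"], 5)  # one progenitor plus five precursors
```

Because every cell carries the same rule set, population-level structure emerges purely from each cell's local behavior, which is the self-construction principle the paper demonstrates at much larger scale.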
Abstract:
The author perceives endogenous development as a social learning process constructed by all the actors involved. To enhance social learning, a methodology called Autodidactic Learning for sustainability is used, in which the perceptions of both local and external actors are highlighted. Reflecting on differences, conflicts, and common interests leads to highly motivated debate and shared reflection, which is almost identical to social learning and flattens the usual hierarchy between local and external actors. The article shows that the energies generated through collective learning can trigger important technical, social, and political changes that take into account the multiple dimensions of local reality.
Abstract:
The policy development process leading to the Labour government's white paper of December 1997—The new NHS: Modern, Dependable—is the focus of this project, and the public policy development literature is used to aid in the understanding of this process. Policy makers who had been involved in the development of the white paper were interviewed in order to acquire a thorough understanding of who was involved in this process and how they produced the white paper. A theoretical framework is used that sorts policy development models into those that focus on knowledge and experience and those that focus on politics and influence. This framework is central to understanding the evidence gathered from the individuals and associations that participated in this policy development process. The main research question of this project is to what extent either of these sets of policy development models aids in understanding and explicating the process by which the Labour government's policies were developed. The interview evidence, along with published evidence, shows that a clear pattern of policy change emerged from this policy development process, and the Knowledge-Experience and Politics-Influence policy-making models both assist in understanding it. The early stages of the policy development process were hierarchical and iterative, yet also very collaborative among those participating, with knowledge and experience being quite prevalent. At every point in the process, however, informal networks of political influence were used and were noted as quite prevalent by all of the individuals interviewed. The later stages of the process then became increasingly non-inclusive, with decisions made by a select group of internal and external policy makers. These policy-making models became an important tool with which to understand the policy development process.
This Knowledge-Experience and Politics-Influence dichotomy of policy development models could therefore be useful in analyzing other types of policy development.