948 results for Microtissue culture models
Abstract:
In today's business world, where competition among companies is intense, quality in manufacturing and in the provision of products and services can be regarded as a means of pursuing excellence and success in the competitive arena. With the advent of e-commerce and the emergence of new production systems and new organizational structures, traditional management and quality assurance systems have been challenged. Consequently, the quality information system has gained a special place as one of the new tools of quality management. This paper studies the quality information system through a review of the literature and investigates the role and position of the Quality Information System (QIS) among the other information systems of an organization. Existing QIS models are analyzed, and by assessing the models presented in the field a conceptual, hierarchical model of the quality information system is proposed and examined. As a case study, the hierarchical QIS model is developed by evaluating hierarchical models presented in the field of quality information systems, based on the Shetabkar Co.
Abstract:
Fisheries and aquaculture are important for food security and income generation, and are critical to the long-term sustainability of many countries. Freshwater prawns have been harvested in the streams and creeks of Vanuatu; however, due to over-exploitation, catches have declined in recent years. To satisfy the high demand for this product, the Vanuatu government intends to establish economically viable small-scale aquaculture industries. The current project showed that wild Macrobrachium lar in Vanuatu constitute a single population for management purposes, and that M. rosenbergii grows much faster than M. lar in simple pond grow-out systems and is hence the better species for culture in Vanuatu.
Abstract:
This research used a multiple-case study approach to empirically investigate the complex relationship between factors influencing inter-project knowledge sharing: trustworthiness, organizational culture, and knowledge-sharing mechanisms. Adopting a competing values framework, we found evidence of patterns between the type of culture at the project management unit level and, at the individual level, project managers' perceptions of valuing trustworthy behaviors and the way they share knowledge. We also found evidence of a mutually reinforcing effect of trust and clan culture, which shapes tacit knowledge-sharing behaviors.
Abstract:
Public sector organisations (PSOs) operate in information-intensive environments, often within operational contexts where efficiency is a goal. Moreover, the rapid adoption of IT is expected to facilitate good governance within public sector organisations, but it often clashes with the bureaucratic culture of these organisations. Accordingly, models such as IT Governance (ITG) and government reform, in particular the new public management (NPM), were introduced in PSOs in an effort to address the inefficiencies of bureaucracy and underperformance. This work explores the potential effect of changes in political direction and policy on the stability of IT governance in Australian public sector organisations. The aim of this paper is to examine the implications of a change of government, and the resulting political environment, for the effectiveness of the audit function of ITG. The empirical data discussed here indicate that a number of aspects of audit functionality were negatively affected by the change in political direction and the resulting policy changes. The results indicate a perceived decline in capacity and capability, which in turn disrupts the stability of IT governance systems in public sector organisations.
Abstract:
The previous chapters gave an insightful introduction into the various facets of Business Process Management. We now share a rich understanding of the essential ideas behind designing and managing processes for organizational purposes. We have also learned about the various streams of research and development that have influenced contemporary BPM. As a matter of fact, BPM has become a holistic management discipline. As such, it requires that a plethora of facets be addressed for its successful and sustainable application. This chapter provides a framework that consolidates and structures the essential factors that constitute BPM as a whole. Drawing from research in the field of maturity models, we suggest six core elements of BPM: strategic alignment, governance, methods, information technology, people, and culture. These six elements serve as the structure for this BPM Handbook.
Abstract:
Modern copyright law is based on the inescapable assumption that users, given the choice, will free-ride rather than pay for access. In fact, many consumers of cultural works – music, books, films, games, and other works – fundamentally want to support their production. It turns out that humans are motivated to support cultural production not only by extrinsic incentives, but also by social norms of fairness and reciprocity. This article explains how producers across the creative industries have used this insight to develop increasingly sophisticated business models that rely on voluntary payments (including pay-what-you-want schemes) to fund their costs of production. The recognition that users are not always free-riders suggests that current policy approaches to copyright are fundamentally flawed. Because social norms are so important in consumer motivations, the perceived unfairness of the current copyright system undermines the willingness of people to pay for access to cultural goods. While recent copyright reform debate has focused on creating stronger deterrence through enforcement, increasing the perceived fairness and legitimacy of copyright law is likely to be much more effective. The fact that users will sometimes willingly support cultural production also challenges the economic raison d'être of copyright law. This article demonstrates how 'peaceful revolutions' are flipping conventional copyright models and encouraging free-riding through combining incentives and prosocial norms. Because they provide a means to support production without limiting the dissemination of knowledge and culture, there is good reason to believe that these commons-based systems of cultural production can be more efficient, more fair, and more conducive to human flourishing than conventional copyright systems. 
This article explains what we know about free-riding so far and what work remains to be done to understand the viability and importance of cooperative systems in funding cultural production.
Abstract:
The effects of oxygen availability and induction culture biomass on the production of an industrially important monoamine oxidase (MAO) were investigated in fed-batch cultures of a recombinant E. coli. For each induction cell biomass, two different oxygenation methods were used: aeration and oxygen-enriched air. Induction at higher biomass levels increased the culture's demand for oxygen, leading to fermentative metabolism and accumulation of high levels of acetate in the aerated cultures. Paradoxically, despite an almost eightfold increase in acetate accumulation, to levels widely reported to be highly detrimental to protein production, when induction wet cell weight (WCW) rose from 100% to 137.5%, MAO specific activity in these aerated processes showed a threefold increase. By contrast, for oxygenated cultures induced at WCWs of 100% and 137.5%, specific activity levels were broadly similar, but fell rapidly after the maxima were reached. Induction at high biomass levels (WCW 175%) led to very low levels of specific MAO activity relative to induction at lower WCWs in both aerated and oxygenated cultures. Oxygen enrichment of these cultures was a useful strategy for boosting specific growth rates, but did not have positive effects on specific enzyme activity. Based on our findings, consideration of the amino acid composition of MAO, and previous studies on related enzymes, we propose that this effect is due to oxidative damage to the MAO enzyme itself during these highly aerobic processes. Thus, the optimal process for MAO production is aerated, not oxygenated, and induced at moderate cell density, and clearly represents a compromise between the effects of oxygen supply on specific growth rate and induction cell density, acetate accumulation, and high specific MAO activity.
This work shows that the negative effects of oxygen previously reported in free enzyme preparations are not limited to these acellular environments but are also discernible in the sheltered environment of the cytosol of E. coli cells.
Abstract:
Development of researchers through higher degree research studies is a high priority in most universities. Yet research on supervision as pedagogy, and on models of supervision, has only recently gained increasing attention. Charged with producing good researchers within very limited resources, academics are constantly looking for more efficient models of supervision for higher degree research students. A cohort model of supervision promises several efficiencies, but we argue that its success depends largely on how well the cohort is developed specifically for higher degree research studies. We drew on a growing body of literature on higher degree research supervision to design, implement and evaluate our approach to developing a cohort of seven students enrolled in the Master of Education (Research) degree. Our approach included four provisions: an initial residential workshop, development of a learning community, nourishing scholarship, and ongoing learning opportunities. The four provisions gradually developed an environment and culture that students found very supportive and nurturing. This paper is based on findings from student evaluations in the first year of studies, feedback from the cohort's sponsor, and our reflective notes. The evaluation substantiated the value of investing time and resources in purposely developing a cohort for higher degree research studies. Whether cohorts are sponsored or not, universities will still need to invest time and resources in cohort development if a cohort model is to gain wider efficiencies in the supervision of higher degree research students.
Abstract:
Software to create individualised finite element (FE) models of the osseoligamentous spine using pre-operative computed tomography (CT) data-sets for spinal surgery patients has recently been developed. This study presents a geometric sensitivity analysis of this software to assess the effect of intra-observer variability in user-selected anatomical landmarks. User-selected landmarks on the osseous anatomy were defined from CT data-sets for three scoliosis patients and these landmarks were used to reconstruct patient-specific anatomy of the spine and ribcage using parametric descriptions. The intra-observer errors in landmark co-ordinates for these anatomical landmarks were calculated. FE models of the spine and ribcage were created using the reconstructed anatomy for each patient and these models were analysed for a loadcase simulating clinical flexibility assessment. The intra-observer error in the anatomical measurements was low in comparison to the initial dimensions, with the exception of the angular measurements for disc wedge and zygapophyseal joint (z-joint) orientation and disc height. This variability suggested that CT resolution may influence such angular measurements, particularly for small anatomical features, such as the z-joints, and may also affect disc height. The results of the FE analysis showed low variation in the model predictions for spinal curvature with the mean intra-observer variability substantially less than the accepted error in clinical measurement. These findings demonstrate that intra-observer variability in landmark point selection has minimal effect on the subsequent FE predictions for a clinical loadcase.
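The intra-observer analysis described above amounts to comparing repeated landmark selections against their mean position. As a minimal sketch (not the study's actual software; the array layout and function name are assumptions for illustration), the per-landmark error could be computed as:

```python
import numpy as np

def intra_observer_error(picks):
    """picks: array of shape (n_repeats, n_landmarks, 3) holding repeated
    3-D landmark selections by one observer on the same CT data-set.
    Returns the mean Euclidean deviation of each landmark from its mean position."""
    mean_pos = picks.mean(axis=0, keepdims=True)          # (1, n_landmarks, 3)
    deviations = np.linalg.norm(picks - mean_pos, axis=2) # (n_repeats, n_landmarks)
    return deviations.mean(axis=0)                        # (n_landmarks,)
```

Comparing such per-landmark errors against the feature's own dimensions is what flags small structures such as the z-joints as sensitive to CT resolution.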
Abstract:
Autonomous navigation and picture compilation tasks require robust feature descriptions or models. Given the non-Gaussian nature of sensor observations, it will be shown that Gaussian mixture models provide a general probabilistic representation allowing analytical solutions to the update and prediction operations in the general Bayesian filtering problem. Each operation in the Bayesian filter for Gaussian mixture models multiplicatively increases the number of parameters in the representation, leading to the need for a re-parameterisation step. A computationally efficient re-parameterisation step will be demonstrated, resulting in a compact and accurate estimate of the true distribution.
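The re-parameterisation (mixture-reduction) step mentioned above typically collapses similar components by moment matching; the abstract does not give the authors' exact scheme, but a standard one-dimensional building block, merging two weighted Gaussians into one with the same overall mean and variance, looks like this:

```python
def merge_components(w1, m1, v1, w2, m2, v2):
    """Moment-matched merge of two 1-D Gaussian components, each given as
    (weight, mean, variance). Returns the single component that preserves
    the merged pair's total weight, mean, and variance."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    # E[x^2] of the pair minus the new mean squared gives the merged variance.
    v = (w1 * (v1 + m1**2) + w2 * (v2 + m2**2)) / w - m**2
    return w, m, v
```

Repeatedly merging the pair of components whose merge costs the least (e.g. by a divergence bound) keeps the mixture compact after each filter update.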
Abstract:
This study investigated the effect of a calcium phosphate (CaP) coating on a polycaprolactone melt-electrospun scaffold, and of in vitro culture conditions, on ectopic bone formation in a subcutaneous rat model. The CaP coating resulted in increased alkaline phosphatase (ALP) activity in ovine osteoblasts regardless of the culture conditions, and this was also translated into higher levels of mineralisation. A subcutaneous implantation was performed, and increasing ectopic bone formation was observed over time for the CaP-coated samples previously cultured in osteogenic media, whereas the corresponding non-coated samples displayed a lag phase before bone formation occurred from 4 to 8 weeks post-implantation. Histology and immunohistochemistry revealed bone fill through the scaffolds 8 weeks post-implantation for coated and non-coated specimens, and showed that ALP, osteocalcin and collagen I were present at the ossification front and in the bone tissues. Vascularisation in the vicinity of the bone tissues was also observed, indicating that the newly formed bone was not deprived of oxygen and nutrients. We found that in vitro osteogenic induction was essential for achieving bone formation and that the CaP coating accelerated the osteogenic process. We conclude that high cell density and preservation of the collagenous and mineralised extracellular matrix secreted in vitro are factors of importance for ectopic bone formation.
Abstract:
Automated process discovery techniques aim at extracting process models from information system logs. Existing techniques in this space are effective when applied to relatively small or regular logs, but generate spaghetti-like and sometimes inaccurate models when confronted with logs with high variability. In previous work, trace clustering has been applied in an attempt to reduce the size and complexity of automatically discovered process models. The idea is to split the log into clusters and to discover one model per cluster. This leads to a collection of process models – each one representing a variant of the business process – as opposed to an all-encompassing model. Still, models produced in this way may exhibit unacceptably high complexity and low fitness. In this setting, this paper presents a two-way divide-and-conquer process discovery technique, wherein the discovered process models are split on the one hand by variants and on the other hand hierarchically using subprocess extraction. Splitting is performed in a controlled manner in order to achieve user-defined complexity or fitness thresholds. Experiments on real-life logs show that the technique produces collections of models substantially smaller than those extracted by applying existing trace clustering techniques, while allowing the user to control the fitness of the resulting models.
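As an illustration of the trace-clustering idea (one model per cluster of similar traces), a deliberately simple greedy scheme, which is not the paper's technique, might group traces by the similarity of their activity sets before running discovery on each group:

```python
def cluster_traces(log, threshold=0.5):
    """Greedy leader clustering of event-log traces.

    log: list of traces, each trace a list of activity names.
    A trace joins the first existing cluster whose representative activity
    set has Jaccard similarity >= threshold; otherwise it founds a new cluster.
    """
    clusters = []  # each entry: (representative activity set, member traces)
    for trace in log:
        acts = set(trace)
        for rep, members in clusters:
            jaccard = len(acts & rep) / len(acts | rep)
            if jaccard >= threshold:
                members.append(trace)
                break
        else:
            clusters.append((acts, [trace]))
    return [members for _, members in clusters]
```

A discovery algorithm would then be applied to each returned sublist, yielding one variant model per cluster, which is exactly the collection-of-models shape the paper's controlled splitting refines further.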
Abstract:
Techniques for evaluating and selecting multivariate volatility forecasts are not yet as well understood as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a set of competing forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that considering the particular application of forecasts is not necessarily the most effective basis on which to select models.
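The abstract does not spell out the exact loss; a common likelihood-based choice for covariance forecasts is the multivariate Gaussian quasi-likelihood, sketched below under the assumption that forecasts are covariance matrices H evaluated against realised return vectors r:

```python
import numpy as np

def likelihood_loss(H_forecast, r):
    """Negative Gaussian quasi-log-likelihood loss (up to constants) for a
    covariance forecast H_forecast against a return vector r; lower is better."""
    sign, logdet = np.linalg.slogdet(H_forecast)
    return logdet + r @ np.linalg.solve(H_forecast, r)
```

Averaging this loss over the evaluation sample for each competing model gives the ranking criterion that the paper finds more discriminating than portfolio-based alternatives.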
Abstract:
Process modeling is a widely used concept for understanding, documenting and also redesigning the operations of organizations. The validation and use of process models is, however, hampered by the fact that only business analysts fully understand them in detail; this is a particular problem because business analysts are typically not domain experts. In this paper, we investigate to what extent the concept of verbalization can be adapted from object-role modeling to process models. To this end, we define an approach which automatically transforms BPMN process models into natural language texts, combining different techniques from linguistics and graph decomposition in a flexible and accurate manner. The evaluation of the technique is based on a prototypical implementation and involves a test set of 53 BPMN process models, showing that natural language texts can be generated in a reliable fashion.
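As a toy illustration of the verbalization idea (far simpler than the linguistic and graph-decomposition techniques the paper combines), a purely linear sequence of BPMN tasks could be turned into text as follows; the role and action labels are hypothetical:

```python
def verbalize(activities):
    """Naive sequential verbalization of a linear BPMN fragment.
    activities: list of (role, action) pairs, one per task, in control-flow order."""
    sentences = []
    for i, (role, action) in enumerate(activities):
        opener = "First," if i == 0 else "Then,"
        sentences.append(f"{opener} the {role} {action}.")
    return " ".join(sentences)
```

Real models, of course, contain gateways and loops, which is why the paper needs graph decomposition to linearise the structure before sentence generation.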
Abstract:
The motion response of marine structures in waves can be studied using finite-dimensional, linear time-invariant approximating models. These models, obtained using system identification with data computed by hydrodynamic codes, find application in offshore training simulators, hardware-in-the-loop simulators for positioning control testing, and also in initial designs of wave-energy conversion devices. Different proposals have appeared in the literature to address the identification problem in both the time and frequency domains, and recent work has highlighted the superiority of the frequency-domain methods. This paper summarises practical frequency-domain estimation algorithms that use constraints on model structure and parameters to refine the search for approximating parametric models. Practical issues associated with the identification are discussed, including the influence of radiation model accuracy on force-to-motion models, which are usually the ultimate modelling objective. The illustration examples in the paper are obtained using a freely available MATLAB toolbox developed by the authors, which implements the estimation algorithms described.
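As a minimal, unconstrained illustration of frequency-domain parametric estimation (not the constrained algorithms in the authors' MATLAB toolbox), a first-order model H(jw) ≈ b / (jw + a) can be fitted to frequency-response data by linearised least squares in the style of Levy's method, multiplying through by the denominator so the unknowns enter linearly:

```python
import numpy as np

def fit_first_order(w, H):
    """Fit H(jw) ~ b / (jw + a) by linearised least squares.
    Rearranging b - a*H(jw) = jw*H(jw) is linear in the unknowns (b, a);
    real and imaginary parts are stacked so the solve is over the reals."""
    A = np.column_stack([np.ones_like(H), -H])
    y = 1j * w * H
    A_ri = np.vstack([A.real, A.imag])
    y_ri = np.concatenate([y.real, y.imag])
    (b, a), *_ = np.linalg.lstsq(A_ri, y_ri, rcond=None)
    return b, a
```

Higher-order rational models follow the same pattern with more columns; the constrained algorithms the paper describes additionally enforce structure such as stability and relative degree on the fitted model.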