918 results for Additive Fertigungsverfahren, Wirtschaftlichkeit, Qualität, Pre-Process,
Abstract:
A biomass pretreatment process was developed using acidified ionic liquid (IL) solutions containing 10-30% water. Pretreatment of sugarcane bagasse at 130°C for 30 min in an aqueous 1-butyl-3-methylimidazolium chloride (BMIMCl) solution containing 1.2% HCl resulted in a glucan digestibility of 94-100% after 72 h of enzymatic hydrolysis. HCl was found to be a more effective catalyst than H₂SO₄ or FeCl₃. Increasing the acid concentration (from 0.4% to 1.2%) and the reaction temperature (from 90 to 130°C) increased glucan digestibility. The glucan digestibility of the solid residue obtained with acidified BMIMCl solution that had been re-used three times was >97%. The addition of water to ILs for pretreatment could significantly reduce IL solvent costs and allow for increased biomass loadings, making pretreatment by ILs a more economic proposition.
Abstract:
This study focuses on the conversion of biomass waste, in the form of date seed waste, into activated carbon and biofuel using a fixed bed pyrolysis reactor, yielding gaseous, liquid, and solid products. The date seed, in particle form, is pyrolysed in an externally heated fixed bed reactor with nitrogen as the carrier gas. The reactor is heated from 400°C to 600°C. A maximum liquid yield of 50 wt.% and a char yield of 30 wt.% are obtained at a reactor bed temperature of 500°C with a running time of 120 minutes. The oil is found to possess a favorable flash point and reasonable density and viscosity. The higher calorific value is found to be 28.636 MJ/kg, which is significantly higher than that of other biomass-derived oils. Decolorization of 85-97% is recorded for the textile effluent and 75-90% for the tannery effluent, in all cases decreasing with increasing temperature. The prepared activated carbon shows good adsorption capacity for diluted textile and tannery effluents.
Abstract:
In this paper, the goal of identifying disease subgroups based on differences in observed symptom profiles is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. Here, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by the placement of the Dirichlet Process (DP) on the unknown components of a mixture model, allowing for the expression of uncertainty about the partitioning of observed data into homogeneous subgroups. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS). Keywords: Clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
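As a rough illustration of this kind of model (not the authors’ exact formulation, which would be fitted with full Bayesian inference), the truncated variational approximation to a DP mixture available in scikit-learn can cluster symptom-score vectors while letting the data decide how many subgroups to use; the data and settings below are hypothetical.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Hypothetical UPDRS-style symptom profiles: one row per patient, one column per item score.
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(1.0, 0.5, size=(60, 8)),   # milder phenotype
    rng.normal(3.0, 0.7, size=(40, 8)),   # more severe phenotype
])

# Truncated Dirichlet process mixture of Gaussians: the concentration prior lets the
# model switch off surplus components, so the number of subgroups is inferred from data.
dpm = BayesianGaussianMixture(
    n_components=10,                                   # truncation level
    weight_concentration_prior_type="dirichlet_process",
    weight_concentration_prior=0.5,
    covariance_type="full",
    random_state=0,
).fit(X)

labels = dpm.predict(X)
print("effective clusters:", int(np.sum(dpm.weights_ > 0.01)))
```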
Abstract:
Wayfinding is the process of finding your way to a destination in a familiar or unfamiliar setting using any cues given by the environment. Due to its ubiquity in everyday life, wayfinding appears on the surface to be a simply characterised and understood process; however, this very ubiquity, and the resulting need to refine and optimise wayfinding, has led to a great number of studies revealing that it is in fact a deeply complex exercise. In this paper we examine the motivations for investigating wayfinding, with particular attention paid to the unique challenges faced in transportation hubs, and discuss the associated principles and factors involved as they have been perceived from different research perspectives. We also review the approaches used to date in the modelling of wayfinding in various contexts. We attempt to draw together the different perspectives applied to wayfinding and postulate the importance of wayfinding and the need to understand this seemingly simple, but concurrently complex, process.
Abstract:
A model has been developed to track the flow of cane constituents through the milling process. While previous models have tracked the flow of fibre, brix and water through the process, this model tracks the soluble and insoluble solid cane components using modelling theory and experimental data, assisting in further understanding the flow of constituents into mixed juice and final bagasse. The work provided an opportunity to understand the factors which affect the distribution of the cane constituents in juice and bagasse. Application of the model should lead to improvements in the overall performance of the milling train.
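To illustrate the kind of bookkeeping such a model performs (a simplified sketch, not the paper's actual model), the fragment below tracks constituents through successive mills, assuming each unit splits its feed between juice and bagasse according to hypothetical extraction fractions; imbibition water addition between mills is ignored for simplicity.

```python
from dataclasses import dataclass

@dataclass
class Stream:
    fibre: float   # t/h insoluble solids
    brix: float    # t/h soluble solids
    water: float   # t/h water

def mill_unit(feed: Stream, brix_extraction: float, water_extraction: float):
    """Split a feed stream into juice and bagasse using assumed extraction fractions."""
    juice = Stream(0.0, feed.brix * brix_extraction, feed.water * water_extraction)
    bagasse = Stream(feed.fibre, feed.brix - juice.brix, feed.water - juice.water)
    return juice, bagasse

# Hypothetical cane feed and per-mill extraction fractions.
feed = Stream(fibre=30.0, brix=35.0, water=135.0)
mixed_juice = Stream(0.0, 0.0, 0.0)
for brix_e, water_e in [(0.70, 0.65), (0.50, 0.45), (0.40, 0.35)]:
    juice, feed = mill_unit(feed, brix_e, water_e)
    mixed_juice = Stream(0.0, mixed_juice.brix + juice.brix, mixed_juice.water + juice.water)

print("mixed juice brix (t/h):", round(mixed_juice.brix, 2))
print("final bagasse brix (t/h):", round(feed.brix, 2))
```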
Abstract:
Quality oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Moreover, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed through probability, facilitate decision making and cost-effectiveness analyses.

This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; a data capturing algorithm using Bayesian decision making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle. Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. In addition, adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart’s signal. This estimate aims to facilitate root cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach yields highly informative estimates of the change point parameters, since the results are obtained in the form of probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including a step change, a linear trend and multiple changes in a Poisson process, are developed and investigated.
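As a minimal sketch of the step-change scenario for a Poisson process (illustrative only, using PyMC with synthetic counts rather than the hospital data or the exact priors of this thesis), the posterior of the change point time can be sampled directly:

```python
import numpy as np
import pymc as pm

# Synthetic monthly adverse-event counts with a step change at t = 40.
rng = np.random.default_rng(42)
counts = np.concatenate([rng.poisson(4.0, 40), rng.poisson(7.0, 20)])
n = len(counts)

with pm.Model():
    tau = pm.DiscreteUniform("tau", lower=0, upper=n - 1)      # change point time
    lam_before = pm.Gamma("lam_before", alpha=2.0, beta=0.5)   # rate before the change
    lam_after = pm.Gamma("lam_after", alpha=2.0, beta=0.5)     # rate after the change
    rate = pm.math.switch(tau >= np.arange(n), lam_before, lam_after)
    pm.Poisson("obs", mu=rate, observed=counts)
    idata = pm.sample(2000, tune=1000, progressbar=False)

# The posterior of tau gives a probability distribution over the true change time.
print(idata.posterior["tau"].to_series().value_counts().head())
```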
The benefits of change point investigation are revisited and promoted in the monitoring of hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared with a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and of major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance for processes in which pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is reinforced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in this quality control context may also extend to the industrial and business domains where quality monitoring was initially developed.
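For intuition about the risk-adjusted charts whose signals these estimators follow up, a minimal sketch in the spirit of a risk-adjusted CUSUM for binary outcomes is given below; the predicted risks, tested odds ratio and control limit are hypothetical, not values from the thesis.

```python
import numpy as np

def risk_adjusted_cusum(outcomes, risks, odds_ratio=2.0):
    """Risk-adjusted CUSUM for binary outcomes (e.g., 30-day mortality).

    outcomes: 0/1 observed outcomes per patient.
    risks: predicted probability of the adverse outcome from a risk model.
    odds_ratio: out-of-control odds ratio tested against the in-control value of 1.
    """
    s, path = 0.0, []
    for y, p in zip(outcomes, risks):
        # log-likelihood ratio weight for one patient
        denom = 1.0 - p + odds_ratio * p
        w = np.log(odds_ratio / denom) if y == 1 else np.log(1.0 / denom)
        s = max(0.0, s + w)
        path.append(s)
    return np.array(path)

rng = np.random.default_rng(1)
risks = rng.uniform(0.02, 0.3, size=200)            # hypothetical predicted risks
outcomes = rng.binomial(1, risks)                   # in-control outcomes
chart = risk_adjusted_cusum(outcomes, risks)
print("signal raised:", bool((chart > 4.5).any()))  # 4.5 is an illustrative control limit
```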
Abstract:
Evidence is mounting that values education is providing positive outcomes for students, teachers and schools (Benninga, Berkowitz, Kuehn, & Smith, 2006; DEST, 2008; Hattie, 2003; Lovat, Clement, Dally, & Toomey, 2010). Despite this, Australian pre-service teacher education does not appear to be changing in the ways necessary to equip teachers to teach with a values focus (Lovat, Dally, Clement, & Toomey, 2011). This article presents findings from a case study that explored current teachers’ perceptions of the skills pre-service teachers need to teach values education effectively. Teachers who currently teach with a values focus highlighted that pre-service teacher education degrees need to encourage an ongoing commitment to continual learning, critical reflection and growth in pre-service teachers, along with excellent questioning and listening skills. Further, they argued that pre-service teachers need to be skilled in recognising and responding to student diversity. The article ends by arguing for some changes that need to occur in pre-service teacher education in order for teachers to teach effectively with a values focus, including the need for stronger connections between pre-service and experienced teachers.
Abstract:
The representation of business process models has been a continuing research topic for many years now. However, many process model representations have not developed beyond minimally interactive 2D icon-based representations of directed graphs and networks, with little or no annotation for information overlays. With the rise of desktop computers and commodity mobile devices capable of supporting rich interactive 3D environments, we believe that much of the research performed in human-computer interaction, virtual reality, games and interactive entertainment has great potential in areas of BPM: to engage, provide insight, and promote collaboration amongst analysts and stakeholders alike. This initial visualization workshop seeks to initiate the development of a high quality international forum to present and discuss research in this field. Via this workshop, we intend to create a community to unify and nurture the development of process visualization topics as a continuing research area.
Abstract:
New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom up" component testing approach combined with "top down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
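To illustrate the error-budget idea (a simplified sketch with hypothetical component figures, not measured values from the paper), the worst-case time error of a synchronization chain can be budgeted by summing the contribution of each component and comparing the total against the application's limit:

```python
# Hypothetical worst-case time-error contributions along a PTP chain, in nanoseconds.
# Values are illustrative only; real figures come from testing each device.
error_budget_ns = 1000          # e.g. sampled value process bus: 1 us total
contributions_ns = {
    "grandmaster clock": 100,
    "boundary clock": 200,
    "transparent clocks (3 hops)": 3 * 50,
    "slave clock (merging unit)": 300,
    "network asymmetry allowance": 150,
}

spent = sum(contributions_ns.values())
print(f"budget spent: {spent} ns of {error_budget_ns} ns "
      f"({100 * spent / error_budget_ns:.0f}%), margin: {error_budget_ns - spent} ns")
```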
Abstract:
AR process modelling movie presented at the Gartner BPM Summit in Sydney, August 2011. The video shows us using the MS Surface at QUT to perform collaborative process modelling.
Abstract:
This chapter revolves around research-based insights into the entrepreneurial process. By this is meant the process of setting up a new business activity resulting in a new market offer. This new offer may be made by a new or an existing firm, although the main focus here is on the start-up of new, independent firms. Further, the new offer may be innovative, bringing to the market something that was not offered before, or imitative, i.e., a new competitor enters the market with products or services very similar to what other firms are already offering. Although the latter type of process may be less complex and also have less market impact, it still entails most of the steps that typically have to be taken in order to get a business up and running. If successful, it also shares, at least to some degree, the consequences that signify entrepreneurial processes:
- it gives consumers new choice alternatives
- it gives incumbent firms reason to shape up
- it attracts additional followers to enter the market, further reinforcing the first two effects (Davidsson, 2004).
Besides, imitative start-ups outnumber innovative ones by far (Reynolds et al., 2003; Samuelsson and Davidsson, 2009).
Abstract:
This paper re-conceptualises the notion of improvisation as it has developed in schools since the 1960s. It outlines the theoretical case for naming the distinctive improvisational practice that has emerged in schools - namely Process Drama. The paper sketches seven characteristics of Process Drama and then outlines the case for seeing it as an emerging but robust form of dramatic art. It concludes by arguing that drama teachers, in developing process drama, have created a new form of both education and art.
Abstract:
In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
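As a rough sketch of the ABC rejection step described above (illustrative only: a simple Poisson model stands in for the stochastic epidemic and macroparasite models, and the summaries, prior and tolerance are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    # Hypothetical stochastic model: Poisson counts as a stand-in for an epidemic model.
    return rng.poisson(theta, size=n)

def summary(x):
    # Simple summary statistics of one simulated data set.
    return np.array([x.mean(), x.var()])

observed = simulate(3.0)
s_obs = summary(observed)

# Pre-computed model simulations over prior draws (re-usable across candidate designs).
prior_draws = rng.uniform(0.1, 10.0, size=20000)
sims = np.array([summary(simulate(t)) for t in prior_draws])

# ABC rejection: keep draws whose summaries fall within a tolerance of the observed ones.
dist = np.linalg.norm(sims - s_obs, axis=1)
tol = np.quantile(dist, 0.01)
posterior = prior_draws[dist <= tol]

# Precision-based utility: smaller ABC posterior spread means a more informative design.
utility = 1.0 / posterior.var()
print(f"posterior mean: {posterior.mean():.2f}, utility: {utility:.2f}")
```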
Abstract:
This paper investigates the effects of lane-changing on driver behavior by measuring (i) the induced transient behavior and (ii) the change in driver characteristics, i.e., changes in driver response time and minimum spacing. We find that the transition largely consists of a pre-insertion transition and a relaxation process. These two processes are different but can be reasonably captured with a single model. The findings also suggest that lane-changing induces a regressive effect on driver characteristics: a timid driver (characterized by a larger response time and minimum spacing) tends to become less timid, and an aggressive driver less aggressive. We offer an extension to Newell's car-following model to describe this regressive effect and verify it using vehicle trajectory data.
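For context, Newell's model lets the follower shadow the leader's trajectory shifted by a response time and a minimum spacing. The sketch below is a minimal, hypothetical illustration of that idea with an exponential relaxation of the two parameters after a lane change; it is not the paper's calibrated extension.

```python
import numpy as np

# Newell car-following: x_f(t + tau) = x_l(t) - d, i.e. the follower replicates the
# leader's trajectory shifted by response time tau (s) and minimum spacing d (m).
t = np.arange(0.0, 60.0, 0.1)
x_leader = 20.0 * t              # leader cruising at 20 m/s

# Hypothetical relaxation after a lane change: tau and d decay exponentially from
# perturbed values back toward the driver's new equilibrium values.
tau0, tau_eq = 2.5, 1.5          # s
d0, d_eq = 12.0, 8.0             # m
T_relax = 15.0                   # s, relaxation time scale
tau_t = tau_eq + (tau0 - tau_eq) * np.exp(-t / T_relax)
d_t = d_eq + (d0 - d_eq) * np.exp(-t / T_relax)

# Follower position: leader position evaluated at (t - tau_t), minus the spacing.
x_follower = np.interp(t - tau_t, t, x_leader) - d_t
print("final spacing (m):", round(x_leader[-1] - x_follower[-1], 1))
```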
Abstract:
Large-scale molecular dynamics simulations are performed to characterize the effects of pre-existing surface defects on the vibrational properties of Ag nanowires. It is found that the first-order natural frequency of the nanowire appears insensitive to different surface defects, indicating that the nanowire's Young's modulus is insensitive to defects. Meanwhile, an increase in the quality (Q)-factor is observed due to the presence of defects. In particular, a beat phenomenon, driven by a single actuation, is observed for the nanowire in the presence of a surface edge defect. It is concluded that different surface defects could act as an effective means to tune the vibrational properties of nanowires. This study sheds light on the understanding of nanowire mechanical performance in the presence of surface defects, which would benefit the development of nanowire-based devices.
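For intuition about the beat phenomenon and the Q-factor mentioned above (a purely illustrative signal-level sketch with made-up frequencies, not the MD simulation itself): a beat arises when a single actuation excites two closely spaced modes, and the Q-factor follows from the decay rate of the ringdown.

```python
import numpy as np

# Two closely spaced modes excited by one actuation produce a beat:
# x(t) = e^{-t/tau} [cos(2 pi f1 t) + cos(2 pi f2 t)], beat frequency = |f1 - f2|.
f1, f2 = 10.0e9, 10.4e9      # Hz, hypothetical nanowire mode frequencies
tau = 5.0e-9                 # s, hypothetical amplitude decay time of the ringdown
t = np.linspace(0.0, 20e-9, 20000)
x = np.exp(-t / tau) * (np.cos(2 * np.pi * f1 * t) + np.cos(2 * np.pi * f2 * t))

beat_frequency = abs(f1 - f2)
# For a lightly damped mode with amplitude decay e^{-t/tau}, Q = pi * f * tau.
Q = np.pi * f1 * tau
print(f"beat frequency: {beat_frequency / 1e9:.1f} GHz, Q-factor ~ {Q:.0f}")
```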