124 results for Statistical mixture-design optimization


Relevance:

30.00%

Publisher:

Abstract:

In this paper we consider the implementation of time- and energy-efficient trajectories on a test-bed autonomous underwater vehicle. The trajectories are closely connected to the results of applying the maximum principle to the controlled mechanical system. We use a numerical algorithm to compute efficient trajectories, designed using geometric control theory, that optimize a given cost function. Experimental results are shown for the time-minimization problem.
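
As a minimal illustration of the flavour of such computations (not the authors' algorithm or vehicle model): for a one-dimensional double integrator with bounded thrust, the maximum principle gives a bang-bang time-optimal control, whose switching time can be computed in closed form and checked by simulation. All quantities are illustrative.

```python
import numpy as np

# Sketch: for x'' = u with |u| <= u_max, the time-optimal rest-to-rest
# control is bang-bang: full thrust to the midpoint, then full braking.

def time_optimal_double_integrator(distance, u_max):
    """Return (total time, switching time) for a rest-to-rest maneuver."""
    t_switch = np.sqrt(distance / u_max)   # half the distance is covered accelerating
    return 2.0 * t_switch, t_switch

def control(t, t_switch, u_max):
    """Bang-bang control law: +u_max, then -u_max after the switch."""
    return u_max if t < t_switch else -u_max

if __name__ == "__main__":
    total, switch = time_optimal_double_integrator(distance=10.0, u_max=2.0)
    # Integrate the closed loop to confirm the vehicle arrives at rest.
    dt, x, v = 1e-4, 0.0, 0.0
    for k in range(int(total / dt)):
        u = control(k * dt, switch, 2.0)
        v += u * dt
        x += v * dt
    print(f"T* = {total:.3f} s, final x = {x:.3f}, final v = {v:.3f}")
```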

Relevance:

30.00%

Publisher:

Abstract:

The research objectives of this thesis were to contribute to Bayesian statistical methodology, specifically to risk assessment methodology and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. I hoped to consider two applied areas and use these applications as a springboard for developing new statistical methods, as well as undertaking analyses which might answer particular applied questions. This thesis therefore considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater, and a four-dimensional dataset assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of four-dimensional agricultural data.

The specific objectives of the research were: to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure that incorporates all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day's data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days' data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations. This work forms five papers, two of which have been published, two submitted, and the final paper still in draft.

The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as an 'errors-in-variables' problem [Fuller, 1987]. This illustrated a simple method for incorporating experimental error into risk assessments.

In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models are used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and unstructured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so would allow little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals. These models were fitted using WinBUGS [Lunn et al., 2000].

The work in the fifth paper deals with the fact that, for large datasets, the use of WinBUGS becomes problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that soil moisture for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with variances that increase with depth.

In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility; however, it does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
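
As a minimal sketch of the first objective, hypothetical and not the thesis's model: elicited uncertainty about the conditional probabilities of a toy two-node risk DAG (exposure and illness) is encoded as Beta distributions, and plain Monte Carlo (in place of the MCMC used in the thesis) propagates that uncertainty into a credible interval for the risk of interest.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # Monte Carlo samples

# Elicited uncertainty on each conditional probability (all hypothetical).
p_exposure = rng.beta(2, 18, n)         # exposure probability, around 10%
p_ill_given_exp = rng.beta(5, 45, n)    # P(illness | exposed), around 10%
p_ill_given_noexp = rng.beta(1, 99, n)  # P(illness | not exposed), around 1%

# Push the samples through the DAG: marginal probability of illness.
p_ill = p_exposure * p_ill_given_exp + (1 - p_exposure) * p_ill_given_noexp

lo, med, hi = np.percentile(p_ill, [2.5, 50, 97.5])
print(f"P(illness): median {med:.4f}, 95% credible interval ({lo:.4f}, {hi:.4f})")
```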

Relevance:

30.00%

Publisher:

Abstract:

Computer resource allocation represents a significant challenge, particularly for multiprocessor systems, which consist of shared computing resources to be allocated among co-runner processes and threads. While efficient resource allocation yields a highly efficient and stable overall multiprocessor system and good individual thread performance, ineffective resource allocation causes significant performance bottlenecks even on systems with abundant computing resources. This thesis proposes a cache-aware adaptive closed-loop scheduling framework as an efficient resource allocation strategy for this highly dynamic resource management problem, which requires instant estimation of highly uncertain and unpredictable resource patterns. Many different approaches to this problem have been developed, but neither the dynamic nature nor the time-varying and uncertain characteristics of the resource allocation problem are well considered. These approaches employ either static or dynamic optimization methods, or advanced scheduling algorithms such as the Proportional Fair (PFair) scheduling algorithm. Some of the approaches that do consider the dynamic nature of multiprocessor systems apply only a basic closed-loop system; hence, they fail to take the time-varying nature and uncertainty of the system into account. Further research into multiprocessor resource allocation is therefore required.

Our closed-loop cache-aware adaptive scheduling framework takes resource availability and resource usage patterns into account by measuring time-varying factors such as cache miss counts, stalls and instruction counts. More specifically, the cache usage pattern of a thread is identified using the QR recursive least squares (RLS) algorithm and cache miss count time-series statistics. For the identified cache resource dynamics, the framework enforces instruction fairness for the threads. Fairness, in the context of our research project, is defined as resource allocation equity, which reduces co-runner thread dependence in a shared resource environment. In this way, instruction count degradation due to shared cache resource conflicts is overcome.

In this respect, our closed-loop cache-aware adaptive scheduling framework contributes to the research field in two major and three minor aspects. The two major contributions lead to the cache-aware scheduling system. The first major contribution is the development of the execution fairness algorithm, which reduces the co-runner cache impact on thread performance. The second is the development of the relevant mathematical models, such as the thread execution pattern and cache access pattern models, which formulate the execution fairness algorithm in terms of mathematical quantities. Following the development of the cache-aware scheduling system, our adaptive self-tuning control framework is constructed to add an adaptive closed-loop aspect to it. This control framework consists of two main components: the parameter estimator and the controller design module. The first minor contribution is the development of the parameter estimators; the QR recursive least squares (RLS) algorithm is applied in our framework to estimate the highly uncertain and time-varying cache resource patterns of threads. The second minor contribution is the controller design module; the algebraic pole placement design algorithm is utilized to design a controller able to provide the desired time-varying control action. The adaptive self-tuning control framework and the cache-aware scheduling system together constitute our final framework, the closed-loop cache-aware adaptive scheduling framework. The third minor contribution is the validation of this framework's efficiency in overcoming co-runner cache dependency. Time-series statistical counters are developed for the M-Sim multi-core simulator, and the theoretical findings and mathematical formulations are implemented as MATLAB m-files. In this way, the overall framework is tested and the experimental outcomes are analyzed. From our experimental outcomes, we conclude that the framework successfully drives a co-runner cache-dependent thread's instruction count to the co-runner-independent instruction count, with an error margin of up to 25% when the cache is highly utilized. In addition, the thread cache access pattern is estimated with 75% accuracy.
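
The pattern-identification step can be sketched with a generic textbook recursive least squares estimator with a forgetting factor; the thesis uses a QR-factorized variant, so this is an assumption-laden stand-in, here fitting a toy AR(2) model of cache-miss counts.

```python
import numpy as np

class RLS:
    """Recursive least squares with forgetting factor lam.

    Textbook sketch (the thesis uses a QR-factorized variant):
    tracks theta in y_t ~ phi_t . theta from streaming data.
    """
    def __init__(self, dim, lam=0.98, delta=100.0):
        self.theta = np.zeros(dim)
        self.P = delta * np.eye(dim)   # large initial covariance
        self.lam = lam

    def update(self, phi, y):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)          # gain vector
        self.theta += k * (y - phi @ self.theta)    # innovation update
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return self.theta

# Toy usage: predict the next cache-miss count from the previous two.
rng = np.random.default_rng(1)
misses = rng.poisson(100, 500).astype(float)
est = RLS(dim=2)
for t in range(2, len(misses)):
    est.update(np.array([misses[t - 1], misses[t - 2]]), misses[t])
print("AR(2) coefficients:", est.theta)
```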

Relevance:

30.00%

Publisher:

Abstract:

Under the design-build (DB) procurement arrangement, the Request for Proposal (RFP) is the document in which an owner develops requirements and conveys the project scope to DB contractors. Owners should provide an appropriate level of design in DB RFPs to adequately describe their requirements without compromising the prospects for innovation. This paper examines and compares the different levels of owner-provided design in DB RFPs through a content analysis of 84 RFPs for public DB projects advertised between 2000 and 2010, with an aggregate contract value of over $5.4 billion. A statistical analysis was also conducted to explore the relationship between the proportion of owner-provided design and other project information, including project type, advertisement time, project size, contractor selection method, procurement process and contract type. The results show that in the majority (64.8%) of the RFPs, owner-provided design covers less than 10% of the design. The owner-provided design proportion has a significant association with project type, project size, contractor selection method and contract type. In addition, owners have generally provided less design in recent years than before. The findings also give owners perspectives from which to determine the appropriate level of owner-provided design in DB RFPs.
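
The kind of association analysis reported above can be illustrated with a chi-square test of independence on a hypothetical contingency table (not the paper's data), here cross-tabulating owner-provided design level against project size:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of RFPs by design level and project size,
# purely to illustrate the style of test; rows: <10% vs >=10%
# owner-provided design, columns: small vs large projects.
table = [[30, 12],
         [15, 27]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```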

Relevance:

30.00%

Publisher:

Abstract:

Since users have become the focus of product and service design in the last decade, the term User eXperience (UX) has been used frequently in the field of Human-Computer Interaction (HCI). Research on UX facilitates a better understanding of the various aspects of the user's interaction with the product or service. Mobile video, as a new and promising service and research field, has attracted great attention. Owing to the significance of UX in the success of mobile video (Jordan, 2002), many researchers have focused on this area, examining users' expectations, motivations, requirements, and usage context. As a result, many influencing factors have been explored (Buchinger, Kriglstein, Brandt & Hlavacs, 2011; Buchinger, Kriglstein & Hlavacs, 2009). However, a general framework for the mobile video service is lacking to structure such a great number of factors.

To measure the user experience of multimedia services such as mobile video, quality of experience (QoE) has recently become a prominent concept. In contrast to the traditional concept of quality of service (QoS), QoE involves not only objectively measuring the delivered service but also taking into account the user's needs and desires when using the service, emphasizing the user's overall acceptance of the service. Many QoE metrics can estimate the user-perceived quality or acceptability of mobile video, but they may not be accurate enough for overall UX prediction, owing to the complexity of UX. Only a few QoE frameworks have addressed broader aspects of UX for mobile multimedia applications, and these still need to be transformed into practical measures. The challenge of optimizing UX remains: adapting to resource constraints (e.g., network conditions, mobile device capabilities, and heterogeneous usage contexts) while meeting complicated user requirements (e.g., usage purposes and personal preferences).

In this chapter, we investigate the existing important UX frameworks, compare their similarities, and discuss some important features that fit the mobile video service. Based on previous research, we propose a simple UX framework for mobile video applications by mapping a variety of influencing factors of UX onto a typical mobile video delivery system. Each component and its factors are explored with comprehensive literature reviews. The proposed framework may benefit the user-centred design of mobile video by taking complete consideration of UX influences, and may help improve mobile video service quality by adjusting the values of certain factors to produce a positive user experience. It may also facilitate related research by locating important issues to study, clarifying research scopes, and setting up proper study procedures. We then review a great deal of research on UX measurement, including QoE metrics and QoE frameworks for mobile multimedia. Finally, we discuss how to achieve an optimal quality of user experience by focusing on various aspects of the UX of mobile video. In the conclusion, we suggest some open issues for future study.

Relevance:

30.00%

Publisher:

Abstract:

Although the design-build (DB) system has been demonstrated to be an effective delivery method and has gained popularity worldwide, it has not gained the same popularity in the construction market of China. The objective of this study was, therefore, to investigate the barriers to entry into the DB market. A total of 22 entry barriers were first identified through an open-ended questionnaire survey of 15 top construction professionals in the construction market of China. A broader questionnaire survey was then conducted to prioritize these entry barriers. Statistical analysis of the responses shows that the most dominant barriers to entry into the DB market are lack of design expertise, lack of interest from owners, lack of a suitable organization structure, lack of DB specialists, and lack of a credit record system. Analysis of variance indicates that, at the 5% significance level, there is no difference of opinion on most of the barriers among the respondent groups of academia, government departments, state-owned companies, and private companies. Finally, the underlying dimensions of the barriers to entry into the DB market were investigated through factor analysis. The results indicate six major underlying dimensions of entry barriers in the DB market: the competence of design-builders, difficulty in project procurement, characteristics of DB projects, lack of support from the public sector, the competence of DB owners, and the immaturity of the DB market. These findings are useful for both potential and incumbent design-builders in understanding and analyzing the DB market in China.
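
The factor-analysis step can be sketched on simulated data, since the survey responses are not public; the sketch assumes 22 barrier ratings driven by six latent dimensions, matching the reported structure.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate 200 respondents rating 22 barriers, with the ratings
# generated from 6 latent dimensions plus noise (all hypothetical).
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 6))
loadings = rng.normal(size=(6, 22))
ratings = latent @ loadings + 0.5 * rng.normal(size=(200, 22))

# Fit a 6-factor model and inspect the loadings of each barrier.
fa = FactorAnalysis(n_components=6, random_state=0).fit(ratings)
print("loadings shape:", fa.components_.shape)   # (6 factors, 22 barriers)
```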

Relevance:

30.00%

Publisher:

Abstract:

Design for Manufacturing (DFM) is an integral methodology in product development, starting from the concept development phase, with the aim of improving manufacturing productivity. It is used to reduce manufacturing costs in complex production environments while maintaining product quality. While Design for Assembly (DFA) focuses on eliminating parts or combining them with other components, which in most cases means performing a function and a manufacturing operation in a simpler way, DFM follows a more holistic approach. Common considerations for DFM are standard components, manufacturing tool inventory and capability, material compatibility with the production process, part handling, logistics, tool wear and process optimization, quality control complexity, and Poka-Yoke design. During DFM, the considerable background work required in the conceptual phase is compensated for by a shortening of later development phases. Current DFM projects normally apply an iterative step-by-step approach and are eventually transferred to the development team. This study introduces a new, knowledge-based approach to DFM, eliminating some of its steps, and shows the implications for the work process. Furthermore, a concurrent engineering process via a transparent interface between the manufacturing engineering and product development systems is put forward.

Relevance:

30.00%

Publisher:

Abstract:

Here we present a sequential Monte Carlo (SMC) algorithm that can be used for any one-at-a-time Bayesian sequential design problem in the presence of model uncertainty where discrete data are encountered. Our focus is on adaptive design for model discrimination, but the methodology is applicable to other design objectives such as parameter estimation or prediction. An SMC algorithm is run in parallel for each model, and the algorithm relies on a convenient estimator of the evidence of each model, which is essentially a function of importance sampling weights. Other methods for this task, such as the quadrature approaches often used in design, suffer from the curse of dimensionality. Approximating posterior model probabilities in this way allows us to use model discrimination utility functions derived from information theory that were previously difficult to compute except for conjugate models. A major benefit of the algorithm is that it requires very little problem-specific tuning. We demonstrate the methodology on three applications, including discriminating between models for the decline in motor neuron numbers in patients suffering from neurological diseases such as motor neuron disease.
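
The evidence estimator can be illustrated in a stripped-down form: particles are drawn from each model's prior and their importance weights are updated one observation at a time, the running evidence being the mean weight. The full algorithm also resamples and moves particles; this degenerate sketch, with hypothetical Bernoulli data and two toy models, shows only the weight-based evidence idea.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 10_000                                 # particles per model
y = rng.binomial(1, 0.8, size=30)          # hypothetical binary observations

# Model 1: theta ~ Beta(1, 1). Model 2: theta fixed at 0.5.
theta1 = rng.beta(1, 1, N)                 # particles from Model 1's prior
w1 = np.ones(N)
for obs in y:
    # Sequentially multiply in each observation's likelihood.
    w1 *= theta1**obs * (1 - theta1)**(1 - obs)
Z1 = w1.mean()                             # evidence estimate for Model 1
Z2 = 0.5**len(y)                           # exact evidence for Model 2

post1 = Z1 / (Z1 + Z2)                     # posterior prob., equal prior odds
print(f"Z1 = {Z1:.3e}, Z2 = {Z2:.3e}, P(M1 | y) = {post1:.3f}")
```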

Relevance:

30.00%

Publisher:

Abstract:

A favourable scaffold for bone tissue engineering should have desirable characteristics, such as adequate mechanical strength and three-dimensional open porosity, which guarantee a suitable environment for tissue regeneration. The design of complex structures such as bone scaffolds is a challenge for investigators, and one of the aims is to achieve the best possible ratio of mechanical strength to degradation rate. In this paper we use numerical modelling to evaluate material properties for designing bone tissue engineering scaffolds fabricated via the fused deposition modelling technique. For our studies the standard genetic algorithm was used, an efficient method of discrete optimization. For the fused deposition modelling scaffold, each individual strut is scrutinized for its role in the architecture, the structural support it provides, and its contribution to the overall scaffold. The goal of the study was to create a numerical tool that could help to achieve the desired behaviour of tissue-engineered scaffolds, and our results showed that this could be done efficiently by using different materials for individual struts. To represent the many ways in which the loss of scaffold mechanical function could proceed, an exemplary set of desirable scaffold stiffness-loss functions was chosen. © 2012 John Wiley & Sons, Ltd.
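
A toy genetic algorithm in the same spirit, with an invented analytic surrogate in place of the paper's structural evaluation: each chromosome assigns one of a few hypothetical materials to each strut, and fitness is the mismatch between a crude degradation-discounted stiffness proxy and a target stiffness-loss curve.

```python
import numpy as np

rng = np.random.default_rng(4)
N_STRUTS, POP, GENS = 40, 60, 200
MATERIALS = np.array([1.0, 2.5, 4.0])    # hypothetical strut stiffnesses
DEGRADE = np.array([0.2, 0.5, 0.9])      # hypothetical degradation rates

# Target: overall stiffness proxy should decay linearly over time.
t = np.linspace(0, 1, 20)
target = 100.0 * (1 - 0.5 * t)

def fitness(chrom):
    """Negative squared error between the stiffness proxy and the target."""
    stiff = MATERIALS[chrom]
    curve = np.array([np.sum(stiff * np.exp(-DEGRADE[chrom] * tt)) for tt in t])
    return -np.sum((curve - target) ** 2)

pop = rng.integers(0, len(MATERIALS), size=(POP, N_STRUTS))
for _ in range(GENS):
    scores = np.array([fitness(c) for c in pop])
    elite = pop[np.argsort(scores)[::-1][:POP // 2]]   # truncation selection
    cut = rng.integers(1, N_STRUTS, POP // 2)          # one-point crossover
    kids = np.array([np.r_[elite[i % len(elite)][:c],
                           elite[(i + 1) % len(elite)][c:]]
                     for i, c in enumerate(cut)])
    mut = rng.random(kids.shape) < 0.02                # random mutation
    kids[mut] = rng.integers(0, len(MATERIALS), mut.sum())
    pop = np.vstack([elite, kids])
print("best fitness:", max(fitness(c) for c in pop))
```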

Relevance:

30.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization, and even for government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous-improvement philosophy, and they are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses.

This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; a data-capturing algorithm using Bayesian decision-making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle.

Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time.

Meanwhile, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause efforts in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. The approach yields highly informative estimates for change-point parameters, since the results are based on full probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated.

The benefits of change-point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes detected by control charts, in monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change-point estimators is then extended to healthcare surveillance of processes in which the pre-intervention characteristics of patients affect the outcomes. In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model.

The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change-point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in this general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
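
The step-change scenario above can be sketched with a small Gibbs sampler for a single change point in a Poisson rate; the data, priors and parameter values are illustrative, not the thesis's hospital data.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data: y_t ~ Poisson(lam1) before the change point tau, Poisson(lam2) after.
y = np.r_[rng.poisson(4, 60), rng.poisson(9, 40)]   # true tau = 60
n = len(y)
a, b = 1.0, 0.2                                     # Gamma(a, b) priors on rates
cum = np.cumsum(y)

tau, draws = n // 2, []
for it in range(5000):
    # Conjugate Gamma full conditionals for the two segment rates.
    lam1 = rng.gamma(a + y[:tau].sum(), 1 / (b + tau))
    lam2 = rng.gamma(a + y[tau:].sum(), 1 / (b + n - tau))
    # Full conditional of tau over all candidate change points.
    ts = np.arange(1, n)
    loglik = (cum[ts - 1] * np.log(lam1) - ts * lam1
              + (cum[-1] - cum[ts - 1]) * np.log(lam2) - (n - ts) * lam2)
    p = np.exp(loglik - loglik.max())
    tau = rng.choice(ts, p=p / p.sum())
    if it > 1000:                                   # discard burn-in
        draws.append(tau)
print("posterior mode of tau:", np.bincount(draws).argmax())
```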

Relevance:

30.00%

Publisher:

Abstract:

The effect of resource management on the building design process directly influences the development cycle time and the success of construction projects. This paper presents the information constraint net (ICN) to represent the complex information constraint relations among the design activities involved in the building design process. An algorithm is developed to transform the information constraints of the ICN into a Petri net model. A resource management model is built on the ICN to simulate and optimize resource allocation in the design process. An example from detailed structural design is provided to validate the proposed model through simulation analysis on the CPN Tools platform. The results demonstrate that the proposed approach can deliver the resource management and optimization needed to shorten the development cycle and allocate resources optimally.
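
A toy Petri-net executor conveys the mechanics that CPN Tools simulates at scale: places hold tokens, and a transition fires only when all of its input places are sufficiently marked, which is how a shared resource (here a single designer) constrains concurrent design tasks. All names and structure are invented for illustration.

```python
from collections import Counter

# Marking: tokens per place. Two tasks wait; one designer is available.
marking = Counter({"task_ready": 2, "designer": 1})
# Each transition: (tokens consumed from input places, tokens produced).
transitions = {
    "start_task": ({"task_ready": 1, "designer": 1}, {"task_running": 1}),
    "finish_task": ({"task_running": 1}, {"task_done": 1, "designer": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= k for p, k in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, k in pre.items():
        marking[p] -= k
    for p, k in post.items():
        marking[p] += k

# Fire greedily until no transition is enabled (both tasks complete).
while any(enabled(name) for name in transitions):
    fire(next(name for name in transitions if enabled(name)))
print(dict(marking))
```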

Relevance:

30.00%

Publisher:

Abstract:

Distributed Genetic Algorithms (DGAs) designed for the Internet have to take its high communication cost into consideration. For island-model GAs, the migration topology has a major impact on DGA performance. This paper describes and evaluates an adaptive migration topology optimizer that keeps the communication load low while maintaining high solution quality. Experiments on benchmark problems show that the optimized topology outperforms static or random topologies of the same degree of connectivity. The applicability of the method to real-world problems is demonstrated on a hard optimization problem in VLSI design.
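
A simplified stand-in for the idea (not the paper's optimizer): an island-model GA on OneMax with a sparse migration topology that is periodically rewired toward the currently fittest island, so connectivity, and hence communication load, stays fixed at degree K while migrants flow to where they help. All parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(6)
ISLANDS, POP, BITS, K = 8, 30, 64, 2

pops = rng.integers(0, 2, (ISLANDS, POP, BITS))
# Each island keeps K outgoing migration links (degree stays fixed).
links = [list(rng.choice([j for j in range(ISLANDS) if j != i], K, replace=False))
         for i in range(ISLANDS)]

def step(pop):
    """One generation: binary tournament, one-point crossover, bit-flip mutation."""
    fit = pop.sum(axis=1)
    a, b = rng.integers(0, POP, (2, POP))
    parents = pop[np.where(fit[a] > fit[b], a, b)]
    cut = rng.integers(1, BITS, POP)
    kids = np.array([np.r_[parents[i][:c], parents[(i + 1) % POP][c:]]
                     for i, c in enumerate(cut)])
    kids ^= (rng.random(kids.shape) < 1.0 / BITS).astype(int)
    return kids

for gen in range(100):
    pops = np.array([step(p) for p in pops])
    if gen % 10 == 0:                        # migration epoch
        for i in range(ISLANDS):
            for j in links[i]:
                best = pops[i][pops[i].sum(axis=1).argmax()]
                worst = pops[j].sum(axis=1).argmin()
                pops[j][worst] = best        # migrant replaces worst resident
        # Adapt topology: redirect one link per island toward the champion.
        champ = int(np.array([p.sum(axis=1).max() for p in pops]).argmax())
        for i in range(ISLANDS):
            if i != champ and champ not in links[i]:
                links[i][rng.integers(K)] = champ

print("best OneMax fitness:", max(p.sum(axis=1).max() for p in pops))
```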

Relevance:

30.00%

Publisher:

Abstract:

Vietnam has a unique culture, which is revealed in the way people have built and designed their traditional housing. Vietnamese dwellings reflect occupants' activities in their everyday lives while adapting to tropical climatic conditions shaped by seasonal monsoons. These characteristics of Vietnamese dwellings are said to have remained largely unchanged until the economic reform of 1986, when Vietnam experienced accelerated development based on a market-oriented economy. New housing types, including modern shop-houses, detached houses, and apartments, have been designed in many places, especially to satisfy dwellers' new lifestyles in Vietnamese cities. Contemporary housing, mostly designed by architects, has reflected rules of spatial organisation that allow occupants' social activities to be carried out. However, contemporary housing spaces seem unsustainable in relation to socio-cultural values because they have been influenced by a globalism that advocates homogeneous spatial patterns, modern technologies, materials and construction methods.

This study investigates the rules of spaces in Vietnamese houses built before and after the reform to define the socio-cultural implications for Vietnamese housing design. Firstly, it describes occupants' views of their current dwellings in terms of indoor comfort conditions and social activities in spaces. Then, it examines the use of spaces in pre-reform Vietnamese housing through occupants' activities and material applications. Finally, it discusses the organisation of spaces in both pre- and post-reform housing to understand how Vietnamese housing has been designed for occupants to live, act, work, and conduct traditional activities. Understanding spatial organisation is a way to identify characteristics of the occupants' lived spaces created from the conceived space designed by designers. The characteristics of the housing spaces will inform designers of ways to design future Vietnamese housing in response to cultural contexts.

The study applied an abductive approach to the investigation of housing spaces. It used a conceptual framework based on Henri Lefebvre's (1991) theory to understand space as the main factor constituting the language of design, and the principles of semiotics to examine spatial structure in housing as a language used in everyday life. The study involved a door-knocking survey of 350 households in four regional cities of Vietnam to interpret occupancy conditions and levels of occupants' comfort; a statistical analysis was applied to interpret the survey data. The study also required a process of data selection and the collection of fourteen cases of housing in the three main climatic regions of the country for analysing spatial organisation and housing characteristics.

The study found that there has been a shift in the relationship of spaces from pre- to post-reform Vietnamese housing. It also identified that the space for guest welcoming and family activity has been the central space of Vietnamese housing. Based on the relationships of the central space with the others, theoretical models were proposed for three types of contemporary Vietnamese housing. The models will be significant for adapting housing design to Vietnamese conditions and achieving socio-environmental characteristics, because they were developed from the occupants' requirements for their social activities. Another contribution of the study is the use of methodological concepts to understand the language of living spaces. Further work will be needed to test future Vietnamese housing designs based on applications of the models.

Relevance:

30.00%

Publisher:

Abstract:

The selection of optimal camera configurations (camera locations, orientations, etc.) for multi-camera networks remains an unsolved problem. Previous approaches largely focus on proposing various objective functions to achieve different tasks; most of them, however, do not generalize well to large-scale networks. To tackle this, we introduce a statistical formulation of the optimal selection of camera configurations and propose a Trans-Dimensional Simulated Annealing (TDSA) algorithm to solve the problem effectively. We compare our approach with a state-of-the-art method based on Binary Integer Programming (BIP) and show that our approach offers similar performance on small-scale problems. However, we also demonstrate the capability of our approach in dealing with large-scale problems and show that it produces better results than two alternative heuristics designed to deal with the scalability issue of BIP.
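
The annealing skeleton can be sketched in a fixed-dimension form (the paper's TDSA additionally moves across dimensions, i.e. numbers of cameras): select three of twenty hypothetical candidate poses to maximise the number of sample points covered.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical scene: 200 points to observe, 20 candidate camera positions,
# each camera covering a disc of fixed radius (a crude visibility model).
points = rng.random((200, 2))
cands = rng.random((20, 2))
RADIUS, N_CAMS = 0.35, 3

def coverage(sel):
    """Number of points within RADIUS of at least one selected camera."""
    d = np.linalg.norm(points[:, None, :] - cands[sel][None, :, :], axis=2)
    return (d.min(axis=1) < RADIUS).sum()

sel, T = list(rng.choice(20, N_CAMS, replace=False)), 5.0
for it in range(2000):
    prop = sel.copy()
    prop[rng.integers(N_CAMS)] = rng.integers(20)   # move one camera
    if len(set(prop)) < N_CAMS:                     # reject duplicate poses
        continue
    gain = coverage(prop) - coverage(sel)
    if gain > 0 or rng.random() < np.exp(gain / T): # Metropolis acceptance
        sel = prop
    T *= 0.999                                      # geometric cooling
print("covered points:", coverage(sel), "of", len(points))
```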