636 results for Minimal models


Relevance:

20.00%

Publisher:

Abstract:

1. Description of the Work

The Fleet Store was devised as a creative output to establish an exhibition linked to a fashion business model in which emerging designers were encouraged to research new and innovative strategies for creating design-driven and commercial collections for a public consumer. The project was devised to break down the perception among emerging fashion designers that designing commercial collections linked to a sustainable business model is a boring and unnecessary process. The focus was to demystify the business of fashion and to link its importance to a design-driven, public outcome that is more familiar to fashion designers. The criterion for participation was that all designers had to be registered as a business with the Australian Taxation Office. Designers were chosen from the Creative Enterprise Australia Fashion Business Incubator, the QUT fashion graduate alumni, and current QUT fashion design and double degree (fashion and business) students with existing businesses. The project evolved from a series of collaborative workshops where designers were introduced to new and innovative creative industries’ business models and the processes, costings and timings involved in creating a niche, sustainable business for a public exhibition of design-driven commercial collections. All designers initiated their own business infrastructure but were then introduced to the concept of collaboration for successful and profitable exhibition and business outcomes. Collaborative strategies such as crowdfunding, crowdsourcing, peer-to-peer mentoring and manufacturing were researched, and strategies for the establishment of the retail exhibition were devised in a collaborative environment. All participants also took on roles outside their ‘designer’ background to create a retail exhibition that was creative but also had critical mass and aesthetic appeal for the consumer. The Fleet Store ‘popped up’ for two weeks (10 days) in a heritage-listed building in an inner-city location. Passers-by were important, but the main consumer base was enlisted through interest and investment generated by crowdsourcing, crowdfunding, ethical marketing, corporate social responsibility projects, and collaborative public relations and social media strategies. The research has furthered discussion of innovative strategies for emerging fashion designers to initiate and maintain sustainable businesses, and suggests that collaboration combined with a design-driven and business focus can create a sustainable and economically viable retail exhibition.

2. Research Statement

Research Background: The research field involved developing a new ethical, design-driven, collaborative and sustainable model for fashion design practice and management. The research asked whether a public, design-driven, collaborative retail exhibition can create a platform for promoting creative, innovative and sustainable business models for emerging fashion designers. The methodology was primarily practice-led, as all participants were designers in their own right and the project manager acted as a mentor and curator to guide the process and analyse the potential of the research question. The Fleet Store offers new knowledge in design practice and management, with the creation of a model where design outcomes and business models are inextricably linked to the success of the creative output. Key innovations include extending the commercialisation of emerging fashion businesses by creating a curated retail gallery for collaborative and sustainable strategies to support niche fashion designer labels. This has contributed to a broader conversation on how to nurture and sustain competitive Australian fashion designers and labels.

Research Contribution and Significance: The Fleet Store has contributed to a growing body of research into innovative and sustainable business models for niche fashion and creative industries’ practitioners. All participants have maintained their business infrastructure and many are currently growing their businesses using the strategies tested for the Fleet Store. The exhibition space was visited by over 1,000 people and sales of $27,000 were made in the 10 days of opening. (Follow-up sales of $3,000 have also been reported.) Three of the designers were ‘discovered’ from the exhibition and have received substantial orders from high-profile national buyers and retailers for next-season delivery. Several participants have since collaborated to create other pop-up retail environments and are now mentoring other emerging designers on the significance of a collaborative retail exhibition for consolidating niche business models for emerging fashion designers.

Relevance:

20.00%

Publisher:

Abstract:

Local spatio-temporal features combined with a bag-of-visual-words model form a popular approach to human action recognition. Bag-of-features methods face several challenges, such as extracting appropriate appearance and motion features from videos, converting the extracted features into a form suitable for classification, and designing a suitable classification framework. In this paper we address the problem of efficiently representing the extracted features for classification in order to improve overall performance. We introduce two generative supervised topic models, maximum entropy discrimination LDA (MedLDA) and class-specific simplex LDA (css-LDA), to encode the raw features in a form suitable for discriminative SVM-based classification. Unsupervised LDA models disconnect topic discovery from the classification task and hence yield poor results compared to the baseline bag-of-words framework. Supervised LDA techniques, on the other hand, learn the topic structure by considering the class labels and improve recognition accuracy significantly. MedLDA maximizes the likelihood and within-class margins using max-margin techniques and yields a sparse, highly discriminative topic structure, while css-LDA learns separate class-specific topics instead of a common set of topics across the entire dataset. In our representation, topics are first learned and each video is then represented as a topic proportion vector, comparable to a histogram of topics. SVM classification is finally performed on the learned topic proportion vectors. We demonstrate the efficiency of these two representation techniques through experiments on two popular datasets. The results show significantly improved performance compared to the baseline bag-of-features framework, which uses k-means to construct histograms of words from the feature vectors.
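A minimal sketch of the topic-proportion pipeline described above, using scikit-learn's unsupervised LatentDirichletAllocation as a stand-in for the supervised MedLDA/css-LDA models (which the paper uses but which are not available in scikit-learn); the visual-word counts and labels are random placeholders.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: each video is a histogram of quantised spatio-temporal
# features (bag-of-visual-words counts); labels are action classes.
n_videos, vocab_size, n_topics = 200, 500, 20
X_counts = rng.poisson(lam=2.0, size=(n_videos, vocab_size))
y = rng.integers(0, 5, size=n_videos)

# Step 1: learn a topic model over the visual-word counts.
# (The paper uses supervised MedLDA / css-LDA; plain LDA is used here only
# to illustrate the pipeline shape.)
lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
theta = lda.fit_transform(X_counts)   # topic-proportion vector per video

# Step 2: classify the topic-proportion vectors with an SVM.
clf = SVC(kernel="rbf").fit(theta, y)
print("training accuracy:", clf.score(theta, y))
```

The point is only the shape of the pipeline: counts of quantised features go in, per-video topic proportions come out, and the SVM is trained on those proportions rather than on the raw word histograms.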

Relevance:

20.00%

Publisher:

Abstract:

Railway capacity determination and expansion are very important topics. In prior research, however, the competition between different entities, such as train services and train types on different network corridors, has been ignored, poorly modelled, or assumed to be static. In response, a comprehensive set of multi-objective models has been formulated in this article to perform a trade-off analysis. These models determine the total absolute capacity of railway networks as the most equitable solution according to a clearly defined set of competing objectives. The models also perform a sensitivity analysis of capacity with respect to those competing objectives. The models have been extensively tested on a case study and their significant worth is shown. The models were solved using a variety of techniques; an adaptive ε-constraint method proved the most effective. In order to identify only the best solution, a simulated annealing meta-heuristic was implemented and tested. A linearization technique based upon separable programming was also developed and shown to be superior in terms of solution quality, though far less so in terms of computational time.
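A minimal sketch of the ε-constraint idea referred to above, on a made-up two-objective toy (two train types sharing one corridor under a track-occupation budget); this is not the article's railway capacity model, and the coefficients are illustrative only.

```python
import numpy as np
from scipy.optimize import linprog

# Toy bi-objective trade-off: x = (passenger trains/day, freight trains/day),
# f1 = x[0] and f2 = x[1] are both to be maximised under a shared
# track-occupation budget. Coefficients are hypothetical.
occupation = [1.0, 1.6]          # track-time consumed per train
budget = 100.0

pareto = []
for eps in np.linspace(0.0, budget / occupation[1], 11):
    # epsilon-constraint step: maximise f1 subject to f2 >= eps.
    # linprog minimises, so the objective is c = [-1, 0] and the constraint
    # f2 >= eps is written as -x[1] <= -eps.
    res = linprog(c=[-1.0, 0.0],
                  A_ub=[occupation, [0.0, -1.0]],
                  b_ub=[budget, -eps],
                  bounds=[(0, None), (0, None)])
    if res.success:
        pareto.append((res.x[0], res.x[1]))

for f1, f2 in pareto:
    print(f"passenger {f1:6.1f}  freight {f2:6.1f}")
```

Sweeping ε traces the trade-off curve between the two throughputs; an adaptive ε-constraint method, as mentioned in the abstract, would choose the ε values adaptively rather than from a fixed grid.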

Relevance:

20.00%

Publisher:

Abstract:

Traditional sensitivity and elasticity analyses of matrix population models have been used to inform management decisions, but they ignore the economic costs of manipulating vital rates. For example, the growth rate of a population is often most sensitive to changes in adult survival rate, but this does not mean that increasing that rate is the best option for managing the population because it may be much more expensive than other options. To explore how managers should optimize their manipulation of vital rates, we incorporated the cost of changing those rates into matrix population models. We derived analytic expressions for locations in parameter space where managers should shift between management of fecundity and survival, for the balance between fecundity and survival management at those boundaries, and for the allocation of management resources to sustain that optimal balance. For simple matrices, the optimal budget allocation can often be expressed as simple functions of vital rates and the relative costs of changing them. We applied our method to management of the Helmeted Honeyeater (Lichenostomus melanops cassidix; an endangered Australian bird) and the koala (Phascolarctos cinereus) as examples. Our method showed that cost-efficient management of the Helmeted Honeyeater should focus on increasing fecundity via nest protection, whereas optimal koala management should focus on manipulating both fecundity and survival simultaneously. These findings are contrary to the cost-negligent recommendations of elasticity analysis, which would suggest focusing on managing survival in both cases. A further investigation of Helmeted Honeyeater management options, based on an individual-based model incorporating density dependence, spatial structure, and environmental stochasticity, confirmed that fecundity management was the most cost-effective strategy. Our results demonstrate that decisions that ignore economic factors will reduce management efficiency. ©2006 Society for Conservation Biology.
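A small illustration, not the paper's analysis, of folding cost into a sensitivity calculation: a hypothetical two-stage matrix model, the standard eigenvector-based sensitivities of the growth rate, and a comparison of sensitivity per dollar under made-up management costs.

```python
import numpy as np

# Hypothetical 2-stage (juvenile, adult) matrix model; the vital rates and
# management costs below are illustrative, not the honeyeater or koala values.
f, s_j, s_a = 1.2, 0.3, 0.8          # fecundity, juvenile and adult survival
A = np.array([[0.0, f],
              [s_j, s_a]])

# Dominant eigenvalue (growth rate) and its sensitivities:
# S[i, j] = d(lambda)/d(a_ij) = v_i * w_j / <v, w>.
eigvals, W = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam = eigvals[k].real
w = np.abs(W[:, k].real)                                # stable stage structure
eigvals_t, V = np.linalg.eig(A.T)
v = np.abs(V[:, np.argmax(eigvals_t.real)].real)        # reproductive values
S = np.outer(v, w) / (v @ w)

# Cost-aware comparison: change in lambda per dollar, under made-up unit costs.
cost_fecundity, cost_adult_survival = 50.0, 200.0
print("lambda =", round(lam, 3))
print("d(lambda)/$ via fecundity:     ", S[0, 1] / cost_fecundity)
print("d(lambda)/$ via adult survival:", S[1, 1] / cost_adult_survival)
```

With cheap fecundity management and expensive survival management, the cost-aware ranking can reverse the ranking given by raw sensitivities or elasticities, which is the paper's central point.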

Relevance:

20.00%

Publisher:

Abstract:

The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management actions in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision-making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives: a pure management objective, a pure learning objective, and an objective that is a weighted mixture of the two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision-making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision-making tools can be improved. © 2010 Elsevier Ltd.
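One way to make the weighted mixture of management and learning objectives concrete is sketched below, under toy assumptions: two competing models of how growth responds to two candidate actions, a normal observation model, and a learning objective measured as the expected drop in belief entropy. None of these numbers or modelling choices come from the paper, and the relative weight on learning is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two competing models of how population growth responds to two candidate
# actions (A, B). All numbers here are hypothetical placeholders.
growth = {"model_1": np.array([0.05, 0.02]),
          "model_2": np.array([0.01, 0.04])}
obs_sd = 0.02                       # noise on the monitored growth rate
belief = {"model_1": 0.5, "model_2": 0.5}
w_learning = 0.5                    # arbitrary weight on the learning objective

def entropy(p):
    q = np.clip(np.array(list(p.values())), 1e-12, 1.0)
    return float(-(q * np.log(q)).sum())

def expected_management_value(a):
    # Pure management objective: expected growth under current model beliefs.
    return sum(p * growth[m][a] for m, p in belief.items())

def expected_info_gain(a, n_samples=200):
    # Pure learning objective: expected drop in belief entropy after observing
    # the (noisy) outcome of action a, averaged over which model is true.
    total = 0.0
    for m, p_m in belief.items():
        drops = []
        for _ in range(n_samples):
            y = rng.normal(growth[m][a], obs_sd)          # simulated monitoring
            like = {k: np.exp(-0.5 * ((y - growth[k][a]) / obs_sd) ** 2)
                    for k in belief}
            z = sum(belief[k] * like[k] for k in belief)
            post = {k: belief[k] * like[k] / z for k in belief}
            drops.append(entropy(belief) - entropy(post))
        total += p_m * np.mean(drops)
    return total

# Weighted mixture of the two objectives, evaluated for each action.
scores = [expected_management_value(a) + w_learning * expected_info_gain(a)
          for a in (0, 1)]
print("chosen action:", "AB"[int(np.argmax(scores))], "scores:", scores)
```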

Relevance:

20.00%

Publisher:

Abstract:

This thesis focused upon the development of improved capacity analysis and capacity planning techniques for railways. A number of innovations were made and tested on a case study of a real national railway. These techniques can reduce the time required for the decision-making activities that planners and managers need to perform. As all railways need to be expanded to meet increasing demand, the presumption that analytical capacity models can be used to identify how best to improve an existing network at least cost was fully investigated. Track duplication was the mechanism used to expand a network's capacity, and two variant capacity expansion models were formulated. Another outcome of this thesis is the development and validation of bi-objective models for capacity analysis. These models regulate the competition for track access and perform a trade-off analysis. An opportunity to develop more general multi-objective approaches was identified.
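As a toy illustration of the capacity expansion question (which sections to duplicate under a budget), the sketch below brute-forces a selection of single-track sections; the section names, capacity gains and costs are invented, and treating the gains as additive is a simplification that the thesis's analytical capacity models would not make.

```python
from itertools import combinations

# Hypothetical corridor of single-track sections: (name, capacity gain in
# trains/day if duplicated, duplication cost in $M). Numbers are illustrative.
sections = [("S1", 14, 30), ("S2", 9, 12), ("S3", 20, 45),
            ("S4", 6, 8), ("S5", 11, 20)]
budget = 60  # $M

best_gain, best_plan = 0, ()
for r in range(len(sections) + 1):
    for plan in combinations(sections, r):
        cost = sum(c for _, _, c in plan)
        gain = sum(g for _, g, _ in plan)
        if cost <= budget and gain > best_gain:
            best_gain, best_plan = gain, plan

print("duplicate:", [name for name, _, _ in best_plan],
      "extra capacity:", best_gain, "trains/day")
```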

Relevance:

20.00%

Publisher:

Abstract:

Wound healing and tumour growth involve collective cell spreading, which is driven by individual motility and proliferation events within a population of cells. Mathematical models are often used to interpret experimental data and to estimate the parameters so that predictions can be made. Existing methods for parameter estimation typically assume that these parameters are constants and often ignore any uncertainty in the estimated values. We use approximate Bayesian computation (ABC) to estimate the cell diffusivity, D, and the cell proliferation rate, λ, from a discrete model of collective cell spreading, and we quantify the uncertainty associated with these estimates using Bayesian inference. We use a detailed experimental data set describing the collective cell spreading of 3T3 fibroblast cells. The ABC analysis is conducted for different combinations of initial cell densities and experimental times in two separate scenarios: (i) where collective cell spreading is driven by cell motility alone, and (ii) where collective cell spreading is driven by combined cell motility and cell proliferation. We find that D can be estimated precisely, with a small coefficient of variation (CV) of 2–6%. Our results indicate that D appears to depend on the experimental time, which is a feature that has been previously overlooked. Assuming that the values of D are the same in both experimental scenarios, we use the information about D from the first experimental scenario to obtain reasonably precise estimates of λ, with a CV between 4 and 12%. Our estimates of D and λ are consistent with previously reported values; however, our method is based on a straightforward measurement of the position of the leading edge whereas previous approaches have involved expensive cell counting techniques. Additional insights gained using a fully Bayesian approach justify the computational cost, especially since it allows us to accommodate information from different experiments in a principled way.
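A minimal ABC rejection sketch in the spirit of the approach described above; the simulator is a crude 1D motility/proliferation surrogate, not the paper's discrete 2D model, and the priors, 'true' parameters, tolerance and leading-edge summary are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)

def leading_edge(D, lam, t_end=24.0, dt=0.1, dx=25.0, n0=100):
    """Crude 1D surrogate of motility (random hops) plus proliferation; not
    the paper's lattice model. Hop probability uses the 1D random-walk
    relation p = 2*D*dt/dx**2 (D in um^2/h, dx in um, dt in h)."""
    p_move = 2.0 * D * dt / dx**2
    x = np.zeros(n0)                                  # all cells start at x = 0
    for _ in range(int(t_end / dt)):
        hops = rng.random(x.size) < p_move
        x = x + hops * rng.choice([-dx, dx], size=x.size)
        x = np.maximum(x, 0.0)                        # reflecting boundary
        x = np.concatenate([x, x[rng.random(x.size) < lam * dt]])  # divisions
    return np.percentile(x, 97.5)                     # leading-edge summary

# 'Observed' edge position, generated from assumed 'true' parameters.
obs = leading_edge(D=1000.0, lam=0.05)

# ABC rejection: sample (D, lambda) from uniform priors and keep draws whose
# simulated leading edge lies within a tolerance of the observation.
accepted = []
for _ in range(1000):
    D, lam = rng.uniform(100.0, 3000.0), rng.uniform(0.0, 0.1)
    if abs(leading_edge(D, lam) - obs) < 0.05 * obs:
        accepted.append((D, lam))

post = np.array(accepted)
print(f"accepted {len(post)}; posterior means: "
      f"D = {post[:, 0].mean():.0f}, lambda = {post[:, 1].mean():.3f}")
```

In this crude sketch the leading edge mainly constrains D, which echoes the paper's point that information about D from a motility-only experiment can then be used to pin down λ.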

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought so that simulations can be set up with ease, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller to create ABMs. RESULTS: A description of dynamic agent composition is given in this paper, as well as details about its implementation within MODAM (MODular Agent-based Model), a software framework which is applied to the planning of the electricity distribution network. Illustrations of the implementation of dynamic agent composition are given for that domain throughout the paper. It is, however, expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for users, for developers, and for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix and match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program. Dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation, and verification and validation of models are facilitated by quickly setting up alternative simulations.
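The sketch below illustrates the general idea of composing an agent from registered atomic components at runtime from a plain configuration; it is not MODAM's actual API, and the component names and behaviours (a solar panel and a household load) are invented for illustration.

```python
from dataclasses import dataclass, field

# Atomic components are registered by name and assembled into agents at
# runtime from a configuration, so no new agent class is written per scenario.
REGISTRY = {}

def component(name):
    def register(cls):
        REGISTRY[name] = cls
        return cls
    return register

@component("solar_panel")
@dataclass
class SolarPanel:
    kw_peak: float = 5.0
    def step(self, hour):
        return self.kw_peak * max(0.0, 1 - abs(hour - 12) / 6)  # crude output

@component("household_load")
@dataclass
class HouseholdLoad:
    base_kw: float = 0.8
    def step(self, hour):
        return -self.base_kw * (1.5 if 17 <= hour <= 21 else 1.0)

@dataclass
class Agent:
    node_id: str
    parts: list = field(default_factory=list)
    def step(self, hour):
        return sum(p.step(hour) for p in self.parts)   # net power at the node

def compose(node_id, config):
    """Assemble an agent from a config like [("solar_panel", {"kw_peak": 4})]."""
    return Agent(node_id, [REGISTRY[name](**kwargs) for name, kwargs in config])

house = compose("node_42", [("household_load", {}), ("solar_panel", {"kw_peak": 4})])
print([round(house.step(h), 2) for h in (6, 12, 19)])
```

Adding a new scenario then means editing a configuration list, or registering one more component, rather than writing and wiring a new agent class.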

Relevance:

20.00%

Publisher:

Abstract:

We propose a method for learning specific object representations that can be applied (and reused) in visual detection and identification tasks. A machine learning technique called Cartesian Genetic Programming (CGP) is used to create these models based on a series of images. Our research investigates how manipulation actions might allow for the development of better visual models and therefore better robot vision. This paper describes how visual object representations can be learned and improved by performing object manipulation actions, such as poke, push and pick-up, with a humanoid robot. The improvement can be measured and allows the robot to select and perform the 'right' action, i.e. the action that yields the best possible improvement of the detector.
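The action-selection loop described in the last sentence can be sketched as follows; the detector, its evaluation metric and the retraining step are placeholders (the paper's detectors are evolved with CGP, which is not reproduced here).

```python
import random

random.seed(0)

# Pick the manipulation action whose extra views improve the detector most.
# evaluate_detector() and retrain_with() are stand-ins for a CGP-based
# detector pipeline.
ACTIONS = ["poke", "push", "pick_up"]

def evaluate_detector(detector, test_images):
    return detector["f_score"]                     # placeholder metric

def retrain_with(detector, new_views):
    # Placeholder: new views nudge the detector's score by a random amount.
    return {"f_score": min(1.0, detector["f_score"] + random.uniform(0.0, 0.1))}

def select_action(detector, test_images, collect_views):
    baseline = evaluate_detector(detector, test_images)
    improvements = {}
    for action in ACTIONS:
        candidate = retrain_with(detector, collect_views(action))
        improvements[action] = evaluate_detector(candidate, test_images) - baseline
    return max(improvements, key=improvements.get), improvements

detector = {"f_score": 0.6}
best, scores = select_action(detector, test_images=[], collect_views=lambda a: [a])
print("perform:", best, scores)
```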

Relevance:

20.00%

Publisher:

Abstract:

Epigenetic changes correspond to heritable modifications of chromosome structure that do not involve alteration of the DNA sequence but do affect gene expression. These mechanisms play an important role in normal cell differentiation, but their aberration is also associated with several diseases, including cancer and neural disorders. Consequently, despite intensive study in recent years, the contribution of such modifications remains largely unquantified, due to overall system complexity and insufficient data. Computational models can provide powerful auxiliary tools to experimentation, not least because they can span scales from the sub-cellular through to cell populations (or networks of genes). In this paper, the challenges to developing realistic cross-scale models are discussed and illustrated with respect to current work.

Relevance:

20.00%

Publisher:

Abstract:

The tumour microenvironment greatly influences the development and metastasis of cancer. The development of three-dimensional (3D) culture models which mimic the microenvironment displayed in vivo can improve cancer biology studies and accelerate novel anticancer drug screening. Inspired by a systems biology approach, we have formed 3D in vitro bioengineered tumour angiogenesis microenvironments within a glycosaminoglycan-based hydrogel culture system. This microenvironment model can routinely recreate breast and prostate tumour vascularisation. The multiple cell types cultured within this model were less sensitive to chemotherapy than two-dimensional (2D) cultures, and displayed tumour regression comparable to that displayed in vivo. These features highlight the use of our in vitro culture model as a complementary testing platform in conjunction with animal models, addressing key reduction and replacement goals of the future. We anticipate that this biomimetic model will provide a platform for the in-depth analysis of cancer development and the discovery of novel therapeutic targets.

Relevance:

20.00%

Publisher:

Abstract:

GVHD remains the major complication of allo-HSCT. Murine models are the primary system used to understand GVHD and to develop potential therapies. Several factors are critical for GVHD in these models, including histocompatibility, conditioning regimen and T-cell number. We serendipitously found that environmental factors such as the caging system and bedding also significantly impact the kinetics of GVHD in these models. This is important because such factors may influence the experimental conditions required to cause GVHD and how mice respond to various treatments. Consequently, this is likely to alter the interpretation of results between research groups, and the perceived effectiveness of experimental therapies.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to identify current ED models of care and their impact on care quality, care effectiveness and cost. A systematic search of key health databases (Medline, CINAHL, Cochrane, EMbase) was conducted to identify literature on ED models of care. Additionally, a focused review of the contents of 11 international and national emergency medicine, nursing and health economic journals (published between 2010 and 2013) was undertaken, with snowball identification of references from the most recent and relevant papers. Articles published between 1998 and 2013 in the English language were included for initial review by three of the authors. Studies in underdeveloped countries and studies not addressing the objectives of the present study were excluded. Relevant details were extracted from the retrieved literature and analysed for relevance and impact. The literature was synthesised around the study's main themes. Models described within the literature mainly focused on addressing issues at the input, throughput or output stages of ED care delivery. Models often varied to account for site-specific characteristics (e.g. onsite inpatient units), staffing profiles (e.g. extended-scope physiotherapists), ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Only a few studies conducted cost-effectiveness analyses of service models. Although various models of delivering emergency healthcare exist, further research is required in order to make accurate and reliable assessments of their safety, clinical effectiveness and cost-effectiveness.

Relevance:

20.00%

Publisher:

Abstract:

Existing techniques for automated discovery of process models from event logs generally produce flat process models. Thus, they fail to exploit the notion of subprocess, as well as the error handling and repetition constructs provided by contemporary process modeling notations such as the Business Process Model and Notation (BPMN). This paper presents a technique for automated discovery of hierarchical BPMN models containing interrupting and non-interrupting boundary events and activity markers. The technique employs functional and inclusion dependency discovery techniques in order to elicit a process-subprocess hierarchy from the event log. Given this hierarchy and the projected logs associated with each node in the hierarchy, parent process and subprocess models are then discovered using existing techniques for flat process model discovery. Finally, the resulting models and logs are heuristically analyzed in order to identify boundary events and markers. By employing approximate dependency discovery techniques, it is possible to filter out noise in the event log arising, for example, from data entry errors or missing events. A validation with one synthetic and two real-life logs shows that process models derived by the proposed technique are more accurate and less complex than those derived with flat process discovery techniques. Meanwhile, a validation on a family of synthetically generated logs shows that the technique is resilient to varying levels of noise.
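A very small sketch of the hierarchy-elicitation idea: if events that carry a subprocess-instance attribute always fall inside known parent cases (an inclusion dependency), the log can be projected into a parent log and a subprocess log, each of which is then mined with a flat discovery technique. The log, attribute names and the check below are invented for illustration and are far simpler than the functional/inclusion dependency discovery the paper employs.

```python
# Invented claims-handling log: 'case' identifies the parent process instance
# and 'claim_check' (when set) identifies a subprocess instance nested in it.
log = [
    {"case": "c1", "activity": "Receive claim",  "claim_check": None},
    {"case": "c1", "activity": "Check document", "claim_check": "k1"},
    {"case": "c1", "activity": "Check document", "claim_check": "k2"},
    {"case": "c1", "activity": "Pay claim",      "claim_check": None},
    {"case": "c2", "activity": "Receive claim",  "claim_check": None},
    {"case": "c2", "activity": "Check document", "claim_check": "k3"},
]

def inclusion_dependency(log, child_attr, parent_key):
    """Every parent id seen on events carrying child_attr also appears among
    the parent ids of events without it, i.e. subprocess instances are
    contained in parent cases."""
    child_cases = {e[parent_key] for e in log if e[child_attr] is not None}
    parent_cases = {e[parent_key] for e in log if e[child_attr] is None}
    return child_cases <= parent_cases

def split_logs(log, child_attr):
    parent = [e for e in log if e[child_attr] is None]
    sub = [e for e in log if e[child_attr] is not None]
    return parent, sub          # each projected log is mined with a flat miner

if inclusion_dependency(log, "claim_check", "case"):
    parent_log, sub_log = split_logs(log, "claim_check")
    print(len(parent_log), "parent events,", len(sub_log), "subprocess events")
```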

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a new method for performing Bayesian parameter inference and model choice for low-count time series models with intractable likelihoods. The method involves incorporating an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series and the cumulative number of prion disease cases in mule deer.
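A minimal sketch of the alive particle filter likelihood estimate alone (the surrounding pseudo-marginal SMC^2 machinery is not shown): at each time step, particles are propagated until N+1 of them reproduce the observed count exactly, giving the unbiased factor estimate N/(T-1), where T is the number of propagations needed. The transition model, data and parameter grid below are toy placeholders, not the paper's models.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_step(x, theta):
    """Toy Markov transition on counts (a stand-in for an intractable model):
    each individual survives with probability 0.8, plus Poisson(theta*x) births."""
    return rng.binomial(x, 0.8) + rng.poisson(theta * max(x, 1))

def alive_pf_loglik(y, theta, N=100, max_trials=100_000):
    """Alive particle filter log-likelihood estimate for exact-match,
    low-count data."""
    loglik = 0.0
    alive = np.full(N + 1, y[0])              # condition on the first count
    for t in range(1, len(y)):
        hits, trials = [], 0
        while len(hits) < N + 1:
            trials += 1
            if trials > max_trials:
                return -np.inf                # parameter effectively ruled out
            x = simulate_step(rng.choice(alive), theta)
            if x == y[t]:
                hits.append(x)
        loglik += np.log(N / (trials - 1))    # unbiased per-step factor
        alive = np.array(hits)
    return loglik

# Hypothetical data and a tiny grid over theta, just to show the interface.
y = [3, 4, 4, 6, 5, 7]
for theta in (0.2, 0.5, 0.8):
    print(theta, round(alive_pf_loglik(y, theta), 2))
```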