997 results for envelope models


Relevance: 20.00%

Publisher:

Abstract:

The health of tollbooth workers is seriously threatened by long-term exposure to polluted air from vehicle exhaust. Using traffic data collected at a toll plaza, vehicle movements were simulated with a system dynamics model under different traffic volumes and toll collection procedures, allowing the average travel time of vehicles to be calculated. A three-dimensional Computational Fluid Dynamics (CFD) model with a k–ε turbulence closure was then used to simulate pollutant dispersion at the toll plaza for the same traffic volumes and toll collection procedures. The results show that pollutant concentration around the tollbooths increases as traffic volume increases. At both low and high traffic volumes (1500 and 2500 vehicles/h), pollutant concentration decreases when electronic toll collection (ETC) is adopted, and it decreases further as the proportion of ETC-equipped vehicles increases. However, if the proportion of ETC-equipped vehicles is very low and traffic volume is not heavy, pollutant concentration increases as the number of ETC lanes increases.
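
As a rough illustration of the system-dynamics step described above, the sketch below simulates a toll plaza as a set of parallel service lanes and estimates the average time a vehicle spends queuing and paying. All numbers (arrival rate, service times, lane counts, ETC share) are assumptions for illustration, not values from the study.

```python
import random

def average_service_delay(volume_per_hour, etc_share, n_etc_lanes, n_mtc_lanes,
                          etc_service_s=3.0, mtc_service_s=14.0, sim_hours=1.0,
                          seed=0):
    """Toy queueing simulation of a toll plaza.

    Vehicles arrive as a Poisson stream, split between ETC and manual (MTC)
    lanes, and each lane serves vehicles first-come-first-served. Returns the
    mean time (s) a vehicle spends queuing plus being served.
    """
    rng = random.Random(seed)
    horizon = sim_hours * 3600.0
    lane_free_at = {'etc': [0.0] * n_etc_lanes, 'mtc': [0.0] * n_mtc_lanes}
    t, delays = 0.0, []
    while True:
        t += rng.expovariate(volume_per_hour / 3600.0)   # next arrival time (s)
        if t > horizon:
            break
        kind = 'etc' if (rng.random() < etc_share and n_etc_lanes) else 'mtc'
        service = etc_service_s if kind == 'etc' else mtc_service_s
        lane = min(range(len(lane_free_at[kind])),
                   key=lambda i: lane_free_at[kind][i])   # shortest queue
        start = max(t, lane_free_at[kind][lane])          # wait if lane busy
        lane_free_at[kind][lane] = start + service
        delays.append(start + service - t)
    return sum(delays) / len(delays) if delays else 0.0

# Example: 2500 veh/h with 40% ETC-equipped vehicles, 2 ETC + 4 manual lanes.
print(average_service_delay(2500, 0.4, 2, 4))
```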

Relevance: 20.00%

Publisher:

Abstract:

In this paper, two ideal formation models of serrated chips, the symmetric formation model and the unilateral right-angle formation model, have been established for the first time. Based on these ideal models and the related adiabatic shear theory of serrated chip formation, the theoretical relationship among average tooth pitch, average tooth height and chip thickness is obtained. Furthermore, the theoretical relation between the passivation coefficient of the chip's sawtooth and the chip thickness compression ratio is deduced. The comparison between these theoretical prediction curves and experimental data shows good agreement, which validates the robustness of the ideal chip formation models and the correctness of the theoretical derivation. The proposed ideal models provide a simple but effective theoretical basis for subsequent research on serrated chip morphology. Finally, the influences of the principal cutting factors on serrated chip formation are discussed on the basis of a series of finite element simulation results, yielding practical advice for controlling serrated chips in engineering applications.
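
The geometric quantities discussed above (average tooth pitch and average tooth height) can be measured from a digitized chip free-surface profile. The sketch below does this on a synthetic sawtooth profile using simple peak detection; the profile and all dimensions are assumed for illustration and are not the paper's models or data.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic serrated-chip free-surface profile: a sawtooth riding on the mean
# chip thickness (all dimensions in mm; values assumed for illustration).
x = np.linspace(0.0, 2.0, 2000)                        # distance along the chip
true_pitch, true_height, mean_thickness = 0.10, 0.08, 0.30
profile = mean_thickness + true_height * ((x / true_pitch) % 1.0)

crests, _ = find_peaks(profile)                        # tooth tips
roots, _ = find_peaks(-profile)                        # valleys between teeth

avg_pitch = np.diff(x[crests]).mean()
avg_height = profile[crests].mean() - profile[roots].mean()
print(f"average tooth pitch ~ {avg_pitch:.3f} mm, "
      f"average tooth height ~ {avg_height:.3f} mm")
```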

Relevance: 20.00%

Publisher:

Abstract:

Autonomous underwater gliders are robust and widely used ocean sampling platforms characterized by their endurance, and they are one of the best approaches to gathering subsurface data at the spatial resolution needed to advance our knowledge of the ocean environment. Gliders generally do not employ sophisticated sensors for underwater localization, but instead dead-reckon between set waypoints. Thus, these vehicles are subject to large positional errors between prescribed and actual surfacing locations. Here, we investigate the incorporation of a large-scale, regional ocean model into the trajectory design for autonomous gliders to improve their navigational accuracy. We compute the dead-reckoning error for our Slocum gliders and compare it with the average positional error recorded from multiple deployments conducted over the past year. We then compare trajectory plans computed on board the vehicle during recent deployments with our prediction-based trajectory plans for 140 surfacing occurrences.
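
A minimal sketch of why dead reckoning accumulates surfacing error: the glider believes it completes each planned leg, while an unmodelled current advects it throughout the dive. The waypoints, speed and current below are assumed values, not data from the deployments described above.

```python
import math

def dead_reckon(waypoints, speed, current):
    """Dead-reckoned vs. current-advected surfacing position (assumed setup).

    The glider believes it flies straight legs between waypoints at `speed`
    (m/s); a constant, unmodelled ocean current (u, v) in m/s advects it the
    whole time. Returns the believed position, the drifted position and the
    surfacing error in metres, assuming through-water motion equals the plan.
    """
    x, y = waypoints[0]            # drifted ("actual") position
    bx, by = waypoints[0]          # believed (dead-reckoned) position
    u, v = current
    for (x0, y0), (x1, y1) in zip(waypoints, waypoints[1:]):
        dx, dy = x1 - x0, y1 - y0
        t = math.hypot(dx, dy) / speed           # time spent on this leg
        bx, by = x1, y1                          # dead reckoning assumes arrival
        x, y = x + dx + u * t, y + dy + v * t    # current adds a drift term
    return (bx, by), (x, y), math.hypot(x - bx, y - by)

# Example: a 1 km box pattern at 0.3 m/s through a 0.05 m/s easterly current.
legs = [(0.0, 0.0), (1000.0, 0.0), (1000.0, 1000.0), (0.0, 1000.0)]
print(dead_reckon(legs, speed=0.3, current=(0.05, 0.0)))
```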

Relevance: 20.00%

Publisher:

Abstract:

Data collection using Autonomous Underwater Vehicles (AUVs) is increasing in importance within the oceanographic research community. In contrast to traditional moored or static platforms, mobile sensors require intelligent planning strategies to manoeuvre through the ocean. However, the ability to navigate to high-value locations and collect data of specific scientific merit is worth the planning effort. In this study, we examine the use of ocean model predictions to determine the locations to be visited by an AUV and to aid in planning the trajectory that the vehicle executes during the sampling mission. The objectives are: a) to provide near-real-time, in situ measurements to a large-scale ocean model to increase the skill of future predictions, and b) to utilize ocean model predictions as a component in an end-to-end autonomous prediction and tasking system for aquatic, mobile sensor networks. We present an algorithm designed to generate paths for AUVs to track a dynamically evolving ocean feature using ocean model predictions. This builds on previous work in the area by incorporating the predicted current velocities into the path planning to assist in solving the 3-D motion planning problem of steering an AUV between two selected locations. We present simulation results for tracking a freshwater plume using our algorithm. Additionally, we present experimental results from field trials that test the skill of the model as well as the incorporation of the model predictions into an AUV trajectory planner. These results indicate a modest but measurable reduction in surfacing error when the model predictions are incorporated into the planner.
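
As a sketch of how predicted current velocities can enter path planning, the code below runs Dijkstra over a coarse grid in which each edge cost is a crossing time that credits (or penalizes) the current component along the heading. The grid size, cell size, vehicle speed and current field are assumptions, and the cost model ignores cross-track drift; this is not the planner developed in the paper.

```python
import heapq
import math

def plan_path(start, goal, current_field, v_auv=1.0, grid=20, cell=500.0):
    """Time-minimizing grid path under a predicted current field (a sketch).

    `current_field(i, j)` returns the predicted (u, v) current (m/s) in cell
    (i, j). The edge cost between neighbouring cells is the crossing time with
    the along-heading current component added to the AUV speed. 8-connected
    Dijkstra; assumes the combined speed stays positive.
    """
    def edge_time(i, j, di, dj):
        u, v = current_field(i, j)
        step = math.hypot(di, dj)
        ground_speed = v_auv + (u * di + v * dj) / step   # current along heading
        return step * cell / max(ground_speed, 0.1)       # guard adverse currents

    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        t, node = heapq.heappop(pq)
        if node == goal:
            break
        if t > dist.get(node, math.inf):
            continue
        i, j = node
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                if di == 0 and dj == 0:
                    continue
                ni, nj = i + di, j + dj
                if not (0 <= ni < grid and 0 <= nj < grid):
                    continue
                nt = t + edge_time(i, j, di, dj)
                if nt < dist.get((ni, nj), math.inf):
                    dist[(ni, nj)] = nt
                    prev[(ni, nj)] = node
                    heapq.heappush(pq, (nt, (ni, nj)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Example: a uniform 0.3 m/s westward current over a 20x20 grid of 500 m cells.
path, hours = plan_path((0, 0), (19, 19), lambda i, j: (-0.3, 0.0))
print(len(path), round(hours / 3600, 2), "hours")
```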

Relevance: 20.00%

Publisher:

Abstract:

Process models are used by information professionals to convey semantics about the business operations in a real-world domain intended to be supported by an information system. The understandability of these models is vital to their use in information systems development. In this paper, we examine two factors that we predict will influence the understanding of a business process that novice developers obtain from a corresponding process model: the content presentation form chosen to articulate the business domain, and the user characteristics of the novice developers working with the model. Our experimental study provides evidence that novice developers obtain similar levels of understanding when confronted with an unfamiliar or a familiar process model. However, previous modeling experience, the use of English as a second language, and previous work experience in business process management (BPM) are important factors influencing model understanding. Our findings suggest that education and research in process modeling should increase the focus on human factors and how they relate to content and content presentation formats for different modeling tasks. We discuss implications for practice and research.

Relevance: 20.00%

Publisher:

Abstract:

In automatic facial expression detection, very accurate registration is desired, which can be achieved via a deformable model approach such as an active appearance model (AAM), where a dense mesh of 60-70 points on the face is used. However, for applications where manually labeling frames is prohibitive, AAMs do not work well because they do not generalize well to unseen subjects. As such, a coarser approach is taken for person-independent facial expression detection, where just a couple of key features (such as the face and eyes) are tracked using a Viola-Jones type approach. The tracked image is normally post-processed to encode shift and illumination invariance using a linear bank of filters. Recently, it was shown that this preprocessing step is of no benefit once close to ideal registration has been obtained. In this paper, we present a system based on the Constrained Local Model (CLM), a generic, person-independent face alignment algorithm that achieves high accuracy. We benchmark these results against LBP feature extraction on the CK+ and GEMEP datasets.
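
For the LBP baseline mentioned above, a common person-independent appearance descriptor is a grid of uniform-LBP histograms computed over the aligned face. The sketch below builds that descriptor with scikit-image; the grid size and LBP parameters are assumptions, and a real pipeline would feed in CLM-registered face crops rather than a random array.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(face, grid=(4, 4), P=8, R=1):
    """Uniform-LBP appearance descriptor for a registered face patch.

    The aligned face image is divided into a grid of cells, a uniform LBP code
    is computed per pixel, and the per-cell code histograms are concatenated
    into one feature vector.
    """
    codes = local_binary_pattern(face, P, R, method="uniform")
    n_bins = P + 2                                   # uniform patterns + "other"
    h, w = face.shape
    feats = []
    for gy in range(grid[0]):
        for gx in range(grid[1]):
            cell = codes[gy * h // grid[0]:(gy + 1) * h // grid[0],
                         gx * w // grid[1]:(gx + 1) * w // grid[1]]
            hist, _ = np.histogram(cell, bins=n_bins, range=(0, n_bins),
                                   density=True)
            feats.append(hist)
    return np.concatenate(feats)

# Example on a random 64x64 "face"; shape is grid_y * grid_x * (P + 2) = 160.
face = (np.random.rand(64, 64) * 255).astype(np.uint8)
print(lbp_histogram(face).shape)
```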

Relevance: 20.00%

Publisher:

Abstract:

The term Design is used to describe a wide range of activities. Like the term innovation, it is often used to describe both an activity and an outcome. Many products and services are described as designed because they reflect a conscious process of linking form and function. Alternatively, the many and varied processes of design are often used to describe a cost centre of an organisation, to demonstrate a particular competency. However, design is often not used to describe the ‘value’ it provides to an organisation and, more importantly, the ‘value’ it provides to both existing and future customers. Design Led Innovation bridges this gap: it is a process of creating a sustainable competitive advantage by radically changing the customer value proposition. A conceptual model has been developed to assist organisations in applying and embedding design in a company’s vision, strategy, culture, leadership and development processes.

Relevance: 20.00%

Publisher:

Abstract:

In many product categories of durable goods, such as TVs, PCs, and DVD players, the largest component of sales is generated by consumers replacing existing units. Aggregate sales models proposed by diffusion-of-innovation researchers for the replacement component of sales have incorporated several different replacement distributions, such as the Rayleigh, Weibull, truncated Normal and Gamma. Although these alternative replacement distributions have been tested using both time-series sales data and individual-level actuarial “life-tables” of replacement ages, there is no consensus on which distributions are most appropriate for modelling replacement behaviour. In the current study we are motivated to develop a new “modified gamma” distribution for two reasons. First, we recognise that replacements have two fundamentally different drivers: those forced by failure and early, discretionary replacements. The replacement distribution for each of these drivers is expected to be quite different. Second, we observed a poor fit of other distributions to our empirical data. We conducted a survey of 8,077 households to empirically examine models of replacement sales for six electronic consumer durables: TVs, VCRs, DVD players, digital cameras, and personal and notebook computers. These data allow us to construct individual-level “life-tables” for replacement ages. We demonstrate that the new modified gamma model fits the empirical data better than existing models for all six products, using both a primary and a hold-out sample.
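
To illustrate the kind of distribution comparison described above, the sketch below fits three standard replacement distributions to hypothetical replacement-age data and compares log-likelihoods with SciPy. The data are synthetic (a mix of "failure" and "discretionary" ages, mimicking the two drivers), and the paper's modified gamma distribution is not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical replacement-age data (years): longer-lived failure replacements
# mixed with early discretionary upgrades. Real analysis would use the survey
# life-tables.
failure = rng.gamma(shape=9.0, scale=0.9, size=700)        # ~8 years on average
discretionary = rng.gamma(shape=4.0, scale=0.8, size=300)  # ~3 years on average
ages = np.concatenate([failure, discretionary])

# Fit candidate replacement distributions and compare by log-likelihood.
candidates = {
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
    "rayleigh": stats.rayleigh,
}
for name, dist in candidates.items():
    params = dist.fit(ages, floc=0.0)           # fix the location at zero
    loglik = np.sum(dist.logpdf(ages, *params))
    print(f"{name:9s} log-likelihood = {loglik:.1f}")
```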

Relevance: 20.00%

Publisher:

Abstract:

Methicillin-resistant Staphylococcus aureus (MRSA) is a pathogen that continues to be of major concern in hospitals. We develop models and computational schemes based on observed weekly incidence data to estimate MRSA transmission parameters. We extend the deterministic model of McBryde, Pettitt, and McElwain (2007, Journal of Theoretical Biology 245, 470–481), involving an underlying population of MRSA-colonized patients and health-care workers, which describes, among other processes, transmission between uncolonized patients and colonized health-care workers and vice versa. We develop new bivariate and trivariate Markov models that include incidence, so that estimated transmission rates can be based directly on new colonizations rather than indirectly on prevalence. Imperfect sensitivity of pathogen detection is modeled using a hidden Markov process. The advantages of our approach include: (i) a discrete-valued assumption for the number of colonized health-care workers, (ii) the ability to incorporate two transmission parameters into the likelihood, (iii) a likelihood that depends on the number of new cases, improving precision of inference, (iv) no requirement for individual patient records, and (v) incorporation of the possibility of imperfect detection of colonization. We compare our approach with that of McBryde et al. (2007), which is based on an approximation that eliminates the health-care workers from the model and uses Markov chain Monte Carlo with individual patient data. We apply these models to MRSA colonization data collected in a small intensive care unit at the Princess Alexandra Hospital, Brisbane, Australia.
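
A deliberately simplified sketch of the hidden Markov idea: the hidden state is the number of colonized patients on a small ward, and weekly detections are binomial draws governed by an imperfect test sensitivity, so the likelihood follows from the forward algorithm. The transition structure, parameter values and data below are assumptions for illustration and are much cruder than the bivariate and trivariate models developed in the paper.

```python
import numpy as np
from scipy.stats import binom, poisson

def hmm_loglik(obs, beta, clear_p, sens, n_beds=10):
    """Forward-algorithm log-likelihood for a toy hidden Markov colonization model.

    Hidden state: number of colonized patients on the ward (0..n_beds).
    Weekly transition (assumed): Poisson(beta * state) new colonizations and
    Binomial(state, clear_p) clearances/discharges, truncated to the ward size.
    Observation: each colonized patient is detected with sensitivity `sens`,
    so weekly detected counts are Binomial(state, sens).
    """
    states = np.arange(n_beds + 1)
    # Transition matrix T[i, j] = P(next state j | current state i).
    T = np.zeros((n_beds + 1, n_beds + 1))
    for i in states:
        for new in range(n_beds + 1):
            p_new = poisson.pmf(new, beta * i) if i > 0 else float(new == 0)
            for cleared in range(i + 1):
                j = min(i + new - cleared, n_beds)
                T[i, j] += p_new * binom.pmf(cleared, i, clear_p)
    T /= T.sum(axis=1, keepdims=True)              # renormalize truncated tails
    # Forward recursion over the weekly observations.
    alpha = np.full(n_beds + 1, 1.0 / (n_beds + 1))   # uniform initial state
    loglik = 0.0
    for y in obs:
        alpha = alpha @ T                               # predict
        alpha = alpha * binom.pmf(y, states, sens)      # update with the data
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

weekly_detected = [0, 1, 0, 2, 3, 1, 0, 1]          # hypothetical weekly counts
print(hmm_loglik(weekly_detected, beta=0.4, clear_p=0.5, sens=0.8))
```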

Relevance: 20.00%

Publisher:

Abstract:

Three recent papers published in Chemical Engineering Journal studied the solution of a model of diffusion and nonlinear reaction using three different methods. Two of these studies obtained series solutions using specialized mathematical methods, known as the Adomian decomposition method and the homotopy analysis method. Subsequently, it was shown that the solution of the same particular model could be written in terms of a transcendental function called Gauss’ hypergeometric function. These three previous approaches focused on one particular reactive transport model, which ignored advective transport and considered one specific reaction term only. Here we generalize these previous approaches and develop an exact analytical solution for a general class of steady-state reactive transport models that incorporate (i) combined advective and diffusive transport, and (ii) any sufficiently differentiable reaction term R(C). The new solution is a convergent Maclaurin series. The Maclaurin series solution can be derived without any specialized mathematical methods and does not necessarily involve the computation of any transcendental function. Applying the Maclaurin series solution to certain case studies shows that the previously published solutions are particular cases of the more general solution outlined here. We also demonstrate the accuracy of the Maclaurin series solution by comparing it with numerical solutions for particular cases.
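
A sketch of how such a series can be constructed: isolating C'' from an assumed steady-state balance D C'' = v C' + R(C) and differentiating repeatedly gives every derivative of C at x = 0, hence the Maclaurin coefficients. The transport coefficients, boundary values, sign convention and reaction term R(C) below are illustrative choices, not the paper's case studies.

```python
import sympy as sp

x = sp.symbols('x')
C = sp.Function('C')

# Assumed transport coefficients, boundary values and reaction term (for
# illustration only; the paper treats a general sufficiently smooth R(C)).
D, v = sp.Rational(1), sp.Rational(1, 2)
C0, dC0 = sp.Rational(1), sp.Rational(1, 10)
R = C(x)**2 / (1 + C(x))

# Assumed steady-state balance D*C'' = v*C' + R(C): isolate the second derivative.
Cpp = (v * C(x).diff(x) + R) / D

N = 6                                        # truncation order of the series
at0 = [(C(x).diff(x), dC0), (C(x), C0)]      # evaluate at x = 0

coeffs = [C0, dC0]                           # c_n = C^(n)(0) / n!
deriv = Cpp                                  # C^(n) written in terms of C and C'
for n in range(2, N + 1):
    coeffs.append(sp.simplify(deriv.subs(at0)) / sp.factorial(n))
    # Differentiate once more and eliminate C'' so only C and C' remain.
    deriv = deriv.diff(x).subs(C(x).diff(x, 2), Cpp)

maclaurin = sum(c * x**k for k, c in enumerate(coeffs))
print(sp.expand(maclaurin))
```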

Relevance: 20.00%

Publisher:

Abstract:

As organizations reach higher levels of business process management maturity, they often find themselves maintaining repositories of hundreds or even thousands of process models, representing valuable knowledge about their operations. Over time, process model repositories tend to accumulate duplicate fragments (also called clones) as new process models are created or extended by copying and merging fragments from other models. This calls for methods to detect clones in process models, so that these clones can be refactored as separate subprocesses in order to improve maintainability. This paper presents an indexing structure to support the fast detection of clones in large process model repositories. The proposed index is based on a novel combination of a method for process model decomposition (specifically the Refined Process Structure Tree) with established graph canonization and string matching techniques. Experiments show that the algorithm scales to repositories with hundreds of models. The experimental results also show that a significant number of non-trivial clones can be found in process model repositories taken from industrial practice.
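
A toy version of the indexing idea: give every fragment a canonical code, hash it, and group fragments that share a code. The canonization below simply sorts labelled edges, which only identifies clones correctly when labels inside a fragment are distinct; the actual approach uses the Refined Process Structure Tree together with proper graph canonization. The repository contents are invented.

```python
import hashlib
from collections import defaultdict

def canonical_string(fragment):
    """Canonical code for a process-model fragment (a simplification).

    A fragment is given as a set of directed edges between labelled nodes.
    Sorting the label pairs yields a canonical string only when node labels
    are distinct within a fragment; real clone detection canonizes arbitrary
    fragment graphs.
    """
    return "|".join(sorted(f"{a}->{b}" for a, b in fragment))

def build_clone_index(repository):
    """Group identical fragments across a repository by canonical-code hash."""
    index = defaultdict(list)
    for model_id, fragments in repository.items():
        for frag in fragments:
            key = hashlib.sha1(canonical_string(frag).encode()).hexdigest()
            index[key].append(model_id)
    # Clones: canonical codes shared by more than one fragment occurrence.
    return {k: v for k, v in index.items() if len(v) > 1}

repo = {
    "claims_v1": [{("check policy", "assess damage"), ("assess damage", "pay")}],
    "claims_v2": [{("check policy", "assess damage"), ("assess damage", "pay")},
                  {("register claim", "check policy")}],
}
print(build_clone_index(repo))
```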

Relevance: 20.00%

Publisher:

Abstract:

Process models in organizational collections are typically modeled by the same team and using the same conventions. As such, these models share many characteristic features, such as size range and the type and frequency of errors. In most cases only small samples of these collections are available, due, for example, to the sensitive information they contain. Because of their size, these samples may not provide an accurate representation of the characteristics of the originating collection. This paper deals with the problem of constructing collections of process models, in the form of Petri nets, from small samples of a collection, in order to accurately estimate the characteristics of that collection. Given a small sample of process models drawn from a real-life collection, we mine a set of generation parameters that we use to generate arbitrarily large collections that feature the same characteristics as the original collection. In this way we can estimate the characteristics of the original collection from the generated collections. We extensively evaluate the quality of our technique on various sample datasets drawn from both research and industry.
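
As a minimal illustration of mining generation parameters from a small sample, the sketch below fits a log-normal size distribution to a handful of hypothetical model sizes and then draws an arbitrarily large synthetic collection with the same size profile; the real technique estimates structural parameters of the Petri nets as well, not just size.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical sample: node counts of the few process models we may inspect.
sample_sizes = np.array([12, 18, 25, 9, 31, 22, 15, 27, 19, 14])

# Mine a simple generation parameter set from the sample: a log-normal fit to
# model size (shown for brevity; structural parameters would be mined too).
mu, sigma = np.log(sample_sizes).mean(), np.log(sample_sizes).std(ddof=1)

# Generate an arbitrarily large synthetic collection with the same size profile.
generated_sizes = np.rint(rng.lognormal(mu, sigma, size=5000)).astype(int)

print("sample mean size:", round(float(sample_sizes.mean()), 1),
      "generated mean size:", round(float(generated_sizes.mean()), 1))
```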

Relevance: 20.00%

Publisher:

Abstract:

As order dependencies between process tasks can become complex, it is easy to make mistakes in process model design, especially behavioral mistakes such as deadlocks. Notions such as soundness formalize behavioral errors, and tools exist that can identify such errors. However, these tools do not provide assistance with correcting the process models. Error correction can be very challenging, as the intentions of the process modeler are not known and there may be many ways in which an error can be corrected. We present a novel technique for automatic error correction in process models based on simulated annealing. Using this technique, a number of alternative process models are identified that resolve one or more errors in the original model. The technique is implemented and validated on a sample of industrial process models. The tests show that at least one sound solution can be found for each input model and that the response times are short.
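
The acceptance loop behind such a technique can be sketched generically: propose a perturbed model, score it by its number of behavioural errors, and accept worse candidates with a probability that decays as the temperature cools. The error counter and perturbation operator below are assumed stand-ins (a real implementation would invoke a soundness checker and edit the process model), and the toy usage merely shrinks a set of defect tokens.

```python
import math
import random

def anneal(model, count_errors, perturb, t0=5.0, cooling=0.95, steps=2000, seed=0):
    """Generic simulated-annealing loop for model repair (a sketch).

    `count_errors(model)` plays the role of a soundness checker returning the
    number of behavioural errors; `perturb(model, rng)` returns a slightly
    modified copy. Both are assumed, domain-specific callables; the schedule
    is the standard exponential-cooling acceptance rule.
    """
    rng = random.Random(seed)
    current, current_cost = model, count_errors(model)
    best, best_cost = current, current_cost
    t = t0
    for _ in range(steps):
        candidate = perturb(current, rng)
        cost = count_errors(candidate)
        # Accept improvements always; accept worse candidates with a
        # probability that shrinks as the temperature cools.
        if cost <= current_cost or rng.random() < math.exp((current_cost - cost) / t):
            current, current_cost = candidate, cost
        if cost < best_cost:
            best, best_cost = candidate, cost
        if best_cost == 0:              # a sound alternative has been found
            break
        t *= cooling
    return best, best_cost

# Toy usage: the "model" is a set of defect tokens; a perturbation removes one.
defects = {"deadlock@split3", "lack_of_sync@join7"}
fix = lambda m, rng: (m - {rng.choice(sorted(m))}) if m else m
print(anneal(defects, len, fix))
```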

Relevance: 20.00%

Publisher:

Abstract:

Plant biosecurity requires statistical tools to interpret field surveillance data in order to manage pest incursions that threaten crop production and trade. Ultimately, management decisions need to be based on the probability that an area is infested or free of a pest. Current informal approaches to delimiting pest extent rely upon expert ecological interpretation of presence/absence data over space and time. Hierarchical Bayesian models provide a cohesive statistical framework that can formally integrate the available information on both pest ecology and data. The overarching method involves constructing an observation model for the surveillance data, conditional on the hidden extent of the pest and uncertain detection sensitivity. The extent of the pest is then modelled as a dynamic invasion process that includes uncertainty in ecological parameters. Modelling approaches to assimilate this information are explored through case studies on spiralling whitefly, Aleurodicus dispersus, and red banded mango caterpillar, Deanolis sublimbalis. Markov chain Monte Carlo simulation is used to estimate the probable extent of pests, given the observation and process model conditioned by surveillance data.

Statistical methods, based on time-to-event models, are developed to apply hierarchical Bayesian models to early detection programs and to demonstrate area freedom from pests. The value of early detection surveillance programs is demonstrated through an application to interpret surveillance data for exotic plant pests with uncertain spread rates. The model suggests that typical early detection programs provide a moderate reduction in the probability of an area being infested but a dramatic reduction in the expected area of incursions at a given time.

Estimates of spiralling whitefly extent are examined at local, district and state-wide scales. The local model estimates the rate of natural spread and the influence of host architecture, host suitability and inspector efficiency. These parameter estimates can support the development of robust surveillance programs. Hierarchical Bayesian models for the human-mediated spread of spiralling whitefly are developed for the colonisation of discrete cells connected by a modified gravity model. By estimating dispersal parameters, the model can be used to predict the extent of the pest over time. An extended model predicts the climate-restricted distribution of the pest in Queensland. These novel human-mediated movement models are well suited to demonstrating area freedom at coarse spatio-temporal scales.

At finer scales, and in the presence of ecological complexity, exploratory models are developed to investigate the capacity for surveillance information to estimate the extent of red banded mango caterpillar. It is apparent that excessive uncertainty about observation and ecological parameters can impose limits on inference at the scales required for effective management of response programs. The thesis contributes novel statistical approaches to estimating the extent of pests and develops applications to assist decision-making across a range of plant biosecurity surveillance activities. Hierarchical Bayesian modelling is demonstrated as both a useful analytical tool for estimating pest extent and a natural investigative paradigm for developing and focussing biosecurity programs.
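
One small, self-contained piece of the framework above is the Bayesian update for area freedom under imperfect detection: repeated negative surveys raise the probability that an area is pest-free by an amount governed by each survey's detection sensitivity. The sketch below performs that update under the assumption of no new incursion between surveys; the prior and the sensitivity distribution are illustrative, and the spatial spread dynamics of the full models are omitted.

```python
import numpy as np

def prob_free(prior_free, sensitivities):
    """Posterior probability that an area is pest-free after negative surveys.

    Each survey round i has an (uncertain) detection sensitivity s_i and every
    round returned no detections. Assumes no new incursion between surveys.
    """
    p_free = prior_free
    for s in sensitivities:
        # Bayes' rule: a negative result is certain if the area is free and
        # has probability (1 - s) if it is infested.
        p_free = p_free / (p_free + (1.0 - s) * (1.0 - p_free))
    return p_free

# Example: 40% prior belief of freedom and five negative surveys whose
# (hypothetical) sensitivities are drawn from a Beta(8, 2) distribution.
rng = np.random.default_rng(3)
print(prob_free(0.4, rng.beta(8, 2, size=5)))
```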

Relevance: 20.00%

Publisher:

Abstract:

A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modelling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for configurable process modelling is restricted, thus hindering their applicability. Specifically, these notations focus on capturing tasks and control-flow dependencies, neglecting equally important ingredients of business processes such as data and resources. This research fills this gap by proposing a configurable process modelling notation incorporating features for capturing resources, data and physical objects involved in the performance of tasks. The proposal has been implemented in a toolset that assists analysts during the configuration phase and guarantees the correctness of the resulting process models. The approach has been validated by means of a case study from the film industry.
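
As a sketch of the configuration step, the code below derives an individual process model from a hypothetical configurable fragment by blocking tasks, dropping their resource and data requirements along with the arcs that target them. The film-production tasks are invented for illustration, and the correctness checks performed by the actual toolset are omitted.

```python
# Hypothetical configurable fragment: each task carries control-flow successors
# plus the resource and data object it uses, mirroring the extended notation.
configurable = {
    "shoot scene":    {"next": ["review footage"], "resource": "camera crew", "data": "shot list"},
    "review footage": {"next": ["archive", "reshoot"], "resource": "editor", "data": "daily rushes"},
    "reshoot":        {"next": ["review footage"], "resource": "camera crew", "data": "shot list"},
    "archive":        {"next": [], "resource": "archivist", "data": "master copy"},
}

def configure(model, blocked):
    """Derive an individual process model by blocking configurable tasks.

    Blocked tasks are dropped together with their resource and data
    requirements, and control-flow arcs pointing to them are removed.
    Correctness checks (e.g. that the result stays connected and sound)
    are outside this sketch.
    """
    return {task: {**spec, "next": [n for n in spec["next"] if n not in blocked]}
            for task, spec in model.items() if task not in blocked}

# Example configuration: a production variant without the reshoot option.
print(configure(configurable, blocked={"reshoot"}))
```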