873 results for Product cost model


Relevance:

30.00%

Publisher:

Abstract:

Neutron multiplicities for several targets and spallation products of proton-induced reactions in thin targets of interest to accelerator-driven systems, obtained with the CRISP code, have been reported. This code is a Monte Carlo calculation that simulates the intranuclear cascade and the evaporation/fission competition processes. Results are compared with experimental data, and the agreement can be considered quite satisfactory over a very broad energy range of incident particles and different targets.

Relevance:

30.00%

Publisher:

Abstract:

The Sznajd model (SM) has been employed with success in recent years to describe opinion propagation in a community. In particular, it has been claimed that its transient is able to reproduce some scaling properties observed in data from proportional elections in different countries if the community structure (the network) is scale-free. In this work, we investigate the properties of the transient of a particular version of the SM introduced by Bernardes and co-authors in 2002. We studied the behavior of the model on networks of different topologies through the time evolution of an order parameter known as the interface density, and concluded that regular lattices with high dimensionality also lead to a power-law distribution of the number of candidates with v votes. We also show that the particular absorbing state reached in the stationary regime (that is, the winning candidate) is related to a specific feature of the model that may not be realistic in all situations.
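As an illustration of the kind of dynamics studied here, the sketch below simulates a Sznajd-type convincing rule on a scale-free (Barabási-Albert) network and tracks the interface density used above as the order parameter. It is not the authors' code: the update rule is a common network formulation of the model, and the network size, number of candidates and run length are arbitrary assumptions.

```python
# Minimal sketch of Sznajd-type opinion dynamics on a scale-free network.
# The convincing rule (an agreeing pair imposes its opinion on all of its
# neighbours) is a common network formulation; parameters are illustrative.
import random
import networkx as nx

def sznajd_step(graph, opinion):
    """Pick a random node and a random neighbour; if they agree,
    both convince all of their neighbours."""
    i = random.choice(list(graph.nodes))
    neighbours_i = list(graph.neighbors(i))
    if not neighbours_i:
        return
    j = random.choice(neighbours_i)
    if opinion[i] == opinion[j]:
        for k in set(graph.neighbors(i)) | set(graph.neighbors(j)):
            opinion[k] = opinion[i]

def interface_density(graph, opinion):
    """Fraction of edges whose endpoints hold different opinions."""
    edges = list(graph.edges)
    disagree = sum(1 for u, v in edges if opinion[u] != opinion[v])
    return disagree / len(edges)

if __name__ == "__main__":
    n_agents, n_candidates = 1000, 50                # illustrative sizes
    g = nx.barabasi_albert_graph(n_agents, m=3)      # scale-free community
    state = {node: random.randrange(n_candidates) for node in g.nodes}
    for t in range(20000):
        sznajd_step(g, state)
        if t % 5000 == 0:
            print(t, round(interface_density(g, state), 4))
```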

Relevance:

30.00%

Publisher:

Abstract:

We consider the three-particle scattering S-matrix for the Landau-Lifshitz model by directly computing the set of Feynman diagrams up to the second order. We show, following the analogous computations for the non-linear Schrödinger model [1, 2], that the three-particle S-matrix is factorizable in the first non-trivial order.
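For reference, factorizability means that the full three-particle S-matrix reduces to a product of two-particle S-matrices. The schematic relation below states this standard property of factorized scattering and is not a formula quoted from the paper.

```latex
% Schematic factorized-scattering property (momentum labels are illustrative):
% the three-particle S-matrix decomposes into two-particle S-matrices.
S^{(3)}(p_1, p_2, p_3) = S^{(2)}(p_1, p_2)\, S^{(2)}(p_1, p_3)\, S^{(2)}(p_2, p_3)
```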

Relevance:

30.00%

Publisher:

Abstract:

We consider the two-dimensional version of a drainage network model introduced in Gangopadhyay, Roy and Sarkar (2004), and show that the appropriately rescaled family of its paths converges in distribution to the Brownian web. We do so by verifying the convergence criteria proposed in Fontes, Isopi, Newman and Ravishankar (2002).

Relevance:

30.00%

Publisher:

Abstract:

Hajnal and Juhász proved that under CH there is a hereditarily separable, hereditarily normal topological group without non-trivial convergent sequences that is countably compact and not Lindelöf. The example constructed is a topological subgroup H ⊆ 2^ω₁ that is an HFD with the following property (P): the projection of H onto every partial product 2^I, for I ∈ [ω₁]^ω, is onto. Any such group has the necessary properties. We prove that if κ is a cardinal of uncountable cofinality, then in the model obtained by forcing over a model of CH with the measure algebra on 2^κ, there is an HFD topological group in 2^ω₁ which has property (P).

Relevance:

30.00%

Publisher:

Abstract:

Parkinson's disease (PD) is a degenerative illness whose cardinal symptoms include rigidity, tremor, and slowness of movement. In addition to its widely recognized effects, PD can have a profound effect on speech and voice. The speech symptoms most commonly demonstrated by patients with PD are reduced vocal loudness, monopitch, disruptions of voice quality, and an abnormally fast rate of speech. This cluster of speech symptoms is often termed hypokinetic dysarthria. The disease can be difficult to diagnose accurately, especially in its early stages; for this reason, automatic techniques based on artificial intelligence should increase diagnostic accuracy and help doctors make better decisions. The aim of this thesis work is to predict PD based on audio files collected from various patients. The audio files are preprocessed in order to obtain the features. The preprocessed data contain 23 attributes and 195 instances. On average there are six voice recordings per person; by using a data compression technique such as the Discrete Cosine Transform (DCT), the number of instances can be reduced. After data compression, attribute selection is done using several methods built into WEKA, such as ChiSquared, GainRatio and InfoGain. After identifying the important attributes, we evaluate the attributes one by one using stepwise regression. Based on the selected attributes, we process the data in WEKA using a cost-sensitive classifier with various algorithms such as MultiPass LVQ, Logistic Model Tree (LMT) and K-Star. The classified results show about 80% accuracy on average; using these features, approximately 95% classification accuracy for PD is achieved. This shows that, using the audio dataset, PD can be predicted with a high level of accuracy.
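The attribute-ranking and cost-sensitive classification steps described above can be sketched with open-source analogues of the WEKA tools. The snippet below uses scikit-learn in place of WEKA's ChiSquared/InfoGain ranking and cost-sensitive classifier (the DCT compression step is omitted); the file name, column layout and parameter choices are assumptions made for illustration, not the thesis setup.

```python
# Sketch of the described pipeline using scikit-learn analogues of the WEKA
# steps: information-based attribute ranking followed by a classifier with
# class weighting as a crude cost-sensitive stand-in. File name, columns and
# parameters are hypothetical.
import pandas as pd
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

data = pd.read_csv("parkinsons_voice_features.csv")   # hypothetical file
X = data.drop(columns=["status"])                      # voice feature columns
y = data["status"]                                     # 1 = PD, 0 = healthy

pipeline = make_pipeline(
    StandardScaler(),
    SelectKBest(mutual_info_classif, k=10),            # keep the 10 most informative attributes
    LogisticRegression(class_weight="balanced", max_iter=1000),
)

scores = cross_val_score(pipeline, X, y, cv=5)
print("mean cross-validated accuracy: %.2f" % scores.mean())
```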

Relevance:

30.00%

Publisher:

Abstract:

The cost of a road construction over its service life is a function of design, quality of construction, as well as maintenance strategies and operations. An optimal life-cycle cost for a road requires evaluations of the above-mentioned components. Unfortunately, road designers often neglect a very important aspect, namely the possibility to perform future maintenance activities. Focus is mainly directed towards other aspects such as investment costs, traffic safety, aesthetic appearance, regional development and environmental effects. This doctoral thesis presents the results of a research project aimed at increasing consideration of road maintenance aspects in the planning and design process. The following subgoals were established: to identify the obstacles that prevent adequate consideration of future maintenance during the road planning and design process, and to examine optimisation of life-cycle costs as an approach towards increased efficiency during the road planning and design process.

The research project started with a literature review aimed at evaluating the extent to which maintenance aspects are considered during road planning and design, and the potential this offers for improving maintenance efficiency. Efforts made by road authorities to increase efficiency, especially maintenance efficiency, were evaluated. The results indicated that all the evaluated efforts had one thing in common, namely ignorance of the interrelationship between geometrical road design and maintenance as an effective tool to increase maintenance efficiency. Focus has mainly been on improving operating practices and maintenance procedures. This fact might also explain why some efforts to increase maintenance efficiency have been less successful. An investigation was conducted to identify the problems and difficulties which obstruct due consideration of maintainability during the road planning and design process. A method called "Change Analysis" was used to analyse data collected during interviews with experts in road design and maintenance. The study indicated a complex combination of problems which result in inadequate consideration of maintenance aspects when planning and designing roads. The identified problems were classified into six categories: insufficient consulting, insufficient knowledge, regulations and specifications without consideration of maintenance aspects, insufficient planning and design activities, inadequate organisation, and demands from other authorities. Several urgent needs for changes to eliminate these problems were identified.

One of the problems identified in the above-mentioned study as an obstacle to due consideration of maintenance aspects during road design was the absence of a model for calculating life-cycle costs for roads. Because of this lack of knowledge, the research project focused on implementing a new approach for calculating and analysing life-cycle costs for roads with emphasis on the relationship between road design and road maintainability. Road barriers were chosen as an example. The ambition is to develop this approach to cover other road components at a later stage. A study was conducted to quantify repair rates for barriers and associated repair costs as one of the major maintenance costs for road barriers. A method called "Case Study Research Method" was used to analyse the effect of several factors on barrier repair costs, such as barrier type, road type, posted speed and seasonal effects. The analyses were based on documented data associated with 1625 repairs conducted in four different geographical regions in Sweden during 2006. A model for calculating average repair costs per vehicle kilometre was created. Significant differences in the barrier repair costs were found between the studied barrier types. In another study, the injuries associated with road barrier collisions and the corresponding influencing factors were analysed. The analyses in this study were based on documented data from actual barrier collisions between 2005 and 2008 in Sweden. The results were used to calculate the cost of injuries associated with barrier collisions as a part of the socio-economic cost of road barriers. The results showed significant differences in the number of injuries associated with collisions with different barrier types. To calculate and analyse life-cycle costs for road barriers, a new approach was developed based on a method called "Activity-based Life-cycle Costing". By modelling uncertainties, the presented approach makes it possible to identify and analyse factors crucial for optimising life-cycle costs.

The study showed a great potential to increase road maintenance efficiency through road design. It also showed that road components with low investment costs might not be the best choice when maintenance and socio-economic aspects are included. The difficulties and problems faced during the collection of data for calculating life-cycle costs for road barriers indicated a great need to improve current data collection and archiving procedures. The research focused on Swedish road planning and design. However, the conclusions can be applied to other Nordic countries, where weather conditions and road design practices are similar. The general methodological approaches used in this research project may also be applied to other studies.
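To illustrate the activity-based, uncertainty-aware life-cycle costing described above, the sketch below runs a small Monte Carlo comparison of two hypothetical barrier types, combining an investment cost with discounted, uncertain repair costs. All figures, distributions and barrier names are invented for the example and are not taken from the thesis.

```python
# Illustrative Monte Carlo sketch of a life-cycle cost comparison for road
# barriers: present value of investment plus uncertain annual repair costs.
# All numbers and distributions are made-up assumptions.
import numpy as np

rng = np.random.default_rng(0)

def barrier_lcc(investment, repair_rate_mean, repair_cost_mean,
                traffic_mveh_km_per_year, years=40, discount=0.04, n_sim=10_000):
    """Present value of investment + uncertain repair costs over the service life."""
    totals = np.full(n_sim, investment, dtype=float)
    for year in range(1, years + 1):
        # repairs per million vehicle-kilometres and cost per repair, both uncertain
        rate = rng.gamma(shape=4.0, scale=repair_rate_mean / 4.0, size=n_sim)
        cost = rng.lognormal(mean=np.log(repair_cost_mean), sigma=0.3, size=n_sim)
        annual = rate * traffic_mveh_km_per_year * cost
        totals += annual / (1.0 + discount) ** year
    return totals

cable = barrier_lcc(investment=250_000, repair_rate_mean=1.2,
                    repair_cost_mean=900, traffic_mveh_km_per_year=50)
w_beam = barrier_lcc(investment=400_000, repair_rate_mean=0.6,
                     repair_cost_mean=1_500, traffic_mveh_km_per_year=50)
print("cable  LCC: mean %.0f, 90th pct %.0f" % (cable.mean(), np.percentile(cable, 90)))
print("w-beam LCC: mean %.0f, 90th pct %.0f" % (w_beam.mean(), np.percentile(w_beam, 90)))
```

The point of the uncertainty modelling is visible in the output: a barrier with the lowest investment cost can still have the highest simulated life-cycle cost once repair and socio-economic costs are included.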

Relevance:

30.00%

Publisher:

Abstract:

To identify the relevant product markets for Swedish pharmaceuticals, a spatial econometrics approach is employed. First, we calculate Moran's I for different market definitions, and then we use a spatial Durbin model to determine the effect of price changes on the quantity sold of own and competing products. As expected, the results show that competition is strongest between close substitutes; however, the relevant product markets for Swedish pharmaceuticals extend beyond close substitutes down to products included in the same class at the four-digit level of the Anatomic Therapeutic Chemical system as defined by the World Health Organization. The spatial regression model further indicates that increases in the price of a product significantly lower the quantity sold of that product and at the same time increase the quantity sold of competing products. For close substitutes (products belonging to the same class at the seven-digit level of the Anatomic Therapeutic Chemical system), as well as for products that, without being close substitutes, belong to the same therapeutic/pharmacological/chemical subgroup (the same class at the five-digit level of the Anatomic Therapeutic Chemical system), a significant change towards increased competition is also visible after 1 July 2009, when the latest policy changes with regard to pharmaceuticals were implemented in Sweden.
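As a minimal illustration of the first step, the sketch below computes a global Moran's I for product-level price changes under a substitutability weight matrix (for example, 1 when two products share an ATC class and 0 otherwise). The data and weights are toy values, not the study's dataset.

```python
# Global Moran's I for values x under a spatial (here: substitutability)
# weight matrix w. Products and price changes are invented toy data.
import numpy as np

def morans_i(x, w):
    """Global Moran's I: (n / S0) * sum_ij w_ij z_i z_j / sum_i z_i^2."""
    x = np.asarray(x, dtype=float)
    z = x - x.mean()
    w = np.asarray(w, dtype=float)
    num = (w * np.outer(z, z)).sum()
    den = (z ** 2).sum()
    return (len(x) / w.sum()) * (num / den)

# toy example: four products, the first three in the same ATC-4 class
w = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 0],
              [0, 0, 0, 0]])
price_change = [0.10, 0.08, 0.12, -0.05]
print("Moran's I:", round(morans_i(price_change, w), 3))
```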

Relevance:

30.00%

Publisher:

Abstract:

Product verification has become a cost-intensive and time-consuming aspect of modern electronics production, and with the onset of ever-increasing miniaturisation, these aspects will become even more cumbersome. One may also point out that certain precision assembly, such as within the biomedical sector, is legally bound to have zero defects in production. Since miniaturisation and precision assembly will soon become a part of almost any product, the verification phases of assembly need to be optimised in both functionality and cost. Another aspect relates to the stability and robustness of processes, a pre-requisite for flexibility. Furthermore, as the re-engineering cycle becomes ever more important, all information gathered within the ongoing process becomes vital. In view of these points, product or process verification may be assumed to be an important and integral part of precision assembly. In this paper, product verification is defined as the process of determining whether or not the products, at a given phase in the life-cycle, fulfil the established specifications. Since the product is given its final form and function in the assembly, product verification normally takes place somewhere in the assembly line, which is the focus of this paper.

Relevance:

30.00%

Publisher:

Abstract:

Companies implement a modular product assortment as a part of their strategy to, among other things, shorten lead times, increase product quality and create more product variants with fewer parts. However, the increased number of variants becomes a challenging task for the personnel responsible for product verification. By implementing verification at module level, so-called MPV (Module Property Verification), several advantages ensue. The advantages are not only a decrease in the cost of verification, but also a decrease in repair times, occupied space, spare-part storage and repair tools. Further, MPV also gives increased product quality due to an increased understanding of which defects may occur. As an approach to implementing MPV, this paper discusses defects and verification processes based on a study at a Swedish company. It also describes a matrix which is used to map relations between company-specific cost drivers and so-called verification factors. The matrix may indicate cost drivers which have a large impact on the total cost of product verification.
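A toy version of such a matrix is sketched below: rows are company-specific cost drivers, columns are verification factors, and multiplying the relation matrix by assumed factor costs ranks the drivers by their impact on total verification cost. The drivers, factors and weights are invented for illustration and are not those of the paper.

```python
# Toy cost-driver / verification-factor matrix: entries express how strongly
# each driver is loaded by each factor. All names and numbers are invented.
import numpy as np

cost_drivers = ["test equipment", "operator time", "rework", "scrap"]
verification_factors = ["number of variants", "test coverage", "defect rate"]

relation = np.array([
    [0.6, 0.8, 0.1],   # test equipment
    [0.4, 0.7, 0.3],   # operator time
    [0.1, 0.2, 0.9],   # rework
    [0.0, 0.1, 0.8],   # scrap
])

# assumed annual cost attributable to each verification factor (kSEK)
factor_cost = np.array([300, 500, 200])

driver_impact = relation @ factor_cost
for driver, impact in zip(cost_drivers, driver_impact):
    print(f"{driver:15s} {impact:7.0f} kSEK")
```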

Relevance:

30.00%

Publisher:

Abstract:

The specification of Quality of Service (QoS) constraints over software design requires measures that ensure such requirements are met by the delivered product. Achieving this goal is non-trivial, as it involves, at least, identifying how QoS constraint specifications should be checked at runtime. In this paper we present an implementation of a Model Driven Architecture (MDA) based framework for the runtime monitoring of QoS properties. We incorporate the UML2 superstructure and the UML profile for Quality of Service to provide abstract descriptions of component-and-connector systems. We then define transformations that refine the UML2 models to conform to the Distributed Management Task Force (DMTF) Common Information Model (CIM) (Distributed Management Task Force Inc. 2006), a schema standard for the management and instrumentation of hardware and software. Finally, we provide a mapping of the CIM metamodel to a .NET-based metamodel for the implementation of the monitoring infrastructure, utilising various .NET features including the Windows Management Instrumentation (WMI) interface.

Relevance:

30.00%

Publisher:

Abstract:

Agent-oriented software engineering and software product lines are two promising software engineering techniques. Recent research work has been exploring their integration, namely multi-agent systems product lines (MAS-PLs), to promote reuse and variability management in the context of complex software systems. However, current product derivation approaches do not provide specific mechanisms to deal with MAS-PLs. This is essential because they typically encompass several concerns (e.g., trust, coordination, transaction, state persistence) that are constructed on the basis of heterogeneous technologies (e.g., object-oriented frameworks and platforms). In this paper, we propose the use of multi-level models to support the configuration knowledge specification and automatic product derivation of MAS-PLs. Our approach provides an agent-specific architecture model that uses abstractions and instantiation rules that are relevant to this application domain. In order to evaluate the feasibility and effectiveness of the proposed approach, we have implemented it as an extension of an existing product derivation tool, called GenArch. The approach has also been evaluated through the automatic instantiation of two MAS-PLs, demonstrating its potential and benefits to product derivation and configuration knowledge specification.

Relevance:

30.00%

Publisher:

Abstract:

Millions of unconscious calculations are made daily by pedestrians walking through the Colby College campus. I used ArcGIS to make a predictive spatial model that chose paths similar to those actually used by people on a regular basis. To make a viable model of how most travelers choose their way, I considered both the distance required and the type of traveling surface. I used an iterative process to develop a scheme for weighting travel costs, which allowed accurate least-cost paths to be predicted by ArcMap. The accuracy was confirmed when the calculated routes were compared to satellite photography and found to overlap the well-worn "shortcuts" taken between the paved paths throughout campus.
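The same least-cost-path idea can be sketched outside ArcGIS. The snippet below builds a travel-cost raster from surface-type weights and routes through it with scikit-image's minimum-cost-path utility; the weights and the grid are invented, and the original analysis was done with ArcGIS/ArcMap cost surfaces rather than this code.

```python
# Sketch of a least-cost path over a travel-cost raster derived from surface
# types. Weights and grid are illustrative; they only show how distance and
# surface type combine into a single cost surface.
import numpy as np
from skimage.graph import route_through_array

# travel-cost weights per surface type (illustrative): pavement cheapest,
# grass costlier, planting beds effectively off-limits
weights = {"pavement": 1.0, "grass": 1.8, "beds": 50.0}

surface = np.array([
    ["pavement", "grass", "grass", "grass", "pavement"],
    ["pavement", "grass", "beds",  "grass", "pavement"],
    ["pavement", "grass", "beds",  "grass", "pavement"],
    ["pavement", "grass", "grass", "grass", "pavement"],
])
cost = np.vectorize(weights.get)(surface).astype(float)

# least-cost route from the top-left corner to the bottom-right corner
path, total_cost = route_through_array(cost, (0, 0), (3, 4),
                                       fully_connected=True, geometric=True)
print("cells on the route:", path)
print("accumulated cost  : %.2f" % total_cost)
```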

Relevance:

30.00%

Publisher:

Abstract:

Climate model projections show that climate change will further increase the risk of flooding in many regions of the world. There is a need for climate adaptation, but building new infrastructure or additional retention basins has its limits, especially in densely populated areas where open spaces are limited. Another solution is the more efficient use of the existing infrastructure. This research investigates a method for real-time flood control by means of existing gated weirs and retention basins. The method was tested for the specific study area of the Demer basin in Belgium but is generally applicable. Today, retention basins along the Demer River are controlled by means of adjustable gated weirs based on fixed logic rules. However, because of the high complexity of the system, only suboptimal results are achieved by these rules. By making use of precipitation forecasts and combined hydrological-hydraulic river models, the state of the river network can be predicted. To speed up the calculations, a conceptual river model was used. The conceptual model was combined with a Model Predictive Control (MPC) algorithm and a Genetic Algorithm (GA). The MPC algorithm predicts the state of the river network depending on the positions of the adjustable weirs in the basin. The GA generates these positions in a semi-random way. Cost functions based on water levels were introduced to evaluate the efficiency of each generation with respect to flood damage minimization. In the final phase of this research, the influence of the most important MPC and GA parameters was investigated by means of a sensitivity study. The results show that the MPC-GA algorithm manages to reduce the total flood volume during the historical event of September 1998 by 46% in comparison with the current regulation. Based on the MPC-GA results, some recommendations could be formulated to improve the logic rules.
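A toy sketch of the GA component is shown below: candidate gate settings are generated semi-randomly, a stand-in for the conceptual river model predicts water levels over the horizon, and a level-based cost function selects the best candidates for the next generation. The "river model", thresholds and all numbers are invented and far simpler than the combined hydrological-hydraulic models used in the study.

```python
# Toy GA over gate openings with a water-level-based cost, in the spirit of
# the MPC-GA scheme described above. All models and numbers are invented.
import numpy as np

rng = np.random.default_rng(1)
N_GATES, HORIZON, FLOOD_LEVEL = 3, 6, 2.0            # gates, steps, threshold [m]
inflow = np.array([1.0, 1.5, 2.5, 3.0, 2.0, 1.2])    # assumed inflow forecast

def predict_levels(gate_openings):
    """Stand-in for the conceptual river model: larger openings release more water."""
    release = 0.8 * gate_openings.mean(axis=0)        # mean opening per time step
    return 1.0 + np.cumsum(inflow - release)          # crude storage balance

def cost(levels):
    """Penalise predicted water levels above the flood threshold."""
    return np.sum(np.maximum(levels - FLOOD_LEVEL, 0.0) ** 2)

population = rng.random((40, N_GATES, HORIZON))       # gate openings in [0, 1]
for generation in range(30):
    costs = np.array([cost(predict_levels(ind)) for ind in population])
    parents = population[np.argsort(costs)[:10]]       # elitist selection
    children = parents[rng.integers(0, 10, size=30)]    # clone the best, then mutate
    children = np.clip(children + rng.normal(0.0, 0.05, children.shape), 0.0, 1.0)
    population = np.concatenate([parents, children])

best = min(population, key=lambda ind: cost(predict_levels(ind)))
print("best predicted peak level: %.2f m" % predict_levels(best).max())
```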

Relevance:

30.00%

Publisher:

Abstract:

Canada releases over 150 billion litres of untreated and undertreated wastewater into the water environment every year [1]. To clean up urban wastewater, new federal Wastewater Systems Effluent Regulations (WSER), establishing national baseline effluent quality standards achievable through secondary wastewater treatment, were enacted on July 18, 2012. With respect to wastewater from combined sewer overflows (CSOs), the Regulations require municipalities to report the annual quantity and frequency of effluent discharges. The City of Toronto currently has about 300 CSO locations within an area of approximately 16,550 hectares. The total sewer length of the CSO area is about 3,450 km and the number of sewer manholes is about 51,100. System-wide monitoring of all CSO locations has never been undertaken because of cost and practicality. Instead, the City has relied on estimation methods and modelling approaches in the past to allow funds that would otherwise be used for monitoring to be applied to reducing the impacts of the CSOs. To fulfil the WSER requirements, the City is now undertaking a study in which GIS-based hydrologic and hydraulic modelling is the approach. Results show the usefulness of this approach for 1) determining the flows contributing to the combined sewer system in the local and trunk sewers under dry weather flow, wet weather flow and snowmelt conditions; 2) assessing the hydraulic grade line and surface water depth in all local and trunk sewers under heavy rain events; 3) analysing local and trunk sewer capacities for future growth; and 4) reporting the annual quantity and frequency of CSOs as per the requirements of the new Regulations. This modelling approach has also allowed funds to be applied toward reducing and ultimately eliminating the adverse impacts of CSOs rather than expending resources on unnecessary and costly monitoring.
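The reporting step required by the WSER can be illustrated with a short sketch: given a simulated overflow flow series for one CSO location, compute the annual discharge volume and the number of distinct overflow events. The time step and flow values below are illustrative and not City of Toronto data.

```python
# Annual CSO quantity and frequency from a simulated overflow series.
# Time step and flows are illustrative assumptions.
import numpy as np

DT_SECONDS = 15 * 60          # assumed 15-minute model output interval
overflow_m3s = np.array([0.0, 0.0, 0.4, 1.2, 0.8, 0.0, 0.0, 0.0, 0.3, 0.5, 0.0, 0.0])

def annual_quantity_and_frequency(flow, dt):
    """Total overflow volume [m3] and number of distinct overflow events."""
    volume = float(np.sum(flow) * dt)
    active = flow > 0.0
    # an event starts whenever the overflow switches from inactive to active
    starts = np.sum(active[1:] & ~active[:-1]) + int(active[0])
    return volume, int(starts)

volume, events = annual_quantity_and_frequency(overflow_m3s, DT_SECONDS)
print("overflow volume: %.0f m3, events: %d" % (volume, events))
```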