871 results for value stream analysis
Abstract:
The increasing availability of mobility data, and the growing awareness of its importance and value, have motivated many researchers to develop models and tools for analyzing movement data. This paper presents a brief survey of significant research on the modeling, processing, and visualization of data about moving objects. We identify some key research fields that will provide better features for the online analysis of movement data. As a result of the literature review, we suggest a generic multi-layer architecture for the development of an online analysis processing software tool, which will be used to define the future work of our team.
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increasing requirements within shorter cycle times and under rising cost pressure, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process depends strongly on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness, which, in reality, can vary locally; for complexity reasons, however, a constant thickness value is almost always defined throughout the entire part. On the other hand, correct thickness consideration is a key enabler for precise fracture analysis within FEM. Thus, per-element thickness information, which does not exist explicitly in the FEM model, can significantly improve crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms, based on ray tracing and on nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, together with a thorough identification of the particular geometric arrangements under which their accuracy can be compared. These results expose each technique's weaknesses and hint towards a new, integrated approach that linearly combines the estimates produced by each algorithm.
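As an illustration of the first technique, the sketch below casts a ray from a midplane point along the element normal in both directions and takes the nearest surface hits as the local thickness. This is a minimal brute-force sketch, not the paper's implementation: the function names, the triangle-soup input, and the use of a Möller–Trumbore intersection routine are assumptions, and a real implementation would use a spatial index (as the paper's second, range-search-based algorithm suggests).

```python
import numpy as np

def ray_triangle_distance(origin, direction, tri):
    # Möller–Trumbore ray/triangle intersection; returns hit distance or None.
    v0, v1, v2 = tri
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < 1e-12:              # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv
    return t if t > 1e-9 else None

def estimate_thickness(point, normal, surface_triangles):
    """Thickness at a midplane point = sum of the nearest outer-surface
    hits found by casting rays along +normal and -normal."""
    total = 0.0
    for direction in (normal, -normal):
        hits = [d for tri in surface_triangles
                if (d := ray_triangle_distance(point, direction, tri)) is not None]
        if not hits:
            return None               # no surface found in this direction
        total += min(hits)
    return total

# Toy check: two parallel triangles 2 mm apart, midplane point in between.
tri_top = np.array([[0, 0, 1.0], [1, 0, 1.0], [0, 1, 1.0]])
tri_bot = np.array([[0, 0, -1.0], [1, 0, -1.0], [0, 1, -1.0]])
p = np.array([0.2, 0.2, 0.0])
n = np.array([0.0, 0.0, 1.0])
print(estimate_thickness(p, n, [tri_top, tri_bot]))  # -> 2.0
```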
Abstract:
The aim of this paper is to analyze the determining factors in the pricing of handsets sold with service plans, using the hedonic price method. A database was built comprising 48 handset models, under nine different service plans, over a period of 53 weeks in 2008, resulting in 27 different attributes and nearly 300,000 data records. The results suggest that the value of the monthly subscription and of the calling minutes is important in explaining handset prices. Both the physical volume of the handset and the number of megapixels of its camera also affected prices: the bigger the handset, the cheaper it becomes, and the more megapixels a camera phone has, the more expensive it becomes. Additionally, it was found that in 2008 Brazilian phone companies were subsidizing data-enabled handsets.
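A minimal sketch of a hedonic regression in this spirit is shown below: log price regressed on a handful of handset attributes. The attribute names and all figures are illustrative placeholders, not the paper's 27-attribute specification or data.

```python
import numpy as np

# Illustrative data: each row is (monthly_fee, calling_minutes,
# handset_volume_cm3, camera_megapixels) for one observed offer.
X = np.array([
    [30.0, 100, 80.0, 2.0],
    [50.0, 250, 95.0, 3.2],
    [80.0, 500, 70.0, 5.0],
    [40.0, 150, 110.0, 1.3],
    [60.0, 300, 85.0, 3.2],
])
price = np.array([199.0, 149.0, 99.0, 229.0, 129.0])

# Hedonic specification: log(price) regressed on the attributes.
A = np.column_stack([np.ones(len(X)), X])        # add intercept column
coef, *_ = np.linalg.lstsq(A, np.log(price), rcond=None)

# Each coefficient approximates the marginal percentage effect of the
# attribute on price (e.g. a negative sign on volume would match the
# finding "the bigger the handset, the cheaper it becomes").
print(dict(zip(["intercept", "monthly_fee", "minutes",
                "volume_cm3", "megapixels"], coef.round(4))))
```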
Abstract:
Value has been defined in different theoretical contexts as need, desire, interest, standards/criteria, beliefs, attitudes, and preferences. The creation of value is key to any business, and any business activity is about exchanging some tangible and/or intangible good or service and having its value accepted and rewarded by customers or clients, either inside or outside the enterprise or collaborative network. "Perhaps surprising then is that firms often do not know how to define value, or how to measure it" (Anderson and Narus, 1998, cited by [1]). Woodruff echoed that we need a "richer customer value theory" to provide an "important tool for locking onto the critical things that managers need to know", emphasizing that "we need customer value theory that delves deeply into customer's world of product use in their situations" [2]. In this sense, we proposed and validated a novel "Conceptual Model for Decomposing the Value for the Customer". We were aware that time has a direct impact on customer perceived value and that suppliers' and customers' perceptions change from the pre-purchase to the post-purchase phase, causing some uncertainty and doubt. We wanted to break value down into all its components, as well as every asset built and used (from both endogenous and exogenous perspectives). This component analysis was then transposed into a mathematical formulation using the Fuzzy Analytic Hierarchy Process (AHP), so that the uncertainty and vagueness of value perceptions could be embedded in a model that relates used and built assets in the tangible and intangible deliverables exchanged among the involved parties, with their actual value perceptions.
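A minimal sketch of the fuzzy AHP step is given below, assuming triangular fuzzy numbers, row-wise fuzzy geometric means, and centroid defuzzification. This is one common simplified variant, not the paper's full formulation, and the three "value components" are placeholders.

```python
import numpy as np

# Triangular fuzzy pairwise comparisons (l, m, u) for three value
# components (names illustrative, not the paper's taxonomy).
# fuzzy[i][j] ~ "how much more does component i contribute than j".
fuzzy = np.array([
    [[1, 1, 1],       [2, 3, 4],       [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1],   [1, 1, 1]],
])

# Fuzzy geometric mean of each row (component-wise over l, m, u),
# capturing the vagueness of the pairwise judgements.
geo = fuzzy.prod(axis=1) ** (1.0 / fuzzy.shape[0])

# Defuzzify each triangular number by its centroid (l + m + u) / 3,
# then normalize to obtain crisp priority weights.
crisp = geo.mean(axis=1)
weights = crisp / crisp.sum()
print(weights.round(3))   # relative importance of each value component
```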
Abstract:
Electricity markets are complex environments, involving a large number of different entities playing in a dynamic scene to obtain the best advantages and profits. MASCEM is a multi-agent electricity market simulator built to model market players and simulate their operation in the market. Market players are entities with specific characteristics and objectives, making their own decisions and interacting with other players. MASCEM provides several dynamic strategies for agents' behaviour. This paper presents a method that aims to provide market players with strategic bidding capabilities, allowing them to obtain the highest possible gains from the market. The method uses an auxiliary forecasting tool, e.g. an Artificial Neural Network, to predict electricity market prices, and analyses the patterns of its forecasting error. By recognizing the occurrence of such patterns, the method predicts the expected error of the next forecast and uses it to adapt the actual forecast. The goal is to bring the forecast closer to the real value, reducing the forecasting error.
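A minimal sketch of the adaptation idea follows, under strong simplifications: the "expected error" here is just a moving average of recent forecast errors, standing in for the paper's pattern-recognition step, and the ANN forecaster is replaced by a given series of forecasts.

```python
import numpy as np

def adapted_forecast(history_real, history_forecast, next_forecast, window=5):
    """Correct next_forecast using the mean of recent forecast errors.

    A stand-in for the paper's pattern-recognition step: the expected
    error is the moving average of the last `window` errors
    (forecast - real), which is then subtracted from the new forecast."""
    errors = np.asarray(history_forecast) - np.asarray(history_real)
    expected_error = errors[-window:].mean()
    return next_forecast - expected_error

# Illustrative market prices (EUR/MWh) and ANN forecasts.
real     = [41.0, 43.5, 40.2, 44.1, 42.8, 45.0]
forecast = [43.0, 45.0, 42.5, 46.0, 44.5, 47.2]
print(adapted_forecast(real, forecast, next_forecast=46.5))
# The persistent over-forecasting pattern pulls the estimate down
# toward the real values.
```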
Abstract:
This paper aims to study the relationships between the chromosomal DNA sequences of twenty species. We propose a methodology combining DNA-based word frequency histograms, correlation methods, and an MDS technique to visualize structural information underlying chromosomes (CRs) and species. Four statistical measures are tested (Minkowski, Cosine, Pearson product-moment, and Kendall τ rank correlations) to analyze the information content of 421 nuclear CRs from twenty species. The proposed methodology is built on mathematical tools and allows the analysis and visualization of very large amounts of stream data, such as DNA sequences, with almost no assumptions other than the predefined DNA "word length." It is able to produce comprehensible three-dimensional visualizations of CR clustering and related spatial and structural patterns. The results of the four test correlation scenarios show that the high-level information clusterings produced by the MDS tool are qualitatively similar, with small variations due to the characteristics of each correlation method, and that the clusterings are a consequence of the input data and not artifacts of the method.
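The pipeline can be sketched compactly: k-mer ("word") frequency histograms, a correlation-based distance matrix, and an MDS embedding into three dimensions. The sketch below uses cosine distance, one of the four tested measures, toy sequences standing in for chromosomes, and classical (Torgerson) MDS as an assumed variant where the abstract does not specify one.

```python
import numpy as np
from itertools import product

def kmer_histogram(seq, k=3):
    """Normalized frequency histogram of all DNA 'words' of length k."""
    words = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {w: i for i, w in enumerate(words)}
    h = np.zeros(len(words))
    for i in range(len(seq) - k + 1):
        w = seq[i:i + k]
        if w in index:
            h[index[w]] += 1
    return h / max(h.sum(), 1)

def classical_mds(D, dims=3):
    """Embed a distance matrix into `dims` dimensions (Torgerson MDS)."""
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J           # double-centred Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dims]
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0))

# Toy sequences standing in for chromosomes.
seqs = ["ACGTACGTAGCTAGCTACGAT", "ACGTACGAACGTTGCAACGTA", "TTTTGGGGCCCCAAAATTTTGG"]
H = np.array([kmer_histogram(s) for s in seqs])

# Cosine distance between word-frequency histograms.
norm = H / np.linalg.norm(H, axis=1, keepdims=True)
D = 1.0 - norm @ norm.T
print(classical_mds(D).round(3))   # 3-D points for visual clustering
```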
Abstract:
Purpose – The aim of this article is to present some results from research undertaken into the information behaviour of European Documentation Centre (EDC) users. It reflects on the practices of a group of 234 users of 55 EDCs, covering 21 Member States of the European Union (EU), who used the centres to access European information. Design/methodology/approach – In order to collect the data presented here, five questionnaires were sent to users in all the EDCs in Finland, Ireland, Hungary and Portugal. In the remaining EU countries, five questionnaires were sent to two EDCs chosen at random. The questionnaires were sent by post, following telephone contact with the EDC managers. Findings – Factors determining access to information on the European Union, and the frequency of this access, are identified. The information providers most commonly used to access European information, and the information sources considered the most reliable by respondents, are also analysed. Another area of analysis concerns the factors cited by respondents as facilitating access to information on Europe or, conversely, making it more difficult to access. In parallel, the aspects of accessing information on the EU that are valued most by users are assessed. Research limitations/implications – Questionnaires had to be used, as the intention was to cover a very extensive geographical area. However, in opting for closed questions, it is acknowledged that standard responses were obtained with no scope for capturing the individual circumstances of each respondent, thus making a qualitative approach difficult. Practical implications – The results provide an overall picture of certain aspects of the information behaviour of EDC users. They may serve as a starting point for planning training sessions designed to develop the skills required to search for, access, evaluate and apply European information within an academic context. From a broader perspective, they also constitute factors that the European Commission should take into consideration when formulating its information and communication policy. Originality/value – This is the first piece of academic research into the EDCs and their users that aimed to cover all Member States of the EU.
Abstract:
The kraft pulps produced from heartwood and sapwood of Eucalyptus globulus at 130 °C, 150 °C, and 170 °C were characterized by wet chemistry (total lignin as the sum of Klason and soluble lignin fractions) and by pyrolysis (total lignin denoted as py-lignin). The total lignin content obtained with both methods was similar. In the course of delignification, the py-lignin values were higher (by 2 to 5%) than the Klason values, which is in line with the importance of soluble lignin for total lignin determination. Pyrolysis analysis presents advantages over wet chemical procedures, and it can be applied to wood and pulps to determine lignin contents at different stages of the delignification process. The py-lignin values were used for kinetic modelling of delignification, with very high predictive value and results similar to those of modelling based on wet chemical determinations.
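As an illustration of the kinetic modelling step, the sketch below fits a pseudo-first-order delignification model to lignin contents over cooking time. The model form, the residual-lignin term, and all numbers are assumptions for illustration, not the paper's data or fitted parameters.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, L0, k, Lr):
    """Pseudo-first-order delignification: residual lignin decays
    exponentially toward a slowly-delignified fraction Lr."""
    return Lr + (L0 - Lr) * np.exp(-k * t)

# Illustrative cooking times (min) and py-lignin contents (% of pulp).
t = np.array([0, 30, 60, 90, 120, 150])
lignin = np.array([27.0, 18.5, 12.8, 9.4, 7.1, 5.9])

params, _ = curve_fit(first_order, t, lignin, p0=(27.0, 0.02, 3.0))
L0, k, Lr = params
print(f"L0={L0:.1f}%  k={k:.4f} 1/min  residual={Lr:.1f}%")
```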
Abstract:
27th Annual Conference of the European Cetacean Society. Setúbal, Portugal, 8-10 April 2013.
Abstract:
Environmental problems such as acid rain, eutrophication, and global warming are discussed daily, yet we rarely find any discussion of the phosphorus problem. Phosphorus has been a real problem over the years and deserves more attention. This thesis presents a global material flow analysis of phosphorus based on data from 2004. Phosphate rock production in that year was 18.9 million tonnes, and almost all of this amount was applied to the soil as fertilizer; plants can take up, on average, only 20% of the fertilizer input, the remainder being lost to the soil phosphorus pool. In the soil there is an equilibrium between the phosphorus available for plant uptake and the phosphorus associated with other compounds; this equilibrium depends on the kind of soil and is related to the soil pH. A reserve inventory was compiled: the reserve, i.e. the amount that is economically available, is 15,000 million tonnes, and the reserve base is estimated at 47,000 million tonnes. The major reserves are found in Morocco and Western Sahara, the United States, China, and South Africa. The reserve estimated in 2009 was 15,000 million tonnes of phosphate rock, or 1,963 million tonnes of P. If around 22 Mt/yr of phosphate is mined every year (phosphorus production in 2008, USGS 2009), and consumption increases each year with food demand, the phosphate rock reserves will be exhausted in about 90 years, or maybe even less. For the value/impact assessment, a qualitative analysis was carried out: if in the future no more phosphate rock is available to produce fertilizers, a drop in crop yields is expected, depending on the kind of soil, while the impact on human food and animal production is not expected to be a relevant problem. Phosphorus can be recovered from different waste streams, such as ploughing crop residues back into the soil, food processing plants and food retailers, human and animal excreta, meat and bone meal, manure fibre, and sewage sludge and wastewater. Some of these options are developed in the paper.
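The roughly 90-year horizon is consistent with dividing the phosphorus-content reserve by annual production, on the assumption that the 22 Mt/yr figure refers to phosphorus content rather than phosphate rock:

\[
\frac{1963\ \text{Mt P (reserve, 2009)}}{22\ \text{Mt P}\,\text{yr}^{-1}\ \text{(production, 2008)}} \approx 89\ \text{years}
\]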
Abstract:
Purpose - To compare image quality and effective dose when applying the 10 kVp rule in manual mode and in AEC mode in PA chest X-ray. Method - 68 images (with and without lesions) were acquired with an anthropomorphic chest phantom on a Wolverson Arcoma X-ray unit. These images were compared against a reference image using the two-alternative forced choice (2AFC) method. The effective dose (E) was calculated with PCXMC software using the exposure parameters and the DAP. The exposure index (lgM, provided by Agfa systems) was recorded. Results - Exposure time decreases more when applying the 10 kVp rule in manual mode (50%–28%) than in automatic mode (36%–23%). Statistical differences in E between several ionization-chamber combinations for AEC mode were found (p = 0.002); E is lower when using only the right AEC ionization chamber. Regarding image quality, there are no statistical differences (p = 0.348) between the different ionization-chamber combinations for AEC mode for images without lesions. The lgM values were higher when AEC mode was used than with manual mode, and the lgM values obtained with AEC mode increased as the kVp value went up. The image quality scores did not show statistically significant differences (p = 0.343) for the images with lesions when comparing manual with AEC mode. Conclusion - In general, E is lower when manual mode is used. Using only the right AEC ionization chamber, under the lung, yields the lowest E in comparison to the other chamber combinations. The use of the 10 kVp rule did not affect the visibility of the lesions or the image quality.
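For reference, the commonly cited form of the 10 kVp rule (a standard rule of thumb, stated here as an assumption rather than a detail taken from the paper) is to raise the tube voltage by 10 kVp while halving the mAs, keeping detector exposure roughly constant:

\[
\text{kVp} \rightarrow \text{kVp} + 10, \qquad \text{mAs} \rightarrow \tfrac{1}{2}\,\text{mAs}
\]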
Abstract:
The premise of this paper is that a model for communicating the national value system must start from a strategy aimed at identifying, cultivating, and communicating the values that give consistency to that system. The analysis concentrates on the elements of such strategies and on the implications of applying a value communication program for the identity architecture of the community. The paper also discusses the role of the national value system in the context of the emerging global culture, where the individual has the power to create his or her own hybrid cultural model.
Abstract:
On the basis of its electrochemical behaviour, a new flow-injection analysis (FIA) method with amperometric detection has been developed for quantification of the herbicide bentazone (BTZ) in estuarine waters. Standard solutions and samples (200 µL) were injected into a water carrier stream, and both pH and ionic strength were automatically adjusted inside the manifold. Optimization of critical FIA conditions indicated that the best analytical results were obtained at an oxidation potential of 1.10 V, pH 4.5, and an overall flow rate of 2.4 mL min⁻¹. Analysis of real samples was performed by means of calibration curves over the concentration range 2.5×10⁻⁶ to 5.0×10⁻⁵ mol L⁻¹, and the results were compared with those obtained by an independent method (HPLC). The accuracy of the amperometric determinations was ascertained; errors relative to the comparison method were below 4%, and sampling rates were approximately 100 samples h⁻¹. The repeatability of the proposed method, calculated as the relative standard deviation (%) of ten consecutive determinations of one sample, was 2.1%.
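The quantification step can be sketched as a simple linear calibration inverted for unknown samples. The current values below are invented for illustration, and a real calibration over this range may require weighting or a linearity check.

```python
import numpy as np

# Illustrative amperometric calibration: peak current (µA) measured for
# bentazone standards across the paper's working range (mol L^-1).
conc = np.array([2.5e-6, 5e-6, 1e-5, 2.5e-5, 5e-5])
current = np.array([0.21, 0.40, 0.83, 2.02, 4.05])

# Linear least-squares calibration line: i = slope * c + intercept.
slope, intercept = np.polyfit(conc, current, 1)

def concentration(sample_current):
    """Invert the calibration line to quantify an unknown sample."""
    return (sample_current - intercept) / slope

print(f"{concentration(1.5):.2e} mol/L")
```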
Abstract:
Master's in Auditing
Abstract:
In previous work we proposed a hybrid wired/wireless PROFIBUS solution in which the interconnection between the heterogeneous media is accomplished through bridge-like devices, with wireless stations able to move between different wireless cells. We had also proposed a worst-case timing analysis assuming that stations were stationary. In this paper we advance that work by proposing a worst-case timing analysis for the system's message streams that considers the effect of inter-cell mobility.