974 results for Aggregate Breakdown
Abstract:
2000 Mathematics Subject Classification: 62J12, 62F35
Abstract:
The article examines how planning, and in particular medium-term aggregate production planning, can be used to improve the efficiency of production processes and thereby of company operations as a whole, and how it can positively influence the competitiveness of production firms. First the structure of production planning and its different but interconnected levels are introduced, then aggregate planning is elaborated in more detail. The reason for focusing on aggregate planning is that, in our experience, its practical application is not yet widespread: it is the least applied of the production planning methods in Hungary, so a more thorough knowledge of the tool and wider adoption of its application could uncover significant reserves for improving operational efficiency. We are therefore convinced that presenting a real case study in this area can help managers realize that adopting it can significantly improve operational efficiency and represent an important source of development. We applied a classic aggregate planning model to a Hungarian manufacturing company, testing the applicability of the model and the effect of different planning scenarios on efficiency. The mathematical model was solved using Microsoft Excel Solver.
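As a concrete illustration of the kind of model the abstract refers to, the sketch below sets up a small aggregate production planning linear program in Python with PuLP; the demand figures, cost parameters, and capacity are hypothetical placeholders, not data from the case study, and PuLP simply stands in for the Excel Solver formulation mentioned above.

# Minimal aggregate production planning sketch (hypothetical data, not the
# authors' model): choose regular production, overtime and inventory per
# period to meet demand at minimum cost.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum

periods = range(6)                       # six planning months
demand = [110, 140, 170, 160, 130, 120]  # hypothetical aggregate demand
c_prod, c_inv, c_over = 10.0, 2.0, 15.0  # unit production / holding / overtime costs
capacity, start_inv = 150, 20            # regular capacity per period, opening stock

prob = LpProblem("aggregate_plan", LpMinimize)
prod = LpVariable.dicts("prod", periods, lowBound=0)
over = LpVariable.dicts("overtime", periods, lowBound=0)
inv = LpVariable.dicts("inventory", periods, lowBound=0)

# Objective: total production, overtime and holding cost over the horizon
prob += lpSum(c_prod * prod[t] + c_over * over[t] + c_inv * inv[t] for t in periods)

for t in periods:
    prev = start_inv if t == 0 else inv[t - 1]
    prob += prev + prod[t] + over[t] - demand[t] == inv[t]   # inventory balance
    prob += prod[t] <= capacity                              # regular capacity limit

prob.solve()
plan = {t: (prod[t].varValue, over[t].varValue, inv[t].varValue) for t in periods}
print(plan)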
Abstract:
The starting point of this paper is that, in the evaluation of investment projects, it is necessary to take into account simultaneously the capital tied up in the project and the tie-up time as income-generating potential. The paper defines the aggregate capital need of a project and constructs the corresponding index. The aggregate capital need is a new business economics category that provides a new approach to evaluating investment projects: it is the amount of capital required to operate the project over its full duration. Three factors determine it: the initial investment, the payback period (or the duration), and the speed of capital recovery. To quantify it, the capital tied up in each year, that is, the part of the capital not yet recovered by that year, is determined, and these amounts are summed to obtain the aggregate capital need. The unit of measurement is one unit of capital tied up for one year. Besides deriving the main relationships as models, the paper presents a rich set of examples. The analysis widens knowledge of the content of net present value and highlights the importance of the aggregate capital need for both the net present value and the internal rate of return.
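Restated compactly, the definition in the abstract amounts to the following (the symbols are introduced here for illustration and are not necessarily the paper's own notation):

\[
K_t = \max\!\Bigl(0,\; I_0 - \sum_{\tau=1}^{t} R_\tau\Bigr), \qquad A = \sum_{t=1}^{n} K_t ,
\]

where $I_0$ is the initial investment, $R_\tau$ the capital recovered in year $\tau$, $K_t$ the capital still tied up in year $t$, $n$ the project duration in years, and $A$ the aggregate capital need, measured in units of capital tied up for one year (capital-years).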
Abstract:
The necessity of elemental analysis techniques to solve forensic problems continues to expand as the samples collected from crime scenes grow in complexity. Laser ablation ICP-MS (LA-ICP-MS) has been shown to provide a high degree of discrimination between samples that originate from different sources. In the first part of this research, two laser ablation ICP-MS systems were compared, one using a nanosecond laser and the other a femtosecond laser source, for the forensic analysis of glass. The results showed that femtosecond LA-ICP-MS did not provide significant improvements in terms of accuracy, precision, and discrimination; however, femtosecond LA-ICP-MS did provide lower detection limits. In addition, it was determined that even for femtosecond LA-ICP-MS an internal standard should be utilized to obtain accurate analytical results for glass analyses. In the second part, a method using laser-induced breakdown spectroscopy (LIBS) for the forensic analysis of glass was shown to provide excellent discrimination for a glass set consisting of 41 automotive fragments. The discrimination power was compared to two of the leading elemental analysis techniques, μXRF and LA-ICP-MS, and the results were similar; all methods generated >99% discrimination, and the pairs found indistinguishable were similar. An extensive data analysis approach for LIBS glass analyses was developed to minimize Type I and Type II errors, leading to a recommendation of 10 ratios to be used for glass comparisons. Finally, a LA-ICP-MS method for the qualitative analysis and discrimination of gel ink sources was developed and tested on a set of ink samples. In the first discrimination study, qualitative analysis was used to obtain 95.6% discrimination in a blind study consisting of 45 black gel ink samples provided by the United States Secret Service. A 0.4% false exclusion (Type I) error rate and a 3.9% false inclusion (Type II) error rate were obtained for this discrimination study. In the second discrimination study, 99% discrimination power was achieved for a black gel ink pen set consisting of 24 self-collected samples. The two pairs found to be indistinguishable came from the same source of origin (the same manufacturer and type of pen purchased in different locations). It was also found that gel ink from the same pen, regardless of age, was indistinguishable, as were gel ink pens (four pens) originating from the same pack.
Abstract:
Glass is a common form of trace evidence found at many crime scenes in the form of small fragments. These glass fragments can transfer to surrounding objects and/or persons and may provide forensic investigators valuable information to link a suspect to the scene of a crime. Since the elemental composition of different glass sources can be very similar, a highly discriminating technique is required to distinguish between fragments that have originated from different sources. The research presented here demonstrates that Laser Induced Breakdown Spectroscopy (LIBS) is a viable analytical technique for the association and discrimination of glass fragments. The first part of this research describes the optimization of the LIBS experiments, including the use of different laser wavelengths to investigate laser-material interaction. The use of a 266 nm excitation laser provided the best analytical figures of merit with minimal damage to the sample; the resulting figures of merit are presented. The second part of this research evaluated the sensitivity of LIBS in associating or discriminating float glass samples originating from the same manufacturing plants and produced at approximately the same time. Two different sample sets were analyzed, ranging in manufacturing dates from days to years apart. Eighteen (18) atomic emission lines, corresponding to the elements Sr, K, Fe, Ca, Al, Ba, Na, Mg and Ti, were chosen because they were detected above the method detection limits and presented differences between the samples. The ten elemental ratios producing the most discrimination were selected for each set. When all the ratios were combined in a comparison, 99% of the possible pairs were discriminated using the optimized LIBS method, with typical analytical precisions of ~5% RSD. The final study consisted of the development of a new approach for the use of LIBS for quantitative, ultra-low-volume solution analysis using aerosols and microdrops. LIBS was demonstrated to be an effective technique for the analysis of volumes as low as 90 pL for microdrop LIBS, with a 1 pg absolute LOD, and 20 µL for aerosol LIBS, with an absolute LOD of ~100 fg.
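To make the ratio-based comparison concrete, the sketch below shows one plausible way to compare two fragments on their replicate elemental ratios; the interval-overlap match criterion, the +/- 3 standard deviation width, and the data layout are illustrative assumptions, not the exact protocol used in the study.

import numpy as np

def ratio_stats(replicates):
    """Mean and standard deviation of one intensity ratio across replicate
    LIBS measurements; `replicates` is an (n, 2) array of (numerator,
    denominator) emission-line intensities."""
    r = replicates[:, 0] / replicates[:, 1]
    return r.mean(), r.std(ddof=1)

def indistinguishable(fragment_a, fragment_b, k=3.0):
    """Call two fragments indistinguishable if their mean +/- k*std intervals
    overlap for every ratio.  `fragment_a` and `fragment_b` are lists holding
    one replicate array per elemental ratio (hypothetical criterion)."""
    for reps_a, reps_b in zip(fragment_a, fragment_b):
        m1, s1 = ratio_stats(reps_a)
        m2, s2 = ratio_stats(reps_b)
        if m1 + k * s1 < m2 - k * s2 or m2 + k * s2 < m1 - k * s1:
            return False  # intervals disjoint for this ratio: discriminated
    return True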
Abstract:
The elemental analysis of soil is useful in forensic and environmental sciences. Methods were developed and optimized for two laser-based multi-element analysis techniques: laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and laser-induced breakdown spectroscopy (LIBS). This work represents the first use of a 266 nm laser for forensic soil analysis by LIBS. Sample preparation methods were developed and optimized for a variety of sample types, including pellets for large bulk soil specimens (470 mg) and sediment-laden filters (47 mg), and tape-mounting for small transfer evidence specimens (10 mg). Analytical performance for sediment filter pellets and tape-mounted soils was similar to that achieved with bulk pellets. An inter-laboratory comparison exercise was designed to evaluate the performance of the LA-ICP-MS and LIBS methods, as well as micro X-ray fluorescence (μXRF), across multiple laboratories. Limits of detection (LODs) were 0.01-23 ppm for LA-ICP-MS, 0.25-574 ppm for LIBS, and 16-4400 ppm for μXRF, well below the levels normally seen in soils. Good intra-laboratory precision (≤ 6 % relative standard deviation (RSD) for LA-ICP-MS; ≤ 8 % for μXRF; ≤ 17 % for LIBS) and inter-laboratory precision (≤ 19 % for LA-ICP-MS; ≤ 25 % for μXRF) were achieved for most elements, which is encouraging for a first inter-laboratory exercise. While LIBS generally has higher LODs and RSDs than LA-ICP-MS, both were capable of generating good-quality multi-element data sufficient for discrimination purposes. Multivariate methods using principal components analysis (PCA) and linear discriminant analysis (LDA) were developed for the discrimination of soils from different sources. Specimens from different sites that were indistinguishable by color alone were discriminated by elemental analysis. Correct classification rates of 94.5 % or better were achieved in a simulated forensic discrimination of three similar sites for both LIBS and LA-ICP-MS. Results for tape-mounted specimens were nearly identical to those achieved with pellets. Methods were tested on soils from the USA, Canada, and Tanzania. Within-site heterogeneity was site-specific. Elemental differences were greatest for specimens separated by large distances, even within the same lithology. Elemental profiles can be used to discriminate soils from different locations and to narrow down locations even when mineralogy is similar.
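For readers unfamiliar with the multivariate workflow mentioned above, the sketch below shows a generic PCA-plus-LDA discrimination pipeline in Python with scikit-learn; the feature matrix, the three site labels, and the number of retained components are placeholders, not the study's data or settings.

# Generic PCA + LDA discrimination sketch (placeholder data, not the study's)
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))                        # 60 specimens x 12 element signals
y = np.repeat(["site_A", "site_B", "site_C"], 20)    # three hypothetical sites

clf = make_pipeline(StandardScaler(), PCA(n_components=5), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)            # correct-classification rate per fold
print(scores.mean())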
Abstract:
Variable Speed Limit (VSL) strategies identify and disseminate dynamic speed limits that are determined to be appropriate based on prevailing traffic, road surface, and weather conditions. This dissertation develops and evaluates a shockwave-based VSL system that uses a heuristic switching logic-based controller with specified thresholds of prevailing traffic flow conditions. The system aims to improve operations and mobility at critical bottlenecks. Before traffic breakdown occurs, the proposed VSL aims to prevent or postpone breakdown by decreasing the inflow and achieving a uniform distribution of speed and flow. After breakdown occurs, the VSL system aims to dampen traffic congestion by reducing the inflow to the congested area and increasing the bottleneck capacity by deactivating the VSL at the head of the congested area. The shockwave-based VSL system pushes the VSL location upstream as the congested area propagates upstream. In addition to testing the system using infrastructure detector-based data, this dissertation investigates the use of Connected Vehicle trajectory data as input to the shockwave-based VSL system. Since field Connected Vehicle data are not available, Vehicle-to-Infrastructure communication is modeled in the microscopic simulation as part of this research to obtain individual vehicle trajectories. In this system, a wavelet transform is used to analyze aggregated individual-vehicle speed data to determine the locations of congestion. The currently recommended calibration procedures for simulation models are generally based on capacity, volume, and system-performance values and do not specifically examine traffic breakdown characteristics. However, since the proposed VSL strategies are countermeasures to the impacts of breakdown conditions, considering breakdown characteristics in the calibration procedure is important for a reliable assessment. Several enhancements were proposed in this study to account for the breakdown characteristics at bottleneck locations in the calibration process. In this dissertation, the performance of the shockwave-based VSL is compared to VSL systems with different fixed VSL message sign locations using the calibrated microscopic model. The results show that the shockwave-based VSL outperforms fixed-location VSL systems and can considerably decrease the maximum back of queue and the duration of breakdown while increasing the average speed during breakdown.
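As an illustration of what a heuristic, threshold-based switching logic of this kind can look like, the sketch below implements a toy single-station controller in Python; the occupancy and speed thresholds, the speed-limit levels, and the single-detector structure are invented placeholders and do not reproduce the dissertation's calibrated logic or its shockwave-tracking component.

# Toy threshold-based VSL switching logic (illustrative values only)
def vsl_speed_limit(occupancy_pct, mean_speed_mph,
                    base_limit=70, reduced_limit=50, congested_limit=40,
                    occ_warn=15.0, occ_breakdown=25.0, speed_breakdown=45.0):
    """Return the speed limit to display upstream of a bottleneck."""
    if occupancy_pct >= occ_breakdown or mean_speed_mph <= speed_breakdown:
        return congested_limit        # breakdown detected: strongest reduction
    if occupancy_pct >= occ_warn:
        return reduced_limit          # flow approaching capacity: pre-emptive reduction
    return base_limit                 # free flow: keep the static limit

# Example: a detector reporting 18% occupancy at 58 mph triggers the reduced limit.
print(vsl_speed_limit(18.0, 58.0))    # -> 50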
Abstract:
A criterion is suggested for discrimination between ferromanganese oxide minerals deposited after the introduction of manganese and associated elements into sea water in solution during submarine vulcanism, and minerals that are slowly formed from dilute solution, largely of continental origin. The simultaneous injection of thorium into the ocean by submarine vulcanism is indicated, and its differentiation from continental thorium introduced into the ocean by runoff is discussed.
Abstract:
Demand-side growth accounting studies the contributions of the components of aggregate demand to Gross Domestic Product (GDP). Traditionally, international and national organizations use the traditional method for calculating such contributions. However, this method does not take into account the induction of imports by the various components of aggregate demand. As alternatives, other studies consider this effect: the alternative method proposed by Lara (2013), the attribution method proposed by Kranendonk and Verbruggen (2005) and Hoekstra and van der Helm (2010), and the Sraffian supermultiplier method of Freitas and Dweck (2013). A summary of these methods is presented, demonstrating the similarities and differences between them. In addition, to contribute to the study of the subject, the "method of distribution of imports" was developed, which aims to distribute imports across the various components of aggregate demand using information from input-output matrices and supply and use tables. Contributions to the growth of macroeconomic aggregates in Brazil from 2001 to 2009 were computed using the distribution method and compared with the traditional method, explaining the reasons for the differences in contributions. All the methods presented in this work were then compared with respect to the calculated growth contributions of the components of aggregate demand and of the domestic and external sectors. It was found that the methods existing in the literature are not sufficient to deal with this question and, given the alternatives for growth contributions presented throughout this work, it is believed that the distribution method provides the best estimates of contributions by aggregate demand sector. In particular, the main advantage of this method over the others is the breakdown of the contribution of imports by aggregate demand component, which allows the contribution of each component to GDP growth to be analyzed. This type of analysis thus helps in studying the pattern of growth of the Brazilian economy, not only from a theoretical point of view but also empirically, and provides a basis for economic policy decisions.
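The arithmetic behind distributing imports across demand components can be illustrated with a toy calculation; the import-content shares and the expenditure figures below are invented for illustration and are not estimates derived from Brazilian input-output matrices, nor the exact formulas of the methods cited above.

# Toy demand-side contribution accounting with imports distributed by component
components = ["consumption", "investment", "government", "exports"]
prev = {"consumption": 600.0, "investment": 180.0, "government": 200.0, "exports": 120.0}
curr = {"consumption": 630.0, "investment": 200.0, "government": 205.0, "exports": 130.0}
import_share = {"consumption": 0.12, "investment": 0.25, "government": 0.05, "exports": 0.10}

def gdp(levels):
    # GDP = total final demand minus the imports embodied in each component
    return sum(levels[c] * (1.0 - import_share[c]) for c in components)

gdp_prev, gdp_curr = gdp(prev), gdp(curr)

# Contribution of each component: its change, net of its distributed import
# content, relative to last period's GDP (in percentage points)
contributions = {
    c: 100.0 * (curr[c] - prev[c]) * (1.0 - import_share[c]) / gdp_prev
    for c in components
}

print(contributions)
print(sum(contributions.values()))               # sums to ...
print(100.0 * (gdp_curr - gdp_prev) / gdp_prev)  # ... the GDP growth rate (constant shares assumed)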
Abstract:
A study of the growth of the electrical discharge in low-pressure gases, with theoretical treatments, experimental results, and references to the principal studies on the subject. Contents: 1 Introduction; 2 Breakdown in Low-Pressure Gases; 2.1 The Townsend Criterion and Paschen's Law; 2.2 Breakdown Voltage in Uniform Fields; 2.3 Deviations from Paschen's Law; 2.4 Breakdown in Non-Uniform Fields; 2.5 Time Lags for Breakdown; 2.6 Breakdown in Vacuum; 2.7 Intermittent Post-Breakdown Discharges; 3 Breakdown in Geiger Counters; 3.1 Geiger Counters; 3.2 Proportional Counting; 3.3 Non-Proportional Counting; 3.4 Propagation of the Discharge; A Subsequent Developments
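For reference, the Paschen's law mentioned in Section 2.1 has the standard textbook form (written here in conventional notation, which may differ from the notation used in the thesis):

\[
V_B \;=\; \frac{B\,p\,d}{\ln(A\,p\,d) \;-\; \ln\!\left[\ln\!\left(1 + \tfrac{1}{\gamma_{se}}\right)\right]},
\]

where $p$ is the gas pressure, $d$ the gap distance, $A$ and $B$ gas-dependent constants, and $\gamma_{se}$ the secondary electron emission coefficient of the cathode; the breakdown voltage $V_B$ depends on the product $pd$, which gives the characteristic Paschen curve with a minimum.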
Abstract:
Email exchange in 2013 between Kathryn Maxson (Duke) and Kris Wetterstrand (NHGRI), regarding country funding and other data for the HGP sequencing centers. Also includes the email request for such information, from NHGRI to the centers, in 2000, and the aggregate data collected.
Abstract:
The analysis of white latex paint is a problem for forensic laboratories because of the difficulty of differentiating between samples. Current methods provide limited information that is not suitable for discrimination. Elemental analysis of white latex paints has achieved 99% discriminating power when using LA-ICP-MS; however, mass spectrometers can be prohibitively expensive and require a skilled operator. A quick, inexpensive, effective method is needed for the differentiation of white latex paints. In this study, LIBS is used to analyze 24 white latex paint samples. LIBS is fast, easy to operate, and has a low cost. Results show that 98.1% of the variation can be accounted for via principal component analysis, while Tukey pairwise comparisons differentiated 95.6% with potassium as the elemental ratio, showing that the discrimination capabilities of LIBS are comparable to those of LA-ICP-MS. Given the many advantages of LIBS, this instrument should be considered a necessity for forensic laboratories.
Abstract:
People recommenders are a widespread feature of social networking sites and educational social learning platforms alike. However, when these systems are used to extend learners’ Personal Learning Networks, they often fall short of providing recommendations of learning value to their users. This paper proposes a design of a people recommender based on content-based user profiles, and a matching method based on dissimilarity therein. It presents the results of an experiment conducted with curators of the content curation site Scoop.it!, where curators rated personalized recommendations for contacts. The study showed that matching dissimilarity of interpretations of shared interests is more successful in providing positive experiences of breakdown for the curator than is matching on similarity. The main conclusion of this paper is that people recommenders should aim to trigger constructive experiences of breakdown for their users, as the prospect and potential of such experiences encourage learners to connect to their recommended peers.
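As a sketch of what dissimilarity-based matching over content-based profiles can look like, the snippet below ranks candidate contacts by cosine dissimilarity of TF-IDF interest profiles; the TF-IDF representation, the cosine measure, and the toy curator profiles are assumptions chosen for illustration, not the paper's exact profile model or matching method.

# Rank peers by dissimilarity of content-based interest profiles (illustrative)
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

profiles = {
    "curator_a": "open badges learning analytics MOOC assessment",
    "curator_b": "learning analytics educational data mining dashboards",
    "curator_c": "open badges credentialing micro-credentials recognition",
}

names = list(profiles)
X = TfidfVectorizer().fit_transform(profiles[n] for n in names)
sim = cosine_similarity(X)

def recommend(target, k=1):
    """Recommend the k peers whose profiles are MOST dissimilar to the target
    (highest 1 - cosine similarity), rather than the most similar ones."""
    i = names.index(target)
    ranked = sorted((1.0 - sim[i, j], names[j]) for j in range(len(names)) if j != i)
    return [name for _, name in reversed(ranked)][:k]

print(recommend("curator_a"))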