Abstract:
The main objective of this Master’s thesis is to develop a cost allocation model for a leading food industry company in Finland. The goal is to devise an allocation method for the fixed overhead expenses incurred in a specific production unit and to create a workable tracking system for product costs. The second objective is to construct the allocation model so that it can also be adapted to other units. Costs, activities, drivers and appropriate allocation methods are studied. The thesis begins with a literature review of existing activity-based costing (ABC) theory, followed by an inspection of cost information and interviews with company officials to establish the requirements for the model to be constructed. Familiarization with the company started with its existing cost accounting methods. The main proposals for a new allocation model emerged from the interviews and were used in setting targets for developing the new allocation method. As a result of this thesis, an Excel-based model is created from the theoretical and empirical data. The new system is able to handle overhead costs in more detail, improving cost awareness and transparency in cost allocations and refining products’ cost structures. The improved cost awareness is achieved by selecting the cost drivers best suited to this situation. Capacity changes are also taken into consideration; in particular, the use of practical or normal capacity instead of theoretical capacity is recommended. Recommendations for further development are also made concerning capacity handling and cost collection.
Abstract:
Research on molecular mechanisms of carcinogenesis plays an important role in diagnosing and treating gastric cancer. Metabolic profiling may offer the opportunity to understand the molecular mechanism of carcinogenesis and help to non-invasively identify the potential biomarkers for the early diagnosis of human gastric cancer. The aims of this study were to explore the underlying metabolic mechanisms of gastric cancer and to identify biomarkers associated with morbidity. Gas chromatography/mass spectrometry (GC/MS) was used to analyze the serum metabolites of 30 Chinese gastric cancer patients and 30 healthy controls. Diagnostic models for gastric cancer were constructed using orthogonal partial least squares discriminant analysis (OPLS-DA). Acquired metabolomic data were analyzed by the nonparametric Wilcoxon test to find serum metabolic biomarkers for gastric cancer. The OPLS-DA model showed adequate discrimination between cancer and non-cancer cohorts while the model failed to discriminate different pathological stages (I-IV) of gastric cancer patients. A total of 44 endogenous metabolites such as amino acids, organic acids, carbohydrates, fatty acids, and steroids were detected, of which 18 differential metabolites were identified with significant differences. A total of 13 variables were obtained for their greatest contribution in the discriminating OPLS-DA model [variable importance in the projection (VIP) value >1.0], among which 11 metabolites were identified using both VIP values (VIP >1) and the Wilcoxon test. These metabolites potentially revealed perturbations of glycolysis and of amino acid, fatty acid, cholesterol, and nucleotide metabolism of gastric cancer patients. These results suggest that gastric cancer serum metabolic profiling has great potential in detecting this disease and helping to understand its metabolic mechanisms.
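The dual biomarker-selection criterion described above (VIP > 1 from the OPLS-DA model combined with a nonparametric Wilcoxon rank-sum test) can be sketched as follows. This is a minimal illustration on synthetic data: the cohort sizes and metabolite count mirror the study, but the intensity values, and the VIP scores (which in practice would come from the fitted OPLS-DA model), are placeholders.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(0)

# Toy data standing in for the study's GC/MS peak table:
# 30 cancer vs 30 control serum samples, 44 detected metabolites.
cancer = rng.normal(1.0, 0.3, size=(30, 44))
control = rng.normal(1.2, 0.3, size=(30, 44))
# VIP scores would come from the fitted OPLS-DA model; random placeholders here.
vip = rng.uniform(0.5, 2.0, size=44)

# A metabolite is a candidate biomarker only if it passes BOTH filters,
# mirroring the dual criterion (VIP > 1 and Wilcoxon test) in the study.
selected = []
for j in range(44):
    _, p = ranksums(cancer[:, j], control[:, j])
    if vip[j] > 1.0 and p < 0.05:
        selected.append(j)

print(f"{len(selected)} candidate biomarkers pass VIP > 1 and p < 0.05")
```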
Abstract:
In order to determine the variability of pequi tree (Caryocar brasiliense Camb.) populations, volatile compounds from fruits of eighteen trees representing five populations were extracted by headspace solid-phase microextraction and analyzed by gas chromatography-mass spectrometry. Seventy-seven compounds were identified, including esters, hydrocarbons, terpenoids, ketones, lactones, and alcohols. Several compounds had not been previously reported in the pequi fruit. The amount of total volatile compounds and the individual compound contents varied between plants. The volatile profile enabled the differentiation of all of the eighteen plants, indicating that there is a characteristic profile in terms of their origin. The use of Principal Component Analysis and Cluster Analysis enabled the establishment of markers (dendrolasin, ethyl octanoate, ethyl 2-octenoate and β-cis-ocimene) that discriminated among the pequi trees. According to the Cluster Analysis, the plants were classified into three main clusters, and four other plants showed a tendency to isolation. The results from multivariate analysis did not always group plants from the same population together, indicating that there is greater variability within the populations than between pequi tree populations.
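The multivariate workflow described above (Principal Component Analysis followed by a Cluster Analysis cut into three main clusters) can be sketched as follows; the data matrix is a random stand-in for the real 18-plant × 77-compound volatile profiles.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Toy matrix standing in for the study's data: 18 plants x 77 volatile compounds.
X = rng.normal(size=(18, 77))

# Principal Component Analysis: project plants onto the leading components
# to look for compounds that discriminate among them.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)

# Cluster Analysis (Ward linkage), cut into 3 clusters as in the study's
# three main clusters of plants.
Z = linkage(X, method="ward")
clusters = fcluster(Z, t=3, criterion="maxclust")
print(clusters)
```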
Abstract:
Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground but still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model’s qualities. The mean-variance framework, issues related to asset allocation decisions and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tests whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: the in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period for testing the tactical portfolio against the strategic benchmark. The results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting the allocation toward riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation remains highly dependent on the quality of the input estimates.
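The Black–Litterman posterior combines equilibrium (benchmark) returns with the manager's views; in a TAA setting like the one above, a VAR-style forecast would supply the view vector. A minimal sketch of the standard B–L posterior-mean formula, with illustrative numbers rather than the thesis's fixed income data:

```python
import numpy as np

# Textbook Black-Litterman posterior expected returns:
# mu = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1 [(tau*Sigma)^-1 pi + P' Omega^-1 q]
Sigma = np.array([[0.04, 0.01], [0.01, 0.02]])   # asset covariance matrix
pi = np.array([0.05, 0.03])                      # equilibrium (benchmark) returns
tau = 0.05                                       # prior uncertainty scaling
P = np.array([[1.0, -1.0]])                      # view: asset 1 outperforms asset 2
q = np.array([0.02])                             # ... by 2% (e.g. a VAR forecast)
Omega = np.array([[0.0001]])                     # uncertainty of the view

A = np.linalg.inv(tau * Sigma)                   # precision of the prior
B = P.T @ np.linalg.inv(Omega) @ P               # precision added by the view
mu_bl = np.linalg.solve(A + B, A @ pi + P.T @ np.linalg.inv(Omega) @ q)
print(mu_bl)  # posterior returns, tilted toward the stated view
```

The posterior mean then feeds a mean-variance optimizer in place of raw historical estimates, which is what keeps the resulting weight shifts controlled.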
Abstract:
This paper focuses on one of the methods for bandwidth allocation in an ATM network: the convolution approach. The convolution approach permits an accurate study of the system load in statistical terms through accumulated calculations, since probabilistic results for the bandwidth allocation can be obtained. Nevertheless, the convolution approach has a high cost in terms of calculation and storage requirements. This makes real-time calculation difficult, so many authors do not consider the approach. With the aim of reducing this cost, we propose using the multinomial distribution function: the enhanced convolution approach (ECA). This permits direct computation of the associated probabilities of the instantaneous bandwidth requirements and makes a simple deconvolution process possible. The ECA is applied to connection acceptance control, and some results are presented.
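The accumulated calculation at the heart of the convolution approach can be sketched as follows: the probability mass function of the aggregate bandwidth demand is built up by convolving per-source demand distributions one at a time, after which the overflow probability for a given link capacity falls out directly. The on/off traffic mix below is illustrative, not taken from the paper.

```python
import numpy as np

# Each on/off source demands 0 units with prob (1-p) or b units with prob p.
def source_pmf(p, b):
    pmf = np.zeros(b + 1)
    pmf[0] = 1 - p
    pmf[b] = p
    return pmf

# Illustrative traffic mix: 10 small sources and 4 larger ones.
sources = [(0.3, 2)] * 10 + [(0.1, 5)] * 4

# Accumulate the aggregate-demand distribution one convolution at a time.
agg = np.array([1.0])
for p, b in sources:
    agg = np.convolve(agg, source_pmf(p, b))

capacity = 20
p_overflow = agg[capacity + 1:].sum()   # P(instantaneous demand > capacity)
print(f"P(demand > {capacity}) = {p_overflow:.4f}")
```

For N identical sources the aggregate reduces to a binomial (and for several source classes, a multinomial) distribution, which is the observation the ECA exploits to sidestep the full convolution and its storage cost.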
Abstract:
Summary:
1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. Because the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus on how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed.
2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If energy intake is sufficient, an animal allocates the energy obtained in the order maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold, below which all energy is allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically.
3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of the background mortality rate.
4. If parameter values cannot be obtained directly, they may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or allometric equations.
5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shellfisheries, to assess the environmental impacts of building proposals including wind farms and highways, and to assess the effects of agricultural pest-control chemicals on nontarget organisms.
Keywords: bioenergetics; energy budget; individual-based models; population dynamics.
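The priority-ordered allocation rule in point 2 above can be sketched as a small Python function; all names and thresholds are illustrative, not taken from the paper.

```python
def allocate_energy(intake, needs, reserves, reserve_target, reserve_critical):
    """Allocate an animal's energy intake by the priority order described
    above: maintenance, then growth, then reproduction, with any surplus
    stored until reserves reach an optimal level. Below a critical reserve
    threshold, everything goes to maintenance."""
    if reserves < reserve_critical:
        return {"maintenance": intake, "growth": 0.0,
                "reproduction": 0.0, "storage": 0.0}
    allocation = {}
    remaining = intake
    for process in ("maintenance", "growth", "reproduction"):
        spent = min(remaining, needs[process])
        allocation[process] = spent
        remaining -= spent
    # Surplus is stored, but only until reserves reach their optimal level.
    allocation["storage"] = min(remaining, max(0.0, reserve_target - reserves))
    return allocation

demo = allocate_energy(10.0,
                       {"maintenance": 4, "growth": 3, "reproduction": 2},
                       reserves=1.0, reserve_target=5.0, reserve_critical=0.5)
print(demo)
```

In a full ABM, `intake` and the `needs` terms would themselves scale with body mass and temperature, as the summary's point 2 prescribes.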
Abstract:
Scene classification based on latent Dirichlet allocation (LDA) builds on the more general bag-of-visual-words modeling approach, in which the construction of a visual vocabulary is a crucial quantization step that determines the success of the classification. A framework is developed with the following new aspects: Gaussian mixture clustering for the quantization process; the use of an integrated visual vocabulary (IVV), built as the union of all centroids obtained from the separate quantization of each class; and the use of several features, including the edge orientation histogram, CIELab color moments, and the gray-level co-occurrence matrix (GLCM). The experiments are conducted on IKONOS images with six semantic classes (tree, grassland, residential, commercial/industrial, road, and water). The results show that the use of an IVV increases the overall accuracy (OA) by 11 to 12% when implemented on the selected features and by 6% when implemented on all features. The selected combination of CIELab color moments and GLCM provides a better OA than either CIELab color moments or GLCM individually, each of which increases the OA by only ∼2 to 3%. Moreover, the OA of LDA outperforms that of C4.5 and the naive Bayes tree by ∼20%. © 2014 Society of Photo-Optical Instrumentation Engineers (SPIE) [DOI: 10.1117/1.JRS.8.083690]
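The construction of the integrated visual vocabulary described above (separate Gaussian mixture quantization per class, then a union of the resulting centroids) can be sketched as follows, using random stand-ins for the per-class image descriptors.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)

# Toy per-class descriptor sets standing in for real features such as
# CIELab color moments and GLCM statistics (here 8-dimensional).
classes = {name: rng.normal(loc=i, size=(200, 8))
           for i, name in enumerate(["tree", "water", "road"])}

# Quantize each class separately with Gaussian mixture clustering, then take
# the union of all class centroids as the integrated visual vocabulary (IVV).
vocabulary = []
for feats in classes.values():
    gmm = GaussianMixture(n_components=4, random_state=0).fit(feats)
    vocabulary.append(gmm.means_)
ivv = np.vstack(vocabulary)
print(ivv.shape)  # 3 classes x 4 centroids each, in 8 dimensions
```

Descriptors are then assigned to their nearest IVV word to form the bag-of-visual-words counts that LDA consumes.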
Abstract:
The study of decaying organisms and death assemblages is referred to as forensic taphonomy, or more simply the study of graves. The field is dominated by entomology, anthropology and archaeology, but forensic taphonomy also includes the study of the ecology and chemistry of the burial environment. Studies in forensic taphonomy often require the use of analogues for human cadavers or their component parts, such as animal cadavers or skeletal muscle tissue. However, maintaining a sufficient supply of cadavers or analogues may require periodic freezing of test material prior to experimental inhumation in the soil. This study was carried out to ascertain the effect of freezing on skeletal muscle tissue prior to inhumation and decomposition in a soil environment under controlled laboratory conditions; changes in soil chemistry were also measured. To test the impact of freezing, skeletal muscle tissue (Sus scrofa) was either frozen (−20 °C) or refrigerated (4 °C). Portions of skeletal muscle tissue (∼1.5 g) were interred in microcosms (72 mm diameter × 120 mm height) containing sieved (2 mm) soil (sand) adjusted to 50% water-holding capacity. The experiment had three treatments: controls with no skeletal muscle tissue, microcosms containing frozen skeletal muscle tissue, and microcosms containing refrigerated tissue. The microcosms were destructively harvested 2, 4, 6, 8, 12, 16, 23, 30 and 37 days after interment of the skeletal muscle tissue, with six replicates per treatment at each harvest. Microbial activity (carbon dioxide respiration) was monitored throughout the experiment. At harvest the skeletal muscle tissue was removed and the detritosphere soil was sampled for chemical analysis. Freezing was found to have no significant impact on decomposition or soil chemistry compared with unfrozen samples.
However, the interment of skeletal muscle tissue had a significant impact on the microbial activity (carbon dioxide respiration) and chemistry of the surrounding soil, including pH, electrical conductivity, ammonium, nitrate, phosphate and potassium. This is the first laboratory-controlled study to measure changes in soil inorganic chemistry associated with the decomposition of skeletal muscle tissue in combination with microbial activity.
Abstract:
A Universal Serial Bus (USB) Mass Storage Device (MSD), often termed a USB flash drive, is ubiquitously used to store important information, typically in unencrypted binary format. This low-cost consumer device is incredibly popular due to its size, large storage capacity and relatively high transfer speed. However, if the device is lost or stolen, an unauthorized person can easily retrieve all the information. It is therefore advantageous in many applications to provide security protection so that only authorized users can access the stored information. To provide such protection for a USB MSD, this paper proposes a session key agreement protocol that runs after secure user authentication. The main aim of the protocol is to negotiate a session key under which all information retrieved from, stored on and transferred to the USB MSD is encrypted. Unlike other protocols in the literature, the proposed protocol is not only efficient but also resists forgery and password-guessing attacks. The paper analyses the security of the proposed protocol through a formal analysis, which proves that information is stored confidentially and protected with strong resilience to the relevant security attacks. The computational and communication costs of the proposed scheme are analyzed and compared with related work, showing that the scheme achieves an improved tradeoff between computational cost, communication cost and security.
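As a generic illustration of the idea (not the paper's actual protocol, which is not reproduced here), a per-session encryption key can be derived after authentication by binding a shared secret to fresh nonces from both parties; a real key agreement additionally requires mutual authentication and freshness verification on both sides. All names below are illustrative.

```python
import hashlib
import hmac
import os

def derive_session_key(shared_secret: bytes, host_nonce: bytes,
                       device_nonce: bytes) -> bytes:
    """Derive a 256-bit session key by binding the post-authentication
    shared secret to both parties' fresh nonces, so every session
    negotiates a different key. Illustrative sketch only."""
    msg = b"usb-msd-session" + host_nonce + device_nonce
    return hmac.new(shared_secret, msg, hashlib.sha256).digest()

secret = os.urandom(32)   # stands in for the secret from user authentication
k1 = derive_session_key(secret, os.urandom(16), os.urandom(16))
k2 = derive_session_key(secret, os.urandom(16), os.urandom(16))
print(k1 != k2)  # fresh nonces yield a fresh key per session
```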
Abstract:
Background Underweight and severe and morbid obesity are associated with highly elevated risks of adverse health outcomes. We estimated trends in mean body-mass index (BMI), which characterises its population distribution, and in the prevalences of a complete set of BMI categories for adults in all countries. Methods We analysed, with use of a consistent protocol, population-based studies that had measured height and weight in adults aged 18 years and older. We applied a Bayesian hierarchical model to these data to estimate trends from 1975 to 2014 in mean BMI and in the prevalences of BMI categories (<18·5 kg/m2 [underweight], 18·5 kg/m2 to <20 kg/m2, 20 kg/m2 to <25 kg/m2, 25 kg/m2 to <30 kg/m2, 30 kg/m2 to <35 kg/m2, 35 kg/m2 to <40 kg/m2, ≥40 kg/m2 [morbid obesity]), by sex in 200 countries and territories, organised in 21 regions. We calculated the posterior probability of meeting the target of halting by 2025 the rise in obesity at its 2010 levels, if post-2000 trends continue. Findings We used 1698 population-based data sources, with more than 19·2 million adult participants (9·9 million men and 9·3 million women) in 186 of 200 countries for which estimates were made. Global age-standardised mean BMI increased from 21·7 kg/m2 (95% credible interval 21·3–22·1) in 1975 to 24·2 kg/m2 (24·0–24·4) in 2014 in men, and from 22·1 kg/m2 (21·7–22·5) in 1975 to 24·4 kg/m2 (24·2–24·6) in 2014 in women. Regional mean BMIs in 2014 for men ranged from 21·4 kg/m2 in central Africa and south Asia to 29·2 kg/m2 (28·6–29·8) in Polynesia and Micronesia; for women the range was from 21·8 kg/m2 (21·4–22·3) in south Asia to 32·2 kg/m2 (31·5–32·8) in Polynesia and Micronesia. Over these four decades, age-standardised global prevalence of underweight decreased from 13·8% (10·5–17·4) to 8·8% (7·4–10·3) in men and from 14·6% (11·6–17·9) to 9·7% (8·3–11·1) in women. South Asia had the highest prevalence of underweight in 2014, 23·4% (17·8–29·2) in men and 24·0% (18·9–29·3) in women. 
Age-standardised prevalence of obesity increased from 3·2% (2·4–4·1) in 1975 to 10·8% (9·7–12·0) in 2014 in men, and from 6·4% (5·1–7·8) to 14·9% (13·6–16·1) in women. 2·3% (2·0–2·7) of the world's men and 5·0% (4·4–5·6) of women were severely obese (ie, have BMI ≥35 kg/m2). Globally, prevalence of morbid obesity was 0·64% (0·46–0·86) in men and 1·6% (1·3–1·9) in women. Interpretation If post-2000 trends continue, the probability of meeting the global obesity target is virtually zero. Rather, if these trends continue, by 2025, global obesity prevalence will reach 18% in men and surpass 21% in women; severe obesity will surpass 6% in men and 9% in women. Nonetheless, underweight remains prevalent in the world's poorest regions, especially in south Asia.
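The complete set of BMI categories listed in the Methods above can be expressed as a small classification helper; the names in parentheses are conventional labels added for readability, not wording from the abstract.

```python
def bmi_category(bmi_kg_m2: float) -> str:
    """Classify a BMI value (kg/m^2) into the study's categories."""
    cutoffs = [
        (18.5, "<18.5 (underweight)"),
        (20.0, "18.5 to <20"),
        (25.0, "20 to <25"),
        (30.0, "25 to <30 (overweight)"),
        (35.0, "30 to <35 (obese)"),
        (40.0, "35 to <40 (severely obese)"),
    ]
    for upper, label in cutoffs:
        if bmi_kg_m2 < upper:
            return label
    return ">=40 (morbidly obese)"

print(bmi_category(17.0))   # <18.5 (underweight)
print(bmi_category(36.5))   # 35 to <40 (severely obese)
```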
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The deconvolution of the voltammograms of polypyrrole electrochemistry has proved to be possible through the electrochemical quartz crystal microbalance data using the F(dm/dQ) function. This deconvolution allows the evolution of the thickness of the polypyrrole films during their redox processes to be estimated and therefore, the mechanical contraction/decontraction of this polymer as a function of the ionic exchange processes can be evaluated. (c) 2005 Elsevier B.V. All rights reserved.