971 results for Fast Computation Algorithm
Abstract:
The purpose of this thesis was to study the design of demand forecasting processes. A literature review in the field of forecasting was conducted, covering general forecasting process design, forecasting methods and techniques, the role of human judgment in forecasting, and forecasting performance measurement. The purpose of the literature review was to identify the important design choices that an organization aiming to design or re-design its demand forecasting process would have to make. In the empirical part of the study, these choices and the existing knowledge behind them were assessed in a case study in which a demand forecasting process was re-designed for a company in the fast-moving consumer goods business. The new target process is described, as well as the reasoning behind the design choices made during the re-design. As a result, the most important design choices are highlighted, along with their immediate effects on other processes directly tied to the demand forecasting process. Additionally, some new insights into the organizational aspects of demand forecasting processes are explored. The preliminary results indicate that in this case the new process did improve forecasting accuracy, although organizational issues related to the process proved more challenging than anticipated.
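The abstract does not name the accuracy metrics behind that result; as a minimal illustration of forecasting performance measurement, the sketch below computes two metrics commonly used for demand forecasts, MAPE and bias, on made-up data:

```python
# Hypothetical illustration of forecast accuracy measurement.  The thesis
# does not specify its metrics; MAPE and bias are typical choices.

def mape(actual, forecast):
    """Mean absolute percentage error (assumes no zero actuals)."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

def bias(actual, forecast):
    """Mean signed error; positive values indicate over-forecasting."""
    return sum(f - a for a, f in zip(actual, forecast)) / len(actual)

actual   = [120, 135, 110, 150]   # units sold per period (made-up data)
forecast = [130, 128, 115, 140]   # corresponding forecasts

print(f"MAPE: {mape(actual, forecast):.1f}%  bias: {bias(actual, forecast):+.1f}")
```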
Abstract:
The birth of Internet technologies and the development of fast fashion and multiple retailing channels have created a need for a new, more integrated way of doing retailing. Agility in fast fashion retailing can be seen as a significant way of responding to these changes and, furthermore, as a way of responding to consumers' changing demands. The purpose of this study was to explore the ways in which agile supply chains and integrated multichannel retailing influence international fast fashion retailing. A framework for agility in retail was developed based on available theoretical considerations in distribution and communication channels. Qualitative research methods and qualitative content analysis were used. Four expert interviews were carried out to gain new perspectives on the objectives. The rest of the data was collected from an industry-specific document, an expert video and two expert lectures. Following the data collection, the research material was analyzed with qualitative content analysis. The empirical findings on agility in retail were presented based on a coding frame. It was found that agility in retail has multiple parts, which overlap and affect one another. Furthermore, instead of viewing the agile supply chain and integrated multichannel retailing separately from each other, as is usual, it was found that they should be integrated, and the term "agility" was proposed to denote this combined approach. It was also found that the most common drivers and constraints of integrated multichannel retailing were new Internet technologies and customer demand. Brick-and-mortar stores, online stores, mobile devices and social media were found to be the most common retailing channels. Furthermore, in-store technology, the click-and-collect approach, NFC buying, RFID technology and 3D digital simulations of fabrics and patterns will enhance agility even more in the future. In addition, environmental issues, customer experiences and communication will be important factors. This study provides new practical insights for future retailing. Furthermore, it contributes to academic research by discussing the traditional approaches to agility in fast fashion retail and bringing in new insights.
Abstract:
Today, the finite nature of fossil fuel resources is clearly recognized. For this reason there is a strong focus throughout the world on shifting from a fossil-fuel-based energy system to a biofuel-based one. In this respect Finland, with its proven forestry capabilities, has great potential to accomplish this goal. One of the most efficient ways of utilizing wood biomass is regarded to be its use as a feedstock for the fast pyrolysis process. By means of this process, solid biomass is converted into a liquid fuel called bio-oil, which can be burnt at power plants, used for hydrogen generation through catalytic steam reforming, and used as a source of valuable chemical compounds. Different configurations of this process have found application in several pilot plants worldwide; however, the circulating fluidized bed configuration is regarded as the one with the highest potential for commercialization. In this Master's thesis, a feasibility study of a circulating fluidized bed fast pyrolysis process utilizing Scots pine logs as the raw material was conducted. The production capacity of the process is 100 000 tonnes/year of bio-oil. The feasibility study is divided into two phases: a process design phase and an economic feasibility analysis phase. The process design phase consists of mass and heat balance calculations, equipment sizing, estimation of pressure drops in the pipelines, and development of the plant layout. This phase resulted in the creation of process flow diagrams, an equipment list, and a Microsoft Excel spreadsheet that calculates the process mass and heat balances for a user-specified bio-oil production capacity. These documents are presented as appendices to this report. In the economic feasibility analysis phase, the investment and operating costs of the process were first calculated. These costs were then used to calculate the bio-oil price required to reach internal rates of return of 5%, 10%, 20%, 30%, 40% and 50%.
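A minimal sketch of that last pricing step, assuming a simplified cash-flow model (an upfront investment followed by constant annual cash flows over the plant life; all cost figures below are placeholders, not the thesis's data). Setting the NPV to zero at discount rate r gives the required price in closed form:

```python
# Sketch: the bio-oil price needed to reach a target IRR, under a simplified
# cash-flow model.  All cost figures are placeholders, not the thesis's data.
# Setting NPV = 0 at discount rate r gives
#   price = (OPEX + INVESTMENT / AF(r)) / CAPACITY,
# where AF(r) = sum_{t=1..T} (1+r)^-t is the annuity factor.

CAPACITY = 100_000       # t/year of bio-oil (from the abstract)
INVESTMENT = 60e6        # EUR upfront capital cost (assumed)
OPEX = 25e6              # EUR/year operating cost (assumed)
LIFETIME = 20            # years of operation (assumed)

def required_price(irr):
    """Bio-oil price at which the project NPV is zero at rate `irr`."""
    annuity = sum((1 + irr) ** -t for t in range(1, LIFETIME + 1))
    return (OPEX + INVESTMENT / annuity) / CAPACITY

for irr in (0.05, 0.10, 0.20, 0.30, 0.40, 0.50):
    print(f"IRR {irr:4.0%}: required bio-oil price {required_price(irr):7.2f} EUR/t")
```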
Abstract:
The determination of the intersection curve between Bézier surfaces may be seen as the composition of two separate problems: determining initial points and tracing the intersection curve from these points. A Bézier surface is represented by a parametric function (a polynomial in two variables) that maps points from the two-dimensional parametric space into three-dimensional space. In this article, an algorithm is proposed to determine the initial points of the intersection curve of Bézier surfaces, based on the solution of polynomial systems with the Projected Polyhedral Method, followed by a method for tracing the intersection curves (a Marching Method based on differential equations). In order to use the Projected Polyhedral Method, the equations of the system must be represented in the Bernstein basis; to this end, a robust and reliable algorithm is proposed to exactly transform a multivariate polynomial from the power basis to the Bernstein basis.
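In one variable, such a conversion rests on the identity t^i = Σ_{j≥i} [C(j,i)/C(n,i)] B_j^n(t). A minimal univariate sketch using rational arithmetic for exactness (the article's algorithm is multivariate; this illustrates the principle only, not the authors' routine):

```python
# Sketch: exact conversion of a univariate polynomial from the power basis
# to the degree-n Bernstein basis, using rational arithmetic.  From the
# identity t^i = sum_{j>=i} [C(j,i)/C(n,i)] B_j^n(t), the Bernstein
# coefficients are c_j = sum_{i<=j} [C(j,i)/C(n,i)] a_i.
from fractions import Fraction
from math import comb

def power_to_bernstein(a):
    """a[i] is the coefficient of t**i; returns Bernstein coefficients c[0..n]."""
    n = len(a) - 1
    return [sum(Fraction(comb(j, i), comb(n, i)) * a[i] for i in range(j + 1))
            for j in range(n + 1)]

# Example: p(t) = 1 - 2t + 3t^2 on [0, 1].
print(power_to_bernstein([Fraction(1), Fraction(-2), Fraction(3)]))
# -> [Fraction(1, 1), Fraction(0, 1), Fraction(2, 1)]
# Check: 1*(1-t)^2 + 0*2t(1-t) + 2*t^2 = 1 - 2t + 3t^2.
```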
Abstract:
In this paper we present an algorithm for the numerical simulation of cavitation in the hydrodynamic lubrication of journal bearings. Although this physical process is usually modelled as a free boundary problem, we adopt the equivalent variational inequality formulation. We propose a two-level iterative algorithm in which the outer iteration is associated with the penalty method, used to transform the variational inequality into a variational equation, and the inner iteration is associated with the conjugate gradient method, used to solve the linear system generated by applying the finite element method to the variational equation. The inner iteration was implemented using the element-by-element strategy, which is easily parallelized. We analyse the behaviour of two physical parameters and discuss some numerical results, as well as results related to the performance of a parallel implementation of the algorithm.
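A schematic of the two-level structure, with a small symmetric positive definite matrix standing in for the assembled finite element system (the penalty operator and all parameters are illustrative assumptions, not the paper's discretization of the lubrication problem):

```python
# Schematic two-level iteration: outer penalty loop enforcing x >= 0 (the
# non-cavitation constraint), inner conjugate gradient solve.  A and b are
# stand-ins for the FEM system, not the paper's discretization.
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=1000):
    """Plain CG for a symmetric positive definite system A x = b."""
    x = x0.copy()
    r = b - A @ x
    d = r.copy()
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        Ad = A @ d
        alpha = (r @ r) / (d @ Ad)
        x = x + alpha * d
        r_new = r - alpha * Ad
        d = r_new + ((r_new @ r_new) / (r @ r)) * d
        r = r_new
    return x

def penalty_solve(A, b, eps=1e-6, outer_iters=30):
    """Outer penalty loop: nodes where the iterate is negative receive a
    large diagonal penalty 1/eps, driving them towards zero, so the
    variational inequality is approximated by a sequence of equations."""
    x = np.zeros_like(b)
    for _ in range(outer_iters):
        active = x < 0                       # currently "cavitated" nodes
        A_pen = A + np.diag(active / eps)    # add 1/eps on active diagonal
        x = conjugate_gradient(A_pen, b, x)
    return x

# Toy SPD (1-D Laplacian-like) system; some unconstrained values go negative.
n = 8
A = (np.diag(np.full(n, 2.0)) + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))
b = np.linspace(-1.0, 1.0, n)
print(penalty_solve(A, b).round(4))
```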
Abstract:
This work presents recent results concerning a design methodology used to estimate the positioning deviation of a gantry (Cartesian) manipulator, related mainly to the structural elastic deformation of its components under operational conditions. The case-study manipulator has basic dimensions of 1.53 m × 0.97 m × 1.38 m, and the dimensions used to calculate the effective workspace covered by end-effector path displacement are 1 m × 0.5 m × 0.5 m. The manipulator is composed of four basic modules, defined as module X, module Y, module Z and the terminal arm, to which the end-effector is connected. Each module's controlled axis performs a linear-parabolic positioning movement; the path-planning algorithm takes the maximum velocity and the total distance as input parameters for a given task, with equal acceleration and deceleration times. The Denavit-Hartenberg parameterization method is used in the manipulator kinematics model. The gantry manipulator can be modeled as four rigid bodies with three translational degrees of freedom, connected as an open kinematic chain. Dynamic analyses were performed considering inertial parameters such as the mass, inertia and center-of-gravity position of each module. These parameters are essential for correct dynamic modeling of the manipulator, owing to the multiple possibilities of motion and the manipulation of objects with different masses. The dynamic analysis consists of mathematical modeling of the static and dynamic interactions among the modules. The structural deformations are computed using the finite element method (FEM).
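A sketch of a generic linear-parabolic (trapezoidal velocity) profile driven by the same inputs the planner uses, maximum velocity and total distance, with equal acceleration and deceleration times (the actual controller implementation is not published here):

```python
# Generic linear-parabolic positioning profile: parabolic position blends
# during acceleration/deceleration and a linear segment at cruise velocity.
# Inputs match the planner described in the abstract; this is an
# illustration, not the manipulator's actual planner.

def trapezoidal_position(t, distance, v_max, t_acc):
    """Position at time t: accelerate linearly for t_acc, cruise at v_max,
    decelerate for t_acc.  Requires distance >= v_max * t_acc."""
    t_cruise = distance / v_max - t_acc   # total dist = v_max*(t_acc+t_cruise)
    a = v_max / t_acc
    if t < t_acc:                          # parabolic blend (accelerating)
        return 0.5 * a * t * t
    if t < t_acc + t_cruise:               # linear segment (cruising)
        return 0.5 * v_max * t_acc + v_max * (t - t_acc)
    t_dec = t - t_acc - t_cruise           # parabolic blend (decelerating)
    return distance - 0.5 * a * max(t_acc - t_dec, 0.0) ** 2

# Example: 0.5 m stroke at 0.2 m/s with 0.25 s blends (total time 2.75 s).
for t in (0.0, 0.25, 1.5, 2.75):
    print(t, round(trapezoidal_position(t, 0.5, 0.2, 0.25), 4))
```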
Abstract:
Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm that determines this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line-sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
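A one-dimensional sketch of the incremental idea: while sweeping the height field, maintain the upper convex hull of the points visited so far; after restoring convexity at each new receiver, the hull's top vertex is the occluder with the maximal horizon angle, so each receiver costs amortized O(1) instead of an O(N) backward scan. This illustrates the principle only, not the dissertation's exact algorithm:

```python
# 1-D illustration of an incremental horizon sweep.  The maximal-angle
# occluder to the left of each receiver always lies on the upper convex
# hull of the previously visited points, and restoring hull convexity at
# each step exposes it as the stack top (amortized O(1) per receiver).
from math import atan2

def horizon_angles(heights, spacing=1.0):
    hull = []                       # stack of (x, h) on the upper convex hull
    angles = []
    for i, h in enumerate(heights):
        x = i * spacing
        # Pop hull vertices the new point can "see over" (non-convex turns).
        while len(hull) >= 2:
            (x1, h1), (x2, h2) = hull[-2], hull[-1]
            if (x2 - x1) * (h - h2) - (h2 - h1) * (x - x2) <= 0:
                break               # convex again; top is the tangent point
            hull.pop()
        if hull:
            ox, oh = hull[-1]       # max-angle occluder to the left
            angles.append(atan2(oh - h, x - ox))
        else:
            angles.append(None)     # leftmost point: nothing occludes it
        hull.append((x, h))
    return angles

print(horizon_angles([0.0, 2.0, 1.0, 0.0, 3.0]))
```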
Abstract:
Today's electrical machine technology allows increasing wind turbine output power by an order of magnitude over the technology that existed only ten years ago. However, it is sometimes argued that high-power direct-drive wind turbine generators will prove to be of limited practical importance because of their relatively large size and weight. The limited space for the generator in a wind turbine, together with the growing use of wind energy, poses a challenge for design engineers trying to increase torque without making the generator larger. When it comes to high torque density, the limiting factor in every electrical machine is heat: if the electrical machine parts exceed their maximum allowable continuous operating temperature, even for a short time, they can suffer permanent damage. Therefore, highly efficient thermal design or cooling methods are needed. One promising solution for enhancing the heat transfer performance of high-power, low-speed electrical machines is direct cooling of the windings. This doctoral dissertation proposes a rotor-surface-magnet synchronous generator with a fractional-slot non-overlapping stator winding made of hollow conductors, through which liquid coolant can be passed directly while current is applied, in order to increase the convective heat transfer capabilities and reduce the generator mass. The dissertation focuses on the electromagnetic design of a liquid-cooled direct-drive permanent-magnet synchronous generator (LC DD-PMSG) for a direct-drive wind turbine application. The analytical calculation of the magnetic field distribution is carried out with the aim of fast and accurate prediction of the main dimensions of the machine, especially the thickness of the permanent magnets, as well as the generator's electromagnetic parameters and design optimization. The focus is on a generator design with a fractional-slot non-overlapping winding placed in open stator slots; this is an a priori selection to guarantee easy manufacturing of the liquid-cooled winding. A thermal analysis of the LC DD-PMSG based on a lumped-parameter thermal model is carried out to evaluate the generator's thermal performance. The thermal model was adapted to take into account the uneven copper loss distribution resulting from the skin effect, as well as the effect of temperature on the copper winding resistance and on the thermophysical properties of the coolant. The developed lumped-parameter thermal model and the analytical calculation of the magnetic field distribution can both be integrated with the presented algorithm to optimize an LC DD-PMSG design. Based on an instrumented small prototype with liquid-cooled tooth-coils, the following targets were achieved: experimental determination of the performance of the direct liquid cooling of the stator winding and validation of the temperatures predicted by the analytical thermal model; proof of the feasibility of manufacturing the liquid-cooled tooth-coil winding; and demonstration of the objectives of the project to potential customers.
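The generic form of such a lumped-parameter model is a thermal conductance network solved for the node temperatures. The sketch below uses a toy three-node network with placeholder values; the dissertation's actual network topology and parameters are not reproduced:

```python
# Generic steady-state lumped-parameter thermal network: nodes connected by
# thermal conductances G (W/K), losses q (W) injected at the nodes, and one
# node held at the coolant temperature.  Topology and all values below are
# placeholders, not the dissertation's model.
import numpy as np

# Nodes: 0 = winding copper, 1 = stator tooth, 2 = coolant (fixed).
G_cu_tooth   = 5.0     # W/K, conduction winding -> tooth (assumed)
G_cu_coolant = 20.0    # W/K, convection winding -> coolant channel (assumed)
G_tooth_cool = 10.0    # W/K, tooth -> coolant (assumed)
q = np.array([800.0, 150.0])   # W, copper and iron losses (assumed)
T_coolant = 40.0               # deg C

# Conductance matrix of the two free nodes, coolant eliminated to the RHS:
# G * T = q + (conductance-to-coolant) * T_coolant.
G = np.array([[G_cu_tooth + G_cu_coolant, -G_cu_tooth],
              [-G_cu_tooth,               G_cu_tooth + G_tooth_cool]])
rhs = q + np.array([G_cu_coolant, G_tooth_cool]) * T_coolant
T = np.linalg.solve(G, rhs)
print(f"winding {T[0]:.1f} degC, tooth {T[1]:.1f} degC")
```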
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
The purpose of this study is to examine whether Corporate Social Responsibility (CSR) announcements by the three biggest American fast food companies (McDonald's, YUM! Brands and Wendy's) have any effect on their stock returns as well as on the returns of the industry index (Dow Jones Restaurants and Bars). The period under consideration starts on 1 May 2001 and ends on 17 October 2013. The stock market reaction is tested with an event study utilizing the CAPM. The research employs the daily stock returns of the companies, the index and the benchmarks (NASDAQ and NYSE). The test of the combined announcements did not reveal any significant effect on the index or on McDonald's; however, the stock returns of Wendy's and YUM! Brands reacted negatively. Moreover, the company-level analyses showed that McDonald's stock returns respond positively to the company's own CSR releases, YUM! Brands reacts negatively, and Wendy's shows no reaction. It was also found that the competitors of the announcing company tend to react negatively to all the events. Furthermore, dividing the events into sustainability categories revealed a statistically significant negative reaction from the index, McDonald's and YUM! Brands towards social announcements, while only the index was positively affected by the economic and environmental CSR news releases.
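A minimal sketch of the event-study machinery described above: estimate market-model (CAPM-style) parameters over an estimation window, then compute abnormal returns over the event window. The return series, window lengths and random seed are placeholders, not the study's data:

```python
# Minimal CAPM-style event study sketch: fit alpha/beta on an estimation
# window, then abnormal returns AR_t = R_t - (alpha + beta * R_m,t) on the
# event window.  Synthetic placeholder data, not the study's returns.
import numpy as np

def abnormal_returns(stock, market, est_len):
    """stock, market: aligned daily return series; the first est_len days
    form the estimation window, the remainder the event window."""
    beta, alpha = np.polyfit(market[:est_len], stock[:est_len], 1)
    expected = alpha + beta * market[est_len:]
    return stock[est_len:] - expected

rng = np.random.default_rng(0)
market = rng.normal(0.0003, 0.01, 130)                  # 130 trading days
stock = 0.0001 + 1.2 * market + rng.normal(0, 0.008, 130)
ar = abnormal_returns(stock, market, est_len=120)       # 10-day event window
print("CAR over event window:", ar.sum())
```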
Abstract:
This thesis presents a framework for the segmentation of clustered, overlapping convex objects. The proposed approach is based on a three-step framework addressing the tasks of seed point extraction, contour evidence extraction, and contour estimation. The state-of-the-art techniques for each step were studied and evaluated using synthetic and real microscopic image data. Based on the evaluation results, a method combining the best performer in each step was presented. In the proposed method, the Fast Radial Symmetry transform, an edge-to-marker association algorithm, and ellipse fitting are employed for seed point extraction, contour evidence extraction and contour estimation, respectively. Using synthetic and real image data, the proposed method was evaluated and compared with two competing methods; the results showed a promising improvement over the competing methods, with high segmentation and size distribution estimation accuracy.
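Of the three steps, contour estimation is the most self-contained; the sketch below fits an ellipse to synthetic contour evidence with OpenCV's least-squares ellipse fit (the Fast Radial Symmetry and edge-to-marker association steps are not reproduced, and the arc data is made up):

```python
# Sketch of the contour estimation step only: fit an ellipse to contour
# evidence (here a synthetic partial arc, e.g. the visible boundary of one
# object in an overlapping cluster) using OpenCV's ellipse fit.
import cv2
import numpy as np

theta = np.linspace(0.2, 4.0, 60)                  # partial arc, not a full ellipse
pts = np.stack([100 + 40 * np.cos(theta),
                100 + 25 * np.sin(theta)], axis=1).astype(np.float32)

(cx, cy), (d1, d2), angle = cv2.fitEllipse(pts)    # needs >= 5 points
print(f"center ({cx:.1f}, {cy:.1f}), axis lengths ({d1:.1f}, {d2:.1f}), "
      f"rotation {angle:.1f} deg")
```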
Abstract:
It has been shown for several DNA probes that the recently introduced Fast-FISH (fluorescence in situ hybridization) technique is well suited for quantitative microscopy. For highly repetitive DNA probes, the hybridization (renaturation) time and the number of subsequent washing steps were reduced considerably by omitting denaturing chemical agents (e.g., formamide). An appropriate hybridization temperature and time allow a clear discrimination between major and minor binding sites by quantitative fluorescence microscopy. The well-defined physical conditions for hybridization permit automation of the procedure, e.g., by a programmable thermal cycler. Here, we present optimized conditions for a commercially available X-specific α-satellite probe. Highly fluorescent major binding sites were obtained for a 74 °C hybridization temperature and a 60-min hybridization time. They were clearly discriminated from some low-fluorescence minor binding sites on metaphase chromosomes as well as in interphase cell nuclei. On average, a total of 3.43 ± 1.59 binding sites were measured in metaphase spreads, and 2.69 ± 1.00 in interphase nuclei. Microwave activation for denaturation and hybridization was tested to accelerate the procedure. The slides with the target material and the hybridization buffer were placed in a standard microwave oven. After denaturation for 20 s at 900 W, hybridization was performed for 4 min at 90 W. The suitability of a microwave oven for Fast-FISH was confirmed by application to a chromosome 1-specific α-satellite probe. In this case, denaturation was performed at 630 W for 60 s and hybridization at 90 W for 5 min. In all cases, the results were analyzed quantitatively and compared to the results obtained by Fast-FISH. The major binding sites were clearly discriminated by their brightness.
Abstract:
To study the effect of halothane as a cardioplegic agent, ten Wistar rats were anesthetized by ether inhalation and their hearts were perfused in a Langendorff system with Krebs-Henseleit solution (36 °C; 90 cmH2O pressure). After a 15-min stabilization period, the control values for heart rate, force (T), dT/dt and coronary flow were recorded, and a halothane-enriched solution (same temperature and pressure) was perfused until cardiac arrest was obtained. The same Krebs-Henseleit solution was then reperfused and the parameters studied were recorded after 1, 3, 5, 10, 20 and 30 min. Cardiac arrest occurred in all hearts during the first two minutes of perfusion with the halothane-bubbled solution. One minute after reperfusion without halothane, the following values, expressed in terms of the controls, were obtained: 90.5% of control heart rate (266.9 ± 43.4 to 231.5 ± 71.0 bpm), 20.2% of the force (1.83 ± 0.28 to 0.37 ± 0.25 g), 19.8% of dT/dt (46.0 ± 7.0 to 9.3 ± 6.0 g/s) and 90.8% of coronary flow (9.9 ± 1.5 to 9.4 ± 1.5 ml/min). After 3 min of perfusion they changed to 99.0% heart rate (261.0 ± 48.2), 98.9% force (1.81 ± 0.33), 98.6% dT/dt (45.0 ± 8.2) and 94.8% coronary flow (9.3 ± 1.4). At 5 min, 100.8% (267.0 ± 40.6) heart rate, 105.0% (1.92 ± 0.29) force and 104.4% (48.2 ± 7.2) dT/dt were recorded and maintained without significant differences (P>0.01) until the end of the experiment. These data demonstrate that volatile cardioplegia with halothane is an effective technique for fast induction of, and prompt recovery from, normothermic cardiac arrest in the rat heart.