Abstract:
The purpose of this dissertation is to increase the understanding and knowledge of field sales management control systems (i.e. sales managers' monitoring, directing, evaluating and rewarding activities) and their potential consequences on salespeople. This topic is important because past research has indicated that the choice of control system type can, on the one hand, have desirable consequences, such as high levels of motivation and performance, and, on the other hand, lead to harmful unintended consequences, such as opportunistic or unethical behaviors. Despite the fact that marketing and sales management control systems have been under rigorous research for over two decades, the field is still at a very early stage of development, and several inconsistencies can be found in the research results. This dissertation argues that these inconsistencies mainly derive from misspecification of the level of analysis in past research. The different levels of analysis (i.e. the strategic, tactical, and operational levels) involve very different decision-making situations regarding the control and motivation of the sales force, which should be taken into consideration when conceptualizing control. Moreover, the study of the salesperson consequences of a field sales management control system is actually a cross-level phenomenon, which means that at least two levels of analysis are simultaneously involved. The results of this dissertation confirm the need to re-conceptualize the field sales management control system concept. They provide empirical evidence for the assertion that control should be conceptualized in more detail at the tactical/operational level of analysis than at the strategic level of analysis. Moreover, the results show that some controls are communicated to field salespeople more efficiently than others. 
It is proposed that this difference is due to the different purposes of control: some controls are designed to influence salespersons' behavior (they aim at motivating), whereas some controls are designed to aid decision-making (they aim at providing information). According to the empirical results of this dissertation, both types of controls have an impact on the sales force, but this impact is not as strong as expected. The results obtained in this dissertation shed some light on the nature of field sales management control systems and their consequences on salespeople.
Abstract:
Productivity and profitability are important concepts and measures describing the performance and success of a firm. We know that an increase in productivity decreases the costs per unit produced and leads to better profitability. This common knowledge is not, however, enough in the modern business environment. Productivity improvement is one means among others for increasing the profitability of operations. There are many means to increase productivity. The use of these means presupposes operative decisions, and these decisions presuppose information about the effects of these means. Productivity improvement actions are in general taken at floor level, with machines, cells, activities and human beings. Profitability is most meaningful at the level of the whole firm. It has been very difficult, or even impossible, to analyze closely enough the economic aspects of changes at floor level with traditional costing systems. New ideas in accounting have only recently brought in elements which make it possible to consider these phenomena where they actually happen. The aim of this study is to support the selection of objects for productivity improvement, and to develop a method to analyze the effects of a productivity change in an activity on the profitability of a firm. A framework for systemizing the economic management of productivity improvement is developed in this study. This framework is a systematic two-stage way to analyze the effects of productivity improvement actions in an activity on the profitability of a firm. At the first stage of the framework, a simple selection method based on the worth, possibility and necessity of the improvement actions in each activity is presented. This method is called Urgency Analysis. In the second stage it is analyzed how much a certain change of productivity in an activity affects the profitability of a firm. 
A theoretical calculation model with which it is possible to analyze the effects of a productivity improvement in monetary values is presented. On the basis of this theoretical model, a tool is made for the analysis at the firm level. The usefulness of this framework was empirically tested with data from a profit center of one medium-sized Finnish firm operating in the metal industry. It is shown that the framework provides valuable information about the economic effects of productivity improvement to support management in decision making.
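The second-stage idea, translating an activity-level productivity change into a firm-level profit effect, can be sketched in code. This is only an illustrative simplification under the assumption that a productivity gain delivers the same output at proportionally lower activity cost; the activity names and figures are invented and this is not the thesis's actual calculation model.

```python
# Illustrative sketch: propagate a productivity gain in one activity to
# firm-level profit. All activity names and figures are hypothetical.

def profit_effect(activities, revenue, activity, productivity_gain):
    """Return firm profit before and after a productivity gain
    (a fraction, e.g. 0.10 = 10 %) in one activity.

    `activities` maps activity name -> annual cost; the gain is modelled
    as the same output produced at proportionally lower cost.
    """
    base_profit = revenue - sum(activities.values())
    new_costs = dict(activities)
    new_costs[activity] = activities[activity] / (1 + productivity_gain)
    return base_profit, revenue - sum(new_costs.values())

before, after = profit_effect(
    {"machining": 400_000, "assembly": 250_000, "logistics": 150_000},
    revenue=1_000_000, activity="machining", productivity_gain=0.10,
)
# A 10 % productivity gain in machining lifts annual profit by the
# cost saved in that single activity.
```

In this toy case the firm-level effect is simply the cost reduction in the improved activity; a fuller model would also account for changed volumes, capacity and allocation effects.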
Abstract:
This thesis studies gray-level distance transforms, particularly the Distance Transform on Curved Space (DTOCS). The transform is produced by calculating distances on a gray-level surface. The DTOCS is improved by defining more accurate local distances, and developing a faster transformation algorithm. The Optimal DTOCS enhances the locally Euclidean Weighted DTOCS (WDTOCS) with local distance coefficients, which minimize the maximum error from the Euclidean distance in the image plane, and produce more accurate global distance values. Convergence properties of the traditional mask operation, or sequential local transformation, and the ordered propagation approach are analyzed, and compared to the new efficient priority pixel queue algorithm. The Route DTOCS algorithm developed in this work can be used to find and visualize shortest routes between two points, or two point sets, along a varying height surface. In a digital image, there can be several paths sharing the same minimal length, and the Route DTOCS visualizes them all. A single optimal path can be extracted from the route set using a simple backtracking algorithm. A new extension of the priority pixel queue algorithm produces the nearest neighbor transform, or Voronoi or Dirichlet tessellation, simultaneously with the distance map. The transformation divides the image into regions so that each pixel belongs to the region surrounding the reference point, which is nearest according to the distance definition used. Applications and application ideas for the DTOCS and its extensions are presented, including obstacle avoidance, image compression and surface roughness evaluation.
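The core idea of a distance calculated on a gray-level surface can be sketched with a minimal two-pass chamfer-style iteration, where a step between 8-neighbours costs 1 plus the absolute gray-level difference. This is an illustrative simplification of the DTOCS idea, not the thesis's exact algorithm or local distance coefficients.

```python
# Minimal sketch of a DTOCS-style gray-level distance transform:
# the local step cost between 8-neighbours is 1 + |gray difference|,
# so distances follow the gray-level "surface". Illustrative only.

def dtocs(gray, seeds, max_iter=10):
    """gray: 2-D list of gray values; seeds: set of (row, col) start pixels."""
    INF = float("inf")
    h, w = len(gray), len(gray[0])
    d = [[0 if (r, c) in seeds else INF for c in range(w)] for r in range(h)]
    fwd = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]   # forward-pass neighbours
    bwd = [(1, 1), (1, 0), (1, -1), (0, 1)]       # backward-pass neighbours
    for _ in range(max_iter):                     # iterate until stable
        changed = False
        for offsets in (fwd, bwd):
            rows = range(h) if offsets is fwd else range(h - 1, -1, -1)
            cols = range(w) if offsets is fwd else range(w - 1, -1, -1)
            for r in rows:
                for c in cols:
                    for dr, dc in offsets:
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < h and 0 <= cc < w:
                            cand = d[rr][cc] + 1 + abs(gray[r][c] - gray[rr][cc])
                            if cand < d[r][c]:
                                d[r][c] = cand
                                changed = True
        if not changed:
            break
    return d
```

On a flat image this reduces to the plain chessboard distance, while gray-level variation makes paths over "hills" longer, which is exactly the behaviour the DTOCS exploits.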
Abstract:
This thesis gives an overview of the use of level set methods in the field of image science. The similar fast marching method is discussed for comparison, and the narrow band and particle level set methods are also introduced. The level set method is a numerical scheme for representing, deforming and recovering structures in arbitrary dimensions. It approximates and tracks moving interfaces, dynamic curves and surfaces. The level set method does not define how and why a boundary is advancing the way it is, but simply represents and tracks the boundary. The principal idea of the level set method is to represent an N-dimensional boundary in N+1 dimensions. This gives the generality to represent even complex boundaries. Level set methods can be powerful tools for representing dynamic boundaries, but they can require a lot of computing power. Especially the basic level set method has a considerable computational burden. This burden can be alleviated with more sophisticated versions of the level set algorithm, like the narrow band level set method, or with a programmable hardware implementation. A parallel approach can also be used in suitable applications. It is concluded that these methods can be used in quite a broad range of image applications, like computer vision and graphics, scientific visualization, and also to solve problems in computational physics. Level set methods and methods derived from and inspired by them will be in the front line of image processing also in the future.
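The principal idea, a curve represented implicitly as the zero level set of a function one dimension higher, can be shown with a toy evolution: a circle is the zero set of a signed distance function phi, and motion with normal speed F is the update phi_t = -F |grad phi|. This is a deliberately naive explicit scheme for illustration; practical implementations use upwind differencing, reinitialization and narrow bands.

```python
# Toy level set evolution: a circle is the zero level set of phi on a
# grid, and moving it outward with speed F means phi_t = -F * |grad phi|.
# Naive explicit scheme, purely illustrative.

import math

N, r, F, dt, steps = 64, 10.0, 1.0, 0.5, 10
c = N / 2
# Signed distance to a circle of radius r: negative inside, positive outside.
phi = [[math.hypot(x - c, y - c) - r for x in range(N)] for y in range(N)]

for _ in range(steps):
    new = [row[:] for row in phi]
    for y in range(1, N - 1):
        for x in range(1, N - 1):
            gx = (phi[y][x + 1] - phi[y][x - 1]) / 2.0   # central differences
            gy = (phi[y + 1][x] - phi[y - 1][x]) / 2.0
            new[y][x] = phi[y][x] - dt * F * math.hypot(gx, gy)
    phi = new

# With F > 0 the zero level set moves outward by about F*dt per step,
# so the interior region {phi < 0} grows from radius ~10 to ~15.
inside = sum(v < 0 for row in phi for v in row)
```

Note that the curve itself is never tracked explicitly; topology changes (merging, splitting) would be handled automatically by the same update, which is the main appeal of the representation.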
Abstract:
The purpose of the study was to determine whether the consignment stock arrangements agreed with customers are justified from a business perspective or whether they are mainly just customer service. The objectives were to examine the importance of logistics services, to explore different aspects of offering customer-specific warehouses and warehousing services, to find possible substitute or equivalent services, and to present a method for prioritizing and managing the service level of the supply chain. The commercial data on products and customers needed for the work was obtained from the company's information system. Other background information was collected from the company's operating instructions, through discussions and interviews with personnel, and by observing the company's operations while working. The study is divided into a literature part and a case part. The case results show that the current consignment stock customers mostly merit the value-adding warehousing service. The experience and knowledge gained from the study will help in making further assessments, also concerning the company's other production plants. The overall benefits of the work are still difficult to assess at this stage.
Abstract:
The aim of this thesis is to examine operative purchasing, which comprises determining a well-timed ordering rhythm and balanced order quantities, and adapting the incoming flow of goods to sales or consumption as a whole. The study also examines the role of inventories and the key indicators of efficient inventory control from the purchasing point of view. The study includes a description of the pharmaceutical supply chain, because it differs significantly from other industries. This thesis is a synthesizing literature study. The empirical part first analyzes the case company's inventory control from the purchasing point of view, based on ABC analysis. Purchasing is then analyzed in more detail for selected suppliers. Finally, an optimal ordering rhythm, order quantities and a purchasing budget are drawn up for the products of one supplier. ABC analysis can be used as a tool for determining how the material flows of different products should be controlled from the purchasing point of view. The analysis is based on concentrating resources where the return is greatest.
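The ABC analysis mentioned above can be sketched in a few lines: items are ranked by annual consumption value and split into classes by cumulative share of total value. The 80 % / 95 % cut-offs used here are a common convention, not necessarily the ones used in the thesis, and the product names and figures are invented.

```python
# Minimal sketch of ABC analysis: rank items by annual consumption value
# and assign classes by cumulative share (80 % -> A, next 15 % -> B,
# rest -> C). Cut-offs are adjustable; data here is invented.

def abc_classify(annual_value, a_cut=0.80, b_cut=0.95):
    total = sum(annual_value.values())
    ranked = sorted(annual_value, key=annual_value.get, reverse=True)
    classes, cum = {}, 0.0
    for item in ranked:
        cum += annual_value[item] / total
        classes[item] = "A" if cum <= a_cut else ("B" if cum <= b_cut else "C")
    return classes

classes = abc_classify({"P1": 50_000, "P2": 30_000, "P3": 12_000,
                        "P4": 5_000, "P5": 3_000})
```

Class A items (here P1 and P2) would then get the tightest purchasing control and most frequent review, while C items can be managed with simpler rules and larger order intervals.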
Abstract:
As the development of integrated circuit technology continues to follow Moore's law, the complexity of circuits increases exponentially. Traditional hardware description languages such as VHDL and Verilog are no longer powerful enough to cope with this level of complexity and do not provide facilities for hardware/software codesign. Languages such as SystemC are intended to solve these problems by combining the expressive power of high-level programming languages with the hardware-oriented facilities of hardware description languages. To fully replace older languages in the design flow of digital systems, SystemC should also be synthesizable. The devices required by modern high-speed networks often share the same tight constraints as embedded systems, e.g. on size, power consumption and price, but also have very demanding real-time and quality-of-service requirements that are difficult to satisfy with general-purpose processors. The dedicated hardware blocks of an application-specific instruction set processor are one way to combine fast processing speed, energy efficiency, flexibility and relatively low time-to-market. Common features can be identified in the network processing domain, making it possible to develop specialized but configurable processor architectures. One such architecture is TACO, which is based on the transport triggered architecture. The architecture offers a high degree of parallelism and modularity and greatly simplified instruction decoding. For this M.Sc. (Tech.) thesis, a simulation environment for the TACO architecture was developed with SystemC 2.2, using an old version written with SystemC 1.0 as a starting point. The environment enables rapid design space exploration by providing facilities for hardware/software codesign and simulation, and an extendable library of automatically configured reusable hardware blocks. 
Other topics that are covered are the differences between SystemC 1.0 and 2.2 from the viewpoint of hardware modeling, and compilation of a SystemC model into synthesizable VHDL with Celoxica Agility SystemC Compiler. A simulation model for a processor for TCP/IP packet validation was designed and tested as a test case for the environment.
Abstract:
Adolescence is an important time for acquiring a high peak bone mass. Physical activity is known to be beneficial to bone development. The effect of estrogen-progestin contraceptives (EPC) is still controversial. Altogether 142 adolescent women (52 gymnasts, 46 runners, and 42 controls) participated in this study, which is based on two 7-year (n = 142), one 6-year (n = 140) and one 4-year (n = 122) follow-ups. Information on physical activity, menstrual history, sexual maturation, nutrition, living habits and health status was obtained through questionnaires and interviews. The bone mineral density (BMD) and content (BMC) of the lumbar spine (LS) and femoral neck (FN) were measured by dual-energy X-ray absorptiometry. Calcaneal sonographic measurements were also made. The physical activity of the athletes participating in this study decreased after the 3-year follow-up. High-impact exercise was beneficial to bones. LS and FN BMC was higher in gymnasts than in controls during the follow-up. Reduction in physical activity had negative effects on bone mass. LS and FN BMC increased less in the group that had reduced their physical activity by more than 50%, compared with those continuing at the previous level (1.69 g, p=0.021; 0.14 g, p=0.015, respectively). The amount of physical activity was the only significant parameter accounting for the calcaneal sonography measurements at the 6-year follow-up (11.3%), and a reduced activity level was associated with lower sonographic values. Long-term low-dose EPC use seemed to prevent normal bone mass acquisition. There was a significant trend towards a smaller increase in LS and FN BMC among long-term EPC users. In conclusion, this study confirms that high-impact exercise is beneficial to bones and that the benefits are partly maintained, at least for 4 years, even after a clear reduction in training level. Continued exercise is needed to retain all acquired benefits. 
The bone mass gained and maintained can possibly be maximized in adolescence by implementing high-impact exercise for youngsters. The peak bone mass of the young women participating in the study may be reached before the age of 20. Use of low-dose EPCs seems to suppress normal bone mass acquisition.
Abstract:
This thesis deals with distance transforms, which are a fundamental issue in image processing and computer vision. In this thesis, two new distance transforms for gray-level images are presented. As a new application for distance transforms, they are applied to gray-level image compression. The new distance transforms are both extensions of the well-known distance transform algorithm developed by Rosenfeld, Pfaltz and Lay. With some modification, their algorithm, which calculates a distance transform on binary images with a chosen kernel, has been made to calculate a chessboard-like distance transform with integer numbers (DTOCS) and a real-value distance transform (EDTOCS) on gray-level images. Both distance transforms, the DTOCS and EDTOCS, require only two passes over the gray-level image and are extremely simple to implement. Only two image buffers are needed: the original gray-level image and the binary image which defines the region(s) of calculation. No other image buffers are needed even if more than one iteration round is performed. For large neighborhoods and complicated images the two-pass distance algorithm has to be applied to the image more than once, typically 3–10 times. Different types of kernels can be adopted. It is important to notice that no other existing transform calculates the same kind of distance map as the DTOCS. All other gray-weighted distance function algorithms, GRAYMAT etc., find the minimum path joining two points by the smallest sum of gray levels, or weight the distance values directly by the gray levels in some manner. The DTOCS does not weight them that way. The DTOCS gives a weighted version of the chessboard distance map. The weights are not constant, but gray value differences of the original image. The difference between the DTOCS map and other distance transforms for gray-level images is shown. The difference between the DTOCS and EDTOCS is that the EDTOCS calculates these gray-level differences in a different way. 
It propagates local Euclidean distances inside a kernel. Analytical derivations of some results concerning the DTOCS and the EDTOCS are presented. Commonly, distance transforms are used for feature extraction in pattern recognition and learning. Their use in image compression is very rare. This thesis introduces a new application area for distance transforms. Three new image compression algorithms based on the DTOCS and one based on the EDTOCS are presented. Control points, i.e. points that are considered fundamental for the reconstruction of the image, are selected from the gray-level image using the DTOCS and the EDTOCS. The first group of methods selects the maxima of the distance image as new control points, and the second group of methods compares the DTOCS distance to the binary image chessboard distance. The effect of applying threshold masks of different sizes along the threshold boundaries is studied. The time complexity of the compression algorithms is analyzed both analytically and experimentally. It is shown that the time complexity of the algorithms is independent of the number of control points, i.e. the compression ratio. A new morphological image decompression scheme, the 8 kernels' method, is also presented. Several decompressed images are presented. The best results are obtained using the Delaunay triangulation. The obtained image quality equals that of the DCT images with a 4 x 4
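The first group's selection step, picking local maxima of a distance image as control points, can be sketched directly. The toy distance map below is invented for illustration, not actual DTOCS output, and the 8-neighbour maximum test is a simplification of the thesis's selection methods.

```python
# Sketch of control-point selection: keep pixels that are local maxima
# of a distance image (>= all 8-neighbours). Toy data, illustrative only.

def local_maxima(dist):
    """Return (row, col) pixels that are >= all of their 8-neighbours."""
    h, w = len(dist), len(dist[0])
    points = []
    for r in range(h):
        for c in range(w):
            neigh = [dist[rr][cc]
                     for rr in range(max(0, r - 1), min(h, r + 2))
                     for cc in range(max(0, c - 1), min(w, c + 2))
                     if (rr, cc) != (r, c)]
            if all(dist[r][c] >= v for v in neigh):
                points.append((r, c))
    return points

toy = [[0, 1, 0],
       [1, 3, 1],
       [0, 1, 0]]
# Only the central peak qualifies as a control point in this toy map.
```

A compressor would store only these points (with their gray values) and reconstruct the rest of the image by interpolation, e.g. with the Delaunay triangulation mentioned above.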
Abstract:
The article describes some concrete problems that were encountered when writing a two-level model of Mari morphology. Mari is an agglutinative Finno-Ugric language spoken in Russia by about 600 000 people. The work was begun in the 1980s on the basis of K. Koskenniemi’s Two-Level Morphology (1983), but in the latest stage R. Beesley’s and L. Karttunen’s Finite State Morphology (2003) was used. Many of the problems described in the article concern the inexplicitness of the rules in Mari grammars and the lack of information about the exact distribution of some suffixes, e.g. enclitics. The Mari grammars usually give complete paradigms for a few unproblematic verb stems, whereas the difficult or unclear forms of certain verbs are only superficially discussed. Another example of phenomena that are poorly described in grammars is the way suffixes with an initial sibilant combine with stems ending in a sibilant. The help of informants and searches in electronic corpora were used to overcome such difficulties in the development of the two-level model of Mari. The variation of the order of plural markers, case suffixes and possessive suffixes is a typical feature of Mari. The morphotactic rules constructed for Mari declensional forms tend to be recursive, and their productivity must be limited by some technical device, such as filters. In the present model, certain plural markers were treated like nouns. The positional and functional versatility of the possessive suffixes can be regarded as the most challenging phenomenon in attempts to formalize Mari morphology. Cyrillic orthography, which was used in the model, also caused problems. For instance, a Cyrillic letter may represent a sequence of two sounds, the first being part of the word stem while the other belongs to a suffix. In some cases, letters for voiced consonants are also generalized to represent voiceless consonants. 
Such orthographical conventions distance a morphological model based on orthography from the actual (morpho)phonological processes in the language.
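The two-level idea discussed above, mapping a lexical form with morpheme boundaries to a surface form via context-dependent rules, can be illustrated with a deliberately tiny toy. The stems, suffix and assimilation rule below are invented for illustration and are not actual Mari data; a real two-level model compiles such rules into finite-state transducers rather than string replacements.

```python
# Toy illustration of the two-level mapping: a lexical form (stem +
# boundary "+" + suffix) becomes a surface form through a context rule.
# The "language" and rule are invented; real Mari rules are far richer.

RULES = [
    # (lexical pattern at the boundary, surface replacement):
    # a suffix-initial sibilant assimilates to a stem-final sibilant.
    ("s+š", "šš"),
]

def surface(lexical):
    """Apply the rules, then drop remaining morpheme boundaries."""
    for pattern, repl in RULES:
        lexical = lexical.replace(pattern, repl)
    return lexical.replace("+", "")

print(surface("kos+šem"))   # rule fires at the sibilant boundary
print(surface("tol+šem"))   # no rule applies; boundary simply removed
```

The point of the two-level formalism is that such correspondences are stated declaratively and apply in parallel, so the same rule set works for both generation and analysis.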
Abstract:
Neuropeptide Y (NPY) is a widely expressed neurotransmitter in the central and peripheral nervous systems. A thymidine 1128 to cytosine substitution in the signal sequence of preproNPY results in a single amino acid change in which leucine is changed to proline. This L7P change leads to a conformational change of the signal sequence, which can have an effect on the intracellular processing of NPY. The L7P polymorphism was originally associated with higher total and LDL cholesterol levels in obese subjects. It has also been associated with several other physiological and pathophysiological responses, such as atherosclerosis and type 2 diabetes. However, the changes at the cellular level due to the preproNPY signal sequence L7P polymorphism were not known. The aims of the current thesis were to study the effects of the [p.L7]+[p.L7] and the [p.L7]+[p.P7] genotypes in primary cultured and genotyped human umbilical vein endothelial cells (HUVEC), in neuroblastoma (SK-N-BE(2)) cells and in fibroblast (CHO-K1) cells. Also, the putative effects of the L7P polymorphism on proliferation, apoptosis, and LDL and nitric oxide metabolism were investigated. In the course of the studies, a fragment of NPY targeted to mitochondria was found. For this putative mitochondrial NPY fragment, the aim was to study the translational preferences and the mobility of the protein. The intracellular distribution of NPY was found to differ between the [p.L7]+[p.L7] and the [p.L7]+[p.P7] genotypes. NPY immunoreactivity was prominent in the [p.L7]+[p.P7] cells, while proNPY immunoreactivity was prominent in the [p.L7]+[p.L7] genotype cells. In the proliferation experiments there was a difference in the [p.L7]+[p.L7] genotype cells between early-passage and late-passage (aged) cells; proliferation was increased in the aged cells. NPY increased the growth of the cells with the [p.L7]+[p.P7] genotype. 
Apoptosis did not seem to differ between the genotypes, but in the aged cells with the [p.L7]+[p.L7] genotype, LDL uptake was found to be elevated. Furthermore, the genotype seemed to have a strong effect on nitric oxide metabolism. The results indicated that the mobility of the NPY protein inside the cells was increased with the P7-containing constructs. The existence of the mitochondria-targeted NPY fragment was verified, and the translational preferences were shown to depend on the origin of the cells. Cells of neuronal origin preferred the translation of mature NPY (NPY1-36), whereas non-neuronal cells translated both NPY and the mitochondrial fragment of NPY. The mobility of the mitochondrial fragment was found to be minimal. The functionality of the mitochondrial NPY fragment remains to be investigated. The L7P polymorphism in preproNPY causes a series of intracellular changes. These changes may contribute to the state of cellular senescence and vascular tone, and may lead to endothelial dysfunction and even to increased susceptibility to diseases like atherosclerosis and type 2 diabetes.
Abstract:
The research around performance measurement and management has focused mainly on the design, implementation and use of performance measurement systems. However, there is little evidence about the actual impacts of performance measurement on the different levels of business and operations of organisations, as well as the underlying factors that lead to a positive impact of performance measurement. The study thus focuses on this research gap, which can be considered both important and challenging to cover. The first objective of the study was to examine the impacts of performance measurement on different aspects of management, leadership and the quality of working life, after which the factors that facilitate and improve performance and performance measurement at the operative level of an organisation were examined. The second objective was to study how these factors operate in practice. The third objective focused on the construction of a framework for successful operative-level performance measurement and the utilisation of the factors in the organisations. The research objectives have been studied through six research papers utilising empirical data from three separate studies, comprising two sets of interview data and one set of quantitative data. The study applies mainly the hermeneutical research approach. As a contribution of the study, a framework for successful operative-level performance measurement was formed by matching the findings of the current study with performance measurement theory. The study extends prior research regarding the impacts of performance measurement and the factors that have a positive effect on operative-level performance and performance measurement. The results indicate that, under suitable circumstances, performance measurement has positive impacts on different aspects of management, leadership, and the quality of working life. 
The results reveal, for example, that the perceptions of employees and management regarding the impacts of performance measurement on leadership style differ considerably. Furthermore, the fragmented literature has been reorganised into six factors that facilitate and improve the performance of operations and employees, and the use of performance measurement at the operative level of an organisation. Regarding the managerial implications of the study, managers who work with performance measurement can utilise the framework, for example by putting its different phases into practice.
Abstract:
The productivity, quality and cost efficiency of welding work are critical for the metal industry today. Welding processes must become more effective, and this can be achieved through mechanization and automation. Such systems are always expensive and have to pay back the investment. It is therefore very important to optimize the needed intelligence, and thereby the needed automation level, so that a company gets the best profit. This intelligence and automation level was earlier classified in several different ways that are not useful for optimizing the mechanization or automation of welding. In this study, the intelligence of a welding system is defined in a new way, so as to enable the welding system to produce a weld that is good enough. A new way is developed to classify and select the internal intelligence level of a welding system needed to produce the weld efficiently. This classification covers the possible need for human work and its effect on the weld and its quality, but does not exclude any welding processes or methods. A totally new way is also developed to calculate the best optimization of the needed intelligence level in welding. The target of this optimization is the best possible productivity and quality, together with an economically optimized solution, for several different cases. This new optimization method is based on the product type, economic productivity, the batch size of products, quality, and the criteria of usage. Intelligence classification and optimization have not previously been based on the product to be made. It is now possible to find the best type of welding system needed to weld different types of products. This calculation process is a universal way of optimizing the needed automation or mechanization level when improving the productivity of welding. This study helps industry to improve the productivity, quality and cost efficiency of welding workshops.
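The payback reasoning behind such an optimization can be sketched as a toy calculation: among welding-system intelligence levels capable of producing acceptable quality, pick the one with the best net benefit for the planned batch. The levels, investment costs and per-unit savings below are invented for illustration and are not the thesis's actual classification or figures.

```python
# Hypothetical toy version of the economic selection: choose the
# intelligence/automation level with the best net benefit for a batch.
# Levels, costs and savings are invented for illustration.

levels = [
    # (name, meets required weld quality, investment, saving per unit welded)
    ("manual",     True,       0,  0.0),
    ("mechanized", True,  50_000,  4.0),
    ("automated",  True, 200_000,  9.0),
]

def best_level(batch_size):
    """Return the quality-capable level with the highest net benefit."""
    candidates = [(batch_size * saving - invest, name)
                  for name, ok, invest, saving in levels if ok]
    return max(candidates)[1]
```

The sketch already shows the key qualitative result: small batches favour manual or lightly mechanized welding, while large batches justify the investment in higher intelligence and automation levels.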