919 results for advanced accounting management
Abstract:
The purpose of this thesis was to examine aspects of payroll accounting and their effects on management accounting and control, and to put forward ideas for developing the cooperation between payroll accountants and controllers. The starting point was the current state of cooperation between financial control and payroll accounting, together with prior research. The study is an empirical case study in which qualitative analysis was used as the research method, carried out with the help of semi-structured interviews and observation. The study highlighted the challenges brought by globalization and societal change. With these changes, the shifting duties of both parties and work in various shared service centers have transformed operations and the parties' roles. The results show that the information payroll accountants produce for controllers can best be developed by investing in information transfer, by building cooperation, and by producing various reports to support forecasts. In addition, the underlying factors must be understood, and both parties' knowledge of financial control and payroll accounting must be increased. By making these changes visible, high-quality information production can be built, allowing payroll accountants to play a stronger part in supporting the business.
Abstract:
This Bachelor's thesis examines pricing and customer profitability in the management accounting and marketing literatures. The aim was to examine the differences between these fields in pricing and customer profitability, to show how each is used, and to show how the topics are linked to each other. The work was carried out as a literature review drawing on printed literature and electronic information retrieval. Pricing differs greatly between the marketing and management accounting literatures. Management accounting deals with customer profitability measurement models that analyze the present, whereas marketing deals with methods that analyze the future. Both management accounting and marketing are increasingly interested in analyzing customer profitability. Pricing and customer profitability affect each other directly. Activity-based costing (ABC) makes it possible to analyze product costs and customers simultaneously. In the future, by improving the synergy between these topics and fields, companies could improve their profitability considerably.
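The ABC point above can be made concrete with a small numeric sketch; the activities, rates, and figures below are hypothetical, not taken from the thesis:

```python
# Activity-based costing (ABC) sketch with hypothetical activities and rates.
# Overhead is traced to activity cost pools and then assigned to each
# customer according to that customer's consumption of activity drivers.

activity_rates = {            # cost per driver unit (assumed figures)
    "order_handling": 50.0,   # per order
    "delivery": 20.0,         # per delivery
    "support_calls": 15.0,    # per call
}

def customer_profit(revenue, direct_cost, usage):
    """Profit after assigning activity costs by driver usage."""
    activity_cost = sum(activity_rates[a] * units for a, units in usage.items())
    return revenue - direct_cost - activity_cost

# A high-revenue customer can still be barely profitable if it consumes
# many costly activities: 10000 - 7000 - (2000 + 600 + 300) = 100.
p = customer_profit(10_000, 7_000,
                    {"order_handling": 40, "delivery": 30, "support_calls": 20})
print(p)  # 100.0
```

The same driver-usage data can be aggregated per product instead of per customer, which is what allows product and customer analyses to be run simultaneously.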
Abstract:
This Bachelor's thesis examines the utilization of information produced by financial management in SMEs.
Abstract:
As exploration of our solar system and outer space moves into the future, spacecraft are being developed to venture on increasingly challenging missions with bold objectives. The spacecraft tasked with completing these missions are becoming progressively more complex. This increases the potential for mission failure due to hardware malfunctions and unexpected spacecraft behavior. A solution to this problem lies in the development of an advanced fault management system. Fault management enables a spacecraft to respond to failures and take repair actions so that it may continue its mission. The two main approaches developed for spacecraft fault management have been rule-based and model-based systems. Rules map sensor information to system behaviors, thus achieving fast response times and making the actions of the fault management system explicit. These rules are developed by having a human reason through the interactions between spacecraft components, a process limited by the number of interactions a human can reason about correctly. In the model-based approach, the human provides component models, and the fault management system reasons automatically about system-wide interactions and complex fault combinations. This approach improves correctness and makes the underlying system models explicit, whereas they are implicit in the rule-based approach. We propose a fault detection engine, Compiled Mode Estimation (CME), that unifies the strengths of the rule-based and model-based approaches. CME uses a compiled model to determine spacecraft behavior more accurately. Reasoning related to fault detection is compiled off-line into a set of concurrent, localized diagnostic rules. These are then combined on-line with sensor information to reconstruct the diagnosis of the system. These rules enable a human to inspect the diagnostic consequences of CME. Additionally, CME is capable of reasoning through component interactions automatically while still providing fast and correct responses. The implementation of this engine has been tested against the NEAR spacecraft's advanced rule-based system, detecting failures beyond those covered by the rules. This evolution in fault detection will enable future missions to explore the furthest reaches of the solar system without the burden of human intervention to repair failed components.
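A minimal sketch of the rule-based side of this idea, with invented component names, thresholds, and sensor readings (not the actual CME engine): each rule is local to one component and maps sensor values to a component mode, and the localized results are combined into a system-wide diagnosis.

```python
# Toy rule-based fault detection: localized diagnostic rules combined on-line.
# All component names, thresholds, and sensor keys are illustrative assumptions.

def thruster_rule(sensors):
    # Commanded thrust with almost no measured acceleration suggests failure.
    if sensors["thrust_cmd"] and sensors["accel"] < 0.1:
        return ("thruster", "failed")
    return ("thruster", "nominal")

def valve_rule(sensors):
    # Valve commanded open but no flow measured suggests it is stuck closed.
    if sensors["valve_open_cmd"] and sensors["flow"] == 0.0:
        return ("valve", "stuck_closed")
    return ("valve", "nominal")

RULES = [thruster_rule, valve_rule]

def diagnose(sensors):
    """Run each localized rule and collect the combined system diagnosis."""
    return dict(rule(sensors) for rule in RULES)

readings = {"thrust_cmd": True, "accel": 0.02, "valve_open_cmd": True, "flow": 0.0}
print(diagnose(readings))  # {'thruster': 'failed', 'valve': 'stuck_closed'}
```

The model-based contribution the abstract describes is generating such rules automatically from component models off-line, rather than having a human author each one.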
Abstract:
The parasitoid Encarsia formosa Gahan (Hymenoptera: Aphelinidae) has been used successfully for the control of Trialeurodes vaporariorum (Westwood) (Homoptera: Aleyrodidae). The development of UV-blocking plastic films has added a new component to future integrated pest management systems by disrupting insect pest infestation when UV light is excluded. Because both T. vaporariorum and E. formosa are reported to have similar spectral efficiency, there was a need to identify the impact of UV-blocking films on the dispersal behavior of both the pest and the natural enemy. In field studies, using choice-chamber experiments, E. formosa showed some preference to disperse into compartments where less UV light was blocked. However, further studies indicated that the effect was primarily attributable to the different light diffusion properties of the films tested. Thus, unlike its whitefly host, when the UV-absorbing properties of the films were similar but the light diffusion properties differed, E. formosa adults preferred to disperse into compartments clad with films that had high light diffusion properties. When the plastic films differed most in their UV-absorbing capacity and had no light-diffusion capability, the initial dispersal of E. formosa between treatments was similar, although a small preference toward the environment with UV light was observed over time. When parasitoid dispersal was measured 3 h after release, more parasitoids were found on plants, suggesting that the parasitoids would search plants for whitefly hosts even in a UV-blocked light environment. The potential for the integration of UV-blocking films with E. formosa in an advanced whitefly management system is discussed.
Abstract:
Natural rivers consist of networks of junctions and streams, and erosion and sedimentation occur under specific flow conditions. During the flood season, large discharges flow through the river and the river bed is changed by high flow velocities. The flow characteristics of junction areas in particular are very complex. The purpose of this study is to analyze the flow characteristics at a channel junction, which are most influenced by large discharges such as floods and by inflow from a tributary. We investigate the flow characteristics using the hydrodynamics and transport modules of MIKE 3 FM, a helpful tool for analyzing 3D hydrodynamics and erosion and sedimentation effects on the channel bed. We analyze the flow characteristics at the channel junction and also consider hydraulic structures, such as a bridge pier, that influence flow characteristics such as flow velocity, water level, erosion, and scour depth in the channel bed. In the model, we controlled the discharge condition according to the Froude number and varied the grain diameter and the flow ratio between the main stream and the tributary. As a result, flow velocity, water level, and erosion and sediment depth are analyzed, and we propose equations relating these results. This study will help in understanding the flow characteristics and the influence of hydraulic structures at channel junctions. Acknowledgments: This research was supported by a grant (12-TI-C01) from the Advanced Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
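The discharge conditions mentioned above are classified by the Froude number; for open-channel flow this is Fr = v / sqrt(g h), with v the mean velocity and h the flow depth. A small sketch with illustrative values (not data from the study):

```python
import math

def froude_number(velocity, depth, g=9.81):
    """Open-channel Froude number Fr = v / sqrt(g * h).
    Fr < 1: subcritical flow; Fr = 1: critical; Fr > 1: supercritical."""
    return velocity / math.sqrt(g * depth)

# Illustrative values: 2 m/s mean velocity at 1.5 m depth.
fr = froude_number(velocity=2.0, depth=1.5)
print(round(fr, 3))  # 0.521 -> subcritical flow
```

Varying the discharge (and hence velocity and depth) sweeps the junction through subcritical and supercritical regimes, which is what the controlled-discharge runs in the study explore.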
Abstract:
This work defines and implements a power-consumption control system for computer networks, aiming to extend the network's operating time when running on limited resources and to reduce energy consumption under normal supply conditions. In the definition of the system, named NetPower, a structure was established through which a manager (coordinator) monitors the activities of the equipment attached to the network and determines changes in their respective power states, according to need or to optimization criteria. Equipment can be assigned different privileges in a hierarchy adaptable to various environments. A backup manager provides a fallback in case the manager fails. The implementation is based on the SNMP (Simple Network Management Protocol) for management, and the standards considered for equipment power control are chiefly Advanced Power Management (APM) and the Advanced Configuration and Power Interface Specification (ACPI). Besides the architecture of the manager and the agents, a MIB (Management Information Base) for power control was also defined. In the system design, priority was given to usability in any network environment, without preference for equipment from any specific manufacturer or for any hardware architecture. Public-domain technologies were used whenever possible. In the future, this system could become part of operating system distributions, bringing power control to networks. The text compares existing power-control software, presents the power-control features available in computing equipment, and then describes the management protocol used. Finally, the detailed proposal of the control system is presented and the prototype implementation is described.
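A toy sketch of the coordinator's decision logic described above; the state names, thresholds, and priority scheme are assumptions for illustration, since the real system negotiates power states over SNMP against a custom MIB:

```python
# Hypothetical coordinator policy: map a host's idle time to an ACPI-like
# power state, stepping down higher-priority hosts more conservatively.
# States, thresholds, and the priority scaling are invented for this sketch.

POWER_STATES = ["working", "standby", "suspend", "off"]  # shallow -> deep

def target_state(idle_minutes, priority):
    """Pick the deepest power state whose idle threshold has been reached.
    priority >= 1; larger values stretch the thresholds, keeping
    important hosts awake longer."""
    thresholds = [0, 10, 30, 120]                 # idle minutes per state
    scaled = [t * priority for t in thresholds]
    state = "working"
    for name, limit in zip(POWER_STATES, scaled):
        if idle_minutes >= limit:
            state = name
    return state

print(target_state(idle_minutes=45, priority=1))  # suspend
print(target_state(idle_minutes=45, priority=2))  # standby
```

In the actual architecture, the coordinator would read activity data from agents via SNMP GET operations on the power-control MIB and issue SET operations to request the state change.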
Abstract:
This thesis develops and evaluates a business model for connected full electric vehicles (FEV) for the European market. Despite a promoting political environment, various barriers have thus far prevented the FEV from becoming a mass-market vehicle. Besides cost, the most noteworthy of these barriers is range anxiety, a product of FEVs' limited range, the lacking availability of charging infrastructure, and long recharging times. Connected FEVs, which maintain a constant connection to the surrounding infrastructure, appear to be a promising element in overcoming drivers' range anxiety. Yet their successful application requires a well-functioning FEV ecosystem, which can only be created through the collaboration of various stakeholders such as original equipment manufacturers (OEM), first-tier suppliers (FTS), charging infrastructure and service providers (CISP), utilities, communication enablers, and governments. This thesis explores and evaluates what a business model jointly created by these stakeholders could look like, i.e. how stakeholders could collaborate in the design of products, services, infrastructure, and advanced mobility management to meet drivers with a sensible value proposition that is at least equivalent to that of internal combustion engine (ICE) cars. It suggests that this value proposition will be an end-to-end package provided by CISPs or OEMs that comprises mobility packages (incl. pay-per-mile plans, battery leasing, and charging and battery swapping (BS) infrastructure) and FEVs equipped with an on-board unit (OBU), combined with additional services targeted at reducing range anxiety. From a theoretical point of view, the thesis answers the question of which business model framework is suitable for developing a holistic, i.e. all-stakeholder-comprising, business model for connected FEVs, and defines such a business model. In doing so, the thesis provides the first comprehensive business-model research findings on connected FEVs, as prior work focused on the much less complex scenario featuring only "offline" FEVs.
Abstract:
Purpose - The purpose of this paper is to verify whether Brazilian companies are adopting environmental requirements in the supplier selection process, and to analyze whether there is a relation between the level of environmental management maturity and the inclusion of environmental criteria in companies' selection of suppliers.
Design/methodology/approach - A review of mainstream literature on environmental management, traditional criteria in the supplier selection process, and the incorporation of environmental requirements in this context. The empirical strategy is based on five Brazilian case studies with industrial companies, using face-to-face interviews and informal conversations, clarifications by e-mail with representatives from purchasing, environmental management, logistics, and other areas, observation, and the collection of company documents.
Findings - Based on the cases, it is concluded that companies still use traditional criteria to select suppliers, such as quality and cost, and do not adopt environmental requirements in the supplier selection process in a uniform manner. The evidence shows that the level of environmental management maturity influences the depth with which companies adopt environmental criteria when selecting suppliers: a company with more advanced environmental management adopts more formal procedures for selecting environmentally appropriate suppliers than others.
Originality/value - This is the first known study to verify whether Brazilian companies are adopting environmental requirements in the supplier selection process.
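One way to picture the maturity finding is a weighted-criteria supplier score in which the weight placed on environmental performance grows with the buyer's environmental management maturity; all weights and scores below are hypothetical, not from the paper:

```python
# Hypothetical weighted supplier score: as environmental maturity rises,
# weight shifts from the traditional criteria (quality, cost) toward the
# environmental criterion.

def supplier_score(scores, env_maturity):
    """scores: dict with 'quality', 'cost', 'environment' in [0, 1].
    env_maturity in [0, 1] moves the environmental weight from 10% to 50%."""
    w_env = 0.1 + 0.4 * env_maturity
    w_trad = (1.0 - w_env) / 2            # split between quality and cost
    return (w_trad * scores["quality"]
            + w_trad * scores["cost"]
            + w_env * scores["environment"])

s = {"quality": 0.9, "cost": 0.8, "environment": 0.4}
# A supplier strong on quality/cost but weak environmentally ranks well
# for an immature buyer and poorly for a mature one.
print(round(supplier_score(s, env_maturity=0.0), 3))  # 0.805
print(round(supplier_score(s, env_maturity=1.0), 3))  # 0.625
```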
Abstract:
Graduate Program in Production Engineering - FEB
Abstract:
Social networks on the internet have experienced rapid growth and joined millions of users in Brazil and throughout the world. Such networks allow groups of people to communicate and exchange information. File sharing is also a growing activity on the internet and is done in various ways. However, no applications are yet available that enable file sharing on Facebook, the premier social network today. This study aims to investigate how users use Facebook and what their file-sharing practices are. Due to the exploratory nature of this research, we opted for a data-collection survey applied over the web. From the data analysis, we found frequent use of file sharing but no interest in paid services. As for Facebook, there was extensive use of applications. The results show a favourable scenario for applications that allow file sharing on Facebook.
Abstract:
The purpose of this article is to propose a methodological approach to analyse how the positioning of actors in an interorganisational network can influence the elements of value creation. This study fills a gap in the existing literature by exploring the relationship between the positioning of actors and value creation in the context of interorganisational networks. The case study method was employed, and data was obtained from four companies of an interorganisational network, located in Brazil, that produces earthmoving equipment. The central actor in this network benefits through access to the resources, power, and information of the other network actors. The centrality position appears to help this company in the absorption and diffusion of knowledge among the other network actors. The research indicates that a dense core (through strong ties) and redundancy (for triangulation and knowledge absorption) benefit the following value-creation elements: tangible, intangible, services, and economic.
Abstract:
Nanotechnology is a research area of recent development that deals with the manipulation and control of matter with dimensions ranging from 1 to 100 nanometers. At the nanoscale, materials exhibit singular physical, chemical and biological phenomena, very different from those manifested at the conventional scale. In medicine, nanosized compounds and nanostructured materials offer improved drug targeting and efficacy with respect to traditional formulations, and reveal novel diagnostic and therapeutic properties. Nevertheless, the complexity of information at the nano level is much higher than the complexity at the conventional biological levels (from populations to the cell). Thus, any nanomedical research workflow inherently demands advanced information management. Unfortunately, Biomedical Informatics (BMI) has not yet provided the necessary framework to deal with such information challenges, nor adapted its methods and tools to the new research field. In this context, the novel area of nanoinformatics aims to build new bridges between medicine, nanotechnology and informatics, allowing the application of computational methods to solve informational issues at the wide intersection between biomedicine and nanotechnology.
The above observations determine the context of this doctoral dissertation, which is focused on analyzing the nanomedical domain in-depth, and developing nanoinformatics strategies and tools to map across disciplines, data sources, computational resources, and information extraction and text mining techniques, for leveraging available nanomedical data. The author analyzes, through real-life case studies, some research tasks in nanomedicine that would require or could benefit from the use of nanoinformatics methods and tools, illustrating present drawbacks and limitations of BMI approaches to deal with data belonging to the nanomedical domain. Three different scenarios, comparing both the biomedical and nanomedical contexts, are discussed as examples of activities that researchers would perform while conducting their research: i) searching over the Web for data sources and computational resources supporting their research; ii) searching the literature for experimental results and publications related to their research, and iii) searching clinical trial registries for clinical results related to their research. The development of these activities will depend on the use of informatics tools and services, such as web browsers, databases of citations and abstracts indexing the biomedical literature, and web-based clinical trial registries, respectively. For each scenario, this document provides a detailed analysis of the potential information barriers that could hamper the successful development of the different research tasks in both fields (biomedicine and nanomedicine), emphasizing the existing challenges for nanomedical research —where the major barriers have been found. The author illustrates how the application of BMI methodologies to these scenarios can be proven successful in the biomedical domain, whilst these methodologies present severe limitations when applied to the nanomedical context. 
To address such limitations, the author proposes an original nanoinformatics approach specifically designed to deal with the special characteristics of information at the nano level. This approach consists of an in-depth analysis of the scientific literature and available clinical trial registries to extract relevant information about experiments and results in nanomedicine —textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.—, followed by the development of mechanisms to automatically structure and analyze this information. This analysis resulted in the generation of a gold standard —a manually annotated training or reference set—, which was applied to the automatic classification of clinical trial summaries, distinguishing studies focused on nanodrugs and nanodevices from those aimed at testing traditional pharmaceuticals. The present work aims to provide the necessary methods for organizing, curating and validating existing nanomedical data on a scale suitable for decision-making. Similar analysis for different nanomedical research tasks would help to detect which nanoinformatics resources are required to meet current goals in the field, as well as to generate densely populated and machine-interpretable reference datasets from the literature and other unstructured sources for further testing novel algorithms and inferring new valuable information for nanomedicine.
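The classification task described above can be pictured with a deliberately simple keyword baseline; the term list and example summaries below are invented for illustration and are far cruder than the manually annotated gold-standard approach of the dissertation:

```python
# Toy baseline for separating nano-focused clinical trial summaries from
# conventional ones. The keyword list and the example summaries are
# illustrative assumptions, not the dissertation's actual method or data.

NANO_TERMS = {"nanoparticle", "nanodrug", "nanodevice", "liposome", "nanoscale"}

def is_nano_trial(summary):
    """Flag a trial summary as nano-focused if it mentions any nano term."""
    words = {w.strip(".,;:()").lower() for w in summary.split()}
    return bool(words & NANO_TERMS)

print(is_nano_trial("Phase II trial of nanoparticle albumin-bound paclitaxel"))  # True
print(is_nano_trial("Randomized trial of aspirin versus placebo"))               # False
```

A gold standard of the kind described, i.e. summaries hand-labeled as nano or conventional, is what allows such heuristics (or trained classifiers) to be evaluated and improved.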
Abstract:
Some lessons copyrighted by the Tanner-Gilman schools.