975 results for data management policies


Relevance:

90.00%

Publisher:

Abstract:

Work presented within the scope of the Master's degree in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.

Relevance:

90.00%

Publisher:

Abstract:

E-mail is currently the main form of communication in organisations. The information contained in e-mails represents a large share of companies' intangible capital: knowledge. In practice, e-mail management within the organisation is carried out by each employee individually. However, given the volume of information created, the way e-mail is managed raises some questions. The general objective of this study is to understand how e-mail is managed in the Services of the Instituto Politécnico de Viana do Castelo (IPVC), analysing how e-mail is used in order to obtain elements that allow the development of standards/guidelines for improving this management. The methodology used in this research was the Quadripolar Method, applied through a case study, to analyse e-mail management in the IPVC Services. In a first phase, a questionnaire survey was addressed to all public higher education institutions in Portugal concerning the existence of e-mail management policies; subsequently, a questionnaire was addressed to all staff of all IPVC Services in order to understand how e-mail and its management are handled in the working context of an organisation. Data analysis was carried out by establishing relationships/associations between variables using descriptive statistics. The results show that there is still a long way to go, since there is not yet a lasting awareness of the importance of e-mail and of its management, both for retrieving information quickly and for ensuring the long-term preservation of e-mail.

Relevance:

90.00%

Publisher:

Abstract:

Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.

Relevance:

90.00%

Publisher:

Abstract:

Dissertation submitted to obtain the degree of Master in Informatics Engineering.

Relevance:

90.00%

Publisher:

Abstract:

Within a research project on «academic excellence in the state school», this paper is a contribution to the sociological reflection on the cultural and organisational characteristics of the school and its relationship with the academic success of students. The data we present stem from a case study underway at a secondary school in the north of Portugal, referring to the universe of students that since 2003 have distinguished themselves for achieving grades equal to or greater than 18 (on a scale of 0 to 20) and have thus been included in the school’s Framework of Excellence. From a contextual approach to this educational practice, we focused on the cultural characteristics of the school/subject as analytical support for the study of school and non-school dimensions in their mutual connections. To this end, we used the information from document analysis and data collected from a questionnaire survey administered to more than two-thirds of the students included in the above-mentioned Framework of Excellence. Subsequently, we will use the data from this survey to understand the extent to which academic excellence is perceived as an indivisible social construction of the school’s political and organisational matrix, particularly in terms of the educational and teaching guidelines adopted by the management body. We will conclude by questioning the meaning of the school’s management policies regarding the emphasis on educational outcomes, with particular focus on the representations of excellent students in the processes of school leadership, teaching organisation, school merit and justice.

Relevance:

90.00%

Publisher:

Abstract:

Dissertation for the integrated master's degree in Industrial Engineering and Management.

Relevance:

90.00%

Publisher:

Abstract:

The research described in this thesis has been developed as a part of the Reliability and Field Data Management for Multi-component Products (REFIDAM) Project. This project was funded under the Applied Research Grants Scheme administered by Enterprise Ireland. The project was a partnership between Galway-Mayo Institute of Technology and Thermo King Europe. The project aimed to develop a system to manage the information required for reliability assessment and improvement of multi-component products, by establishing information flows within the company and information exchange with fleet users.

Relevance:

90.00%

Publisher:

Abstract:

The research described in this thesis has been developed as a part of the Reliability and Field Data Management for Multi-Component Products (REFIDAM) Project. This project was funded under the Applied Research Grants Scheme administered by Enterprise Ireland. The project was a partnership between Galway-Mayo Institute of Technology and an industrial company, Thermo King Europe. The project aimed to develop a system to manage the information required for maintenance costing, cost of ownership, reliability assessment and improvement of multi-component products, by establishing information flows between the customer network and across the Thermo King organisation.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: DNA sequence integrity, mRNA concentrations and protein-DNA interactions have been subject to genome-wide analyses based on microarrays with ever increasing efficiency and reliability over the past fifteen years. However, novel technologies for Ultra High-Throughput DNA Sequencing (UHTS) have very recently been harnessed to study these phenomena with unprecedented precision. As a consequence, the extensive bioinformatics environment available for array data management, analysis, interpretation and publication must be extended to include these novel sequencing data types. DESCRIPTION: MIMAS was originally conceived as a simple, convenient and local Microarray Information Management and Annotation System focused on GeneChips for expression profiling studies. MIMAS 3.0 enables users to manage data from high-density oligonucleotide SNP Chips, expression arrays (both 3'UTR and tiling) and promoter arrays, BeadArrays, as well as UHTS data, using MIAME-compliant standardized vocabulary. Importantly, researchers can export data in MAGE-TAB format and upload them to the EBI's ArrayExpress certified data repository in a one-step procedure. CONCLUSION: We have vastly extended the capability of the system such that it processes the data output of six types of GeneChips (Affymetrix), two different BeadArrays for mRNA and miRNA (Illumina) and the Genome Analyzer (a popular Ultra High-Throughput DNA Sequencer, Illumina), without compromising its flexibility and user-friendliness. MIMAS, appropriately renamed to the Multiomics Information Management and Annotation System, is currently used by scientists working in approximately 50 academic laboratories and genomics platforms in Switzerland and France. MIMAS 3.0 is freely available via http://multiomics.sourceforge.net/.
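To give a feel for the kind of tab-delimited export the abstract mentions, the following sketch writes a minimal MAGE-TAB-style (SDRF-like) table with Python's csv module. The column names and sample values are illustrative assumptions; this is not MIMAS's actual export code nor a complete MAGE-TAB document.

```python
# Minimal sketch of a MAGE-TAB-style tab-delimited (SDRF-like) export.
# Column names and the sample record are illustrative assumptions, not
# the MIMAS 3.0 export logic or a full MAGE-TAB document.
import csv

def write_sdrf_like(path, samples):
    """Write sample annotations as a tab-delimited table."""
    columns = ["Source Name", "Characteristics[organism]",
               "Array Design REF", "Derived Array Data File"]
    with open(path, "w", newline="") as handle:
        writer = csv.writer(handle, delimiter="\t")
        writer.writerow(columns)
        for sample in samples:
            writer.writerow([sample.get(column, "") for column in columns])

if __name__ == "__main__":
    write_sdrf_like("experiment.sdrf.txt", [
        {"Source Name": "sample_1",
         "Characteristics[organism]": "Saccharomyces cerevisiae",
         "Array Design REF": "A-AFFY-0000",        # hypothetical accession
         "Derived Array Data File": "sample_1.CHP"},
    ])
```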

Relevance:

90.00%

Publisher:

Abstract:

The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands for the extraction of added value from these technologies and data have created a number of significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and exploit the value of the data to which they have access, be this in the form of "Big Data" available from different external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must also, frequently, demonstrate that they are in compliance with current standards, laws and regulations. This thesis sets out to explore the nature of the technologies that organisations might utilise, identify the most pertinent constraints and risks, and propose a framework for the management of data from discovery to external hosting that will allow the most significant risks to be managed through the definition, implementation, and performance of appropriate internal control activities.

Relevance:

90.00%

Publisher:

Abstract:

The constant scientific production in universities and research centres makes these organizations produce and acquire a great amount of data in a short period of time. Due to this large quantity of data, research organizations become potentially vulnerable to the impact of information overload, which can lead to chaos in information management. In this context, the development of data catalogues emerges as one possible solution to the problems of (I) organizing and (II) managing the data. In the scientific domain, data catalogues are implemented with the standard for digital and geospatial metadata and are broadly used in the process of cataloguing scientific information. The aim of this work is to present the characteristics of access and storage of metadata in database systems in order to improve the description and dissemination of scientific data. Relevant aspects that should be analysed during the planning stage will be considered, since they can determine the success of the implementation. The use of data catalogues by research organizations can be a way to promote and facilitate the dissemination of scientific data, to avoid duplication of effort, and to encourage the use of data that have already been collected, processed and stored.
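As a rough illustration of how a catalogue entry might be stored in a database system, the sketch below defines a minimal metadata record and persists it with SQLite. The field names are only loosely inspired by digital/geospatial metadata standards and the schema is a placeholder, not the one discussed in the work summarised above.

```python
# Minimal sketch of storing catalogue metadata records in a relational
# database. Field names and schema are illustrative placeholders.
import sqlite3
from dataclasses import dataclass, astuple

@dataclass
class MetadataRecord:
    title: str
    abstract: str
    keywords: str          # comma-separated keywords
    bounding_box: str      # "west,south,east,north" in decimal degrees
    originator: str

def store(records, db_path="catalogue.db"):
    """Create the table if needed and insert the given records."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS metadata (
                   title TEXT, abstract TEXT, keywords TEXT,
                   bounding_box TEXT, originator TEXT)"""
        )
        conn.executemany(
            "INSERT INTO metadata VALUES (?, ?, ?, ?, ?)",
            [astuple(record) for record in records],
        )

store([MetadataRecord(
    title="Soil survey 2004",                       # hypothetical dataset
    abstract="Sampled soil properties for catchment X.",
    keywords="soil,survey",
    bounding_box="-8.7,41.5,-8.4,41.8",
    originator="Research group Y",
)])
```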

Relevance:

90.00%

Publisher:

Abstract:

Nowadays the variety of fuels used in power boilers is widening, and new boiler constructions and operating models have to be developed. This research and development is done in small pilot plants, where a faster analysis of the boiler mass and heat balance is needed so that the right decisions can be identified and made already during a test run. The obstacle to determining the boiler balance during test runs is the long process of chemical analysis of the collected input and output matter samples. The present work concentrates on finding a way to determine the boiler balance without chemical analyses and on optimising the test rig to get the best possible accuracy for the heat and mass balance of the boiler. The purpose of this work was to create an automatic boiler balance calculation method for the 4 MW CFB/BFB pilot boiler of Kvaerner Pulping Oy located in Messukylä in Tampere. The calculation was created in the data management computer of the pilot plant's automation system. The calculation is made in a Microsoft Excel environment, which gives a good base and functions for handling large databases and calculations without any delicate programming. The automation system in the pilot plant was reconstructed and updated by Metso Automation Oy during 2001, and the new MetsoDNA system has good data management properties, which are necessary for large calculations such as the boiler balance calculation. Two possible methods for calculating the boiler balance during a test run were found. Either the fuel flow is determined and used to calculate the boiler's mass balance, or the unburned carbon loss is estimated and the mass balance of the boiler is calculated on the basis of the boiler's heat balance. Both methods have their own weaknesses, so they were implemented in parallel in the calculation and the choice of method was left to the user. The user also needs to define the fuels used and some solid mass flows that are not measured automatically by the automation system. A sensitivity analysis showed that the most essential values for accurate boiler balance determination are the flue gas oxygen content, the boiler's measured heat output and the lower heating value of the fuel. The theoretical part of this work concentrates on the error management of these measurements and analyses, and on measurement accuracy and boiler balance calculation in theory. The empirical part concentrates on the creation of the balance calculation for the boiler in question and on describing the work environment.
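To make the sensitivity result more concrete, here is a small numerical sketch of the heat-balance route: the fuel mass flow is estimated from the measured heat output, the fuel's lower heating value and an assumed efficiency, and the excess-air ratio is approximated from the dry flue gas oxygen content. The formulas and numbers are textbook simplifications and placeholders, not the Excel calculation built for the Kvaerner pilot boiler.

```python
# Simplified sketch of the quantities the abstract names as most sensitive:
# fuel flow from the heat balance, and excess air from flue gas O2.
# Values and the constant efficiency are placeholders; the real calculation
# also closes the full mass balance of the boiler.

def fuel_flow_from_heat_balance(heat_output_kw, lhv_kj_per_kg, efficiency=0.88):
    """Estimate fuel mass flow [kg/s] from the measured useful heat output.

    heat_output_kw : measured heat output of the boiler [kW]
    lhv_kj_per_kg  : lower heating value of the fuel [kJ/kg]
    efficiency     : assumed overall boiler efficiency (placeholder)
    """
    return heat_output_kw / (lhv_kj_per_kg * efficiency)

def excess_air_ratio(o2_dry_percent):
    """Approximate excess-air ratio from dry flue gas O2 content [% vol]."""
    return 21.0 / (21.0 - o2_dry_percent)

if __name__ == "__main__":
    m_fuel = fuel_flow_from_heat_balance(heat_output_kw=4000.0,   # 4 MW pilot boiler
                                         lhv_kj_per_kg=19000.0)   # e.g. a wood-based fuel
    print(f"fuel flow ~ {m_fuel:.3f} kg/s, excess air ~ {excess_air_ratio(3.5):.2f}")
```

An error in any of these inputs propagates directly into the computed fuel flow, which is why the flue gas oxygen content, the measured heat output and the lower heating value dominate the overall accuracy of the balance.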

Relevance:

90.00%

Publisher:

Abstract:

Intensifying competition between companies has confronted them with difficult challenges. Products should be brought to market faster, new products should be better than the old ones and, above all, better than the competitors' corresponding products. In addition, the design, manufacturing and other costs of the products should not be high. Product data, its management and its exchange are often used to help meet these challenges. Andritz, like other companies, has to take these issues into account in order to succeed in the competition. This work has been done for Andritz, one of the world's leading manufacturers of equipment for paper and pulp production and providers of related maintenance services. Andritz is deploying an ERP system at all of its sites. The company wants to exploit the system as effectively as possible, so product data covering the whole life cycle is also wanted in the system. Part of the product data is created by Andritz's partners and subcontractors, so data exchange with partners should also be handled in such a way that the data goes directly into the ERP system. The goal of this work is therefore to find a solution with which the data exchange between Andritz and its partners can be handled. This master's thesis presents the purpose and importance of product data, its management and its exchange. Different solution alternatives for implementing a data exchange system are presented; some of them are based on general and industry-specific standards, and two commercial products are also presented. The following standards are examined: PaperIXI, papiNet, X-OSCO, the PSK standards and RosettaNet. In addition, the data exchange solutions of the ERP vendor, SAP, are examined. The best of these alternatives are then examined in more detail, and finally the different solutions are compared with each other in order to find the option best suited to Andritz's needs.

Relevance:

90.00%

Publisher:

Abstract:

The goal of this master's thesis is to study what new data management problems arise when the product data of a mass-customised product is managed throughout the product's life cycle, and how these problems could be solved. The problems and challenges are collected from literature sources, and the mass customisation process is mapped to the PLM phases. A solution is investigated by testing how the STEP and PLCS standards, together with a PLM system that supports these standards, could support data management over the life cycle of a mass-customised product. Problems with MC products include the complexity of the product structure, traceability, and change management throughout the life cycle. STEP and PLCS can each support data management in their own areas. However, the generic product structure of an MC product must be linked manually to the life cycle data support. A PLM system can support the life cycle of MC products, but because this functionality is not built into the system, there are still challenges in improving the support for MC products.

Relevance:

90.00%

Publisher:

Abstract:

Third-generation mobile communication systems have now entered the commercial phase. The Universal Mobile Telecommunication System (UMTS) is one third-generation mobile system that will be used in Europe. The aim of this master's thesis is to study how packet-switched data transfer is managed in UMTS networks. The thesis gives an overview of the evolution from the data services of second-generation mobile systems to the high-speed third-generation systems. The network architecture of the packet-switched network is presented, and the functionality of the parts most relevant to this thesis is explained. The establishment and release of packet-based data connections, i.e. sessions, as well as the modification of the properties of an active connection, are also presented. The Session Management (SM) protocol is one of the protocols involved in managing a packet data connection; it is covered in detail in this work. An SDL implementation of the SM protocol is presented in the practical part of the thesis.
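As a rough illustration of the session handling that a Session Management entity performs, the sketch below models a packet-data-session state machine with a few states and transitions for activation, modification and deactivation. The state and event names are a simplification chosen for illustration; they are not the SDL implementation described in the thesis nor the complete 3GPP-specified SM state machine.

```python
# Simplified sketch of a packet data session (PDP-context-like) state
# machine: activate, modify and deactivate a session. States, events and
# transitions are illustrative simplifications, not the 3GPP SM protocol.
from enum import Enum, auto

class SessionState(Enum):
    INACTIVE = auto()
    ACTIVATE_PENDING = auto()
    ACTIVE = auto()
    MODIFY_PENDING = auto()
    DEACTIVATE_PENDING = auto()

# (current state, event) -> next state
TRANSITIONS = {
    (SessionState.INACTIVE, "activate_request"): SessionState.ACTIVATE_PENDING,
    (SessionState.ACTIVATE_PENDING, "activate_accept"): SessionState.ACTIVE,
    (SessionState.ACTIVATE_PENDING, "activate_reject"): SessionState.INACTIVE,
    (SessionState.ACTIVE, "modify_request"): SessionState.MODIFY_PENDING,
    (SessionState.MODIFY_PENDING, "modify_accept"): SessionState.ACTIVE,
    (SessionState.ACTIVE, "deactivate_request"): SessionState.DEACTIVATE_PENDING,
    (SessionState.DEACTIVATE_PENDING, "deactivate_accept"): SessionState.INACTIVE,
}

def next_state(state, event):
    """Return the next session state, or raise if the event is not allowed."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"event {event!r} not allowed in state {state.name}")

# Example: a complete activation followed by an orderly deactivation.
state = SessionState.INACTIVE
for event in ("activate_request", "activate_accept",
              "deactivate_request", "deactivate_accept"):
    state = next_state(state, event)
print(state.name)  # INACTIVE
```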