856 results for green information systems
Abstract:
Background: Recent advances in medical and biological technology have stimulated the development of new testing systems that produce huge and varied volumes of molecular and clinical data. These growing data volumes pose significant challenges for the information processing systems of research centers. In addition, the routines of a genomics laboratory are typically characterized by highly parallel testing and constant procedure changes. Results: This paper describes a formal approach to this challenge through the implementation of a genetic testing management system applied to a human genome laboratory. We introduce the Human Genome Research Center Information System (CEGH) in Brazil, a system that is able to accommodate constant changes in human genome testing and can provide patients with results updated against the most recent validated genetic knowledge. Our approach uses a common repository for process planning to ensure reusability, specification, instantiation, monitoring, and execution of processes, which are defined using a relational database and rigorous control-flow specifications based on process algebra (ACP). The main difference between our approach and related work is that we join two important aspects: 1) process scalability, achieved through the relational database implementation, and 2) process correctness, ensured through process algebra. Furthermore, the software allows end users to define genetic tests without requiring any knowledge of business process notation or process algebra. Conclusions: This paper presents the CEGH information system, a Laboratory Information Management System (LIMS) based on a formal framework to support genetic testing management for Mendelian disorder studies. We demonstrate the feasibility and the usability benefits of a rigorous approach that can specify, validate, and perform genetic testing through easy end-user interfaces.
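The ACP-style control-flow specification mentioned in the abstract can be illustrated with a minimal sketch that models a process as its set of admissible traces, with sequential composition, alternative composition, and free merge (interleaving). The workflow step names below are hypothetical, not CEGH's actual processes:

```python
def seq(p, q):
    """Sequential composition (ACP '.'): every trace of p followed by every trace of q."""
    return {a + b for a in p for b in q}

def alt(p, q):
    """Alternative composition (ACP '+'): behave as p or as q."""
    return p | q

def par(p, q):
    """Free merge (ACP '||' without communication): all interleavings of p- and q-traces."""
    def interleave(a, b):
        if not a:
            yield b
        elif not b:
            yield a
        else:
            for rest in interleave(a[1:], b):
                yield (a[0],) + rest
            for rest in interleave(a, b[1:]):
                yield (b[0],) + rest
    return {t for x in p for y in q for t in interleave(x, y)}

def atom(name):
    """An atomic action: a process with the single one-step trace (name,)."""
    return {(name,)}

# Hypothetical genetic-testing workflow: extract DNA, run two assays in parallel, then report.
workflow = seq(atom("extract_dna"),
               seq(par(atom("assay_A"), atom("assay_B")), atom("report")))
print(sorted(workflow))  # the two valid interleavings of the assays
```

A trace-set model like this is enough to check that every admissible execution starts with DNA extraction and ends with reporting, which is the kind of correctness property the formal framework enforces.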
Abstract:
Green information technology refers to a new line of research focused on ecological, or "green", technologies aimed at protecting the environment. At first one might ask what real motivations could lead to the study of green technologies in the information technology sector: are computers really that polluting? Don't cars, industry, aircraft, and landfills have a greater polluting impact on the environment? Certainly they do, but the polluting footprint of the IT sector should not be underestimated: according to a survey conducted in 2007 by the US research firm Gartner, IT systems are among the major sources of CO2 and other greenhouse gas emissions, accounting for 2% of the planet's total emissions and matching the pollution rate of the aviation sector. The enormous number of computers spread across the world absorbs vast amounts of electricity, and the power plants that feed them emit tons of carbon dioxide, polluting the atmosphere. This thesis aims to highlight the environmental impact of the sector by examining, through an analysis of social and environmental reports, which measures have been adopted by the leaders of the IT industry. The research sets out to show that the largest IT multinationals are aware of the pollution they produce, yet do not adopt enough solutions to limit emissions, setting themselves vacuous future targets.
Abstract:
Over the last three decades, remote sensing and GIS have become increasingly important in the geosciences as a way to improve conventional methods of data collection and map production. This thesis deals with the application of remote sensing and geographic information systems (GIS) to geomorphological investigations. Above all, combining the two techniques has made it possible to map geomorphological forms both comprehensively and in detail. Topographic and geological maps, satellite images, and climate data serve as the basis of this work. The thesis consists of six chapters. The first chapter gives a general overview of the study area, describing its morphological units, its climatic conditions (in particular the aridity indices of the coastal and mountain landscapes), and its settlement pattern. Chapter 2 deals with the regional geology and stratigraphy of the study area and attempts to identify the main formations using ETM satellite imagery, applying colour band composites, image rationing, and supervised classification. Chapter 3 describes the structurally controlled surface forms in order to clarify the interaction between tectonics and geomorphological processes. It covers a range of image processing methods used to interpret the lineaments within the mountain body unambiguously; special filtering techniques are applied to map the most important lineaments. Chapter 4 presents an attempt to extract the drainage network automatically from processed SRTM satellite data. It discusses in detail how far the quality of small-scale SRTM data is comparable with large-scale topographic maps for these processing steps.
Furthermore, hydrological parameters are derived through a qualitative and quantitative analysis of the discharge regime of individual wadis, and the origin of the drainage systems is interpreted on the basis of geomorphological and geological evidence. Chapter 5 deals with assessing the hazard of episodic wadi floods. The probability of their annual occurrence, and of major floods recurring at intervals of several years, is traced back historically to 1921. The significance of rain-bearing depressions that develop over the Red Sea and can generate runoff is examined using the Inverse Distance Weighted (IDW) interpolation method; further rain-bearing weather situations are analysed with Meteosat infrared imagery. The period 1990-1997, during which heavy rainfall triggered wadi floods, is examined in detail. Flood events and flood levels are determined from hydrographic (gauge) data, and the land use and settlement structure within each wadi catchment are also taken into account. Chapter 6 deals with the different coastal forms on the western side of the Red Sea, such as erosional forms, depositional forms, and submerged forms. The final part addresses the stratigraphy and dating of submarine terraces on coral reefs and compares them with similar terraces on the Egyptian Red Sea coast west and east of the Sinai Peninsula.
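The IDW method used in Chapter 5 estimates a value at an unsampled point as a weighted average of nearby observations, with weights falling off as an inverse power of distance. A minimal sketch (the station coordinates and rainfall values are illustrative, not the study's data):

```python
import math

def idw(points, target, power=2):
    """Inverse Distance Weighted interpolation: estimate the value at `target`
    as a distance-weighted average of observed values.
    points: list of ((x, y), value); target: (x, y)."""
    num = den = 0.0
    for (x, y), v in points:
        d = math.hypot(x - target[0], y - target[1])
        if d == 0:
            return v  # target coincides with a station: return its value
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den

# Hypothetical rain-gauge readings (mm) around a wadi catchment:
stations = [((0, 0), 10.0), ((4, 0), 30.0)]
print(idw(stations, (2, 0)))  # midpoint, equal weights -> 20.0
```

Raising `power` makes the estimate more local, since distant stations are downweighted more strongly.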
Abstract:
The need to effectively manage the documentation covering the entire production process, from the concept phase right through to market release, is a key issue in the creation of a successful and highly competitive product. For almost forty years the most commonly used strategies to achieve this have followed Product Lifecycle Management (PLM) guidelines. Translated into information management systems at the end of the '90s, this methodology is now widely used by companies operating all over the world in many different sectors. PLM systems and editor programs are the two principal types of software applications used by companies for their process automation. Editor programs allow information related to the production chain to be stored in documents, while the PLM system stores and shares this information so that it can be used within the company and made available to partners. Various software tools that capture documents and information and store them automatically in the PLM system have been developed in recent years. One of them is the "DirectPLM" application, developed by the Italian company "Focus PLM" and designed to ensure interoperability between many editors and the Aras Innovator PLM system. In this dissertation we present "DirectPLM2", a new version of the DirectPLM application, designed and developed as a prototype during an internship at Focus PLM. Its new implementation separates the abstract business logic from the implementation of the actual commands, which was previously strongly dependent on Aras Innovator. Thanks to this new design, Focus PLM can easily develop different versions of DirectPLM2, each one tailored to a specific PLM system: the company can focus its development effort on the specific set of software components that provide the specialized functions interacting with that particular PLM system.
This allows a shorter time-to-market and gives the company a significant competitive advantage.
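The separation the abstract describes, business logic on one side and PLM-specific commands on the other, is the classic adapter pattern. A minimal sketch (all class and method names here are illustrative, not Focus PLM's or Aras Innovator's actual API):

```python
from abc import ABC, abstractmethod

class PLMConnector(ABC):
    """Abstract command layer: the business logic talks only to this interface."""
    @abstractmethod
    def store_document(self, name: str, payload: bytes) -> str:
        """Store a document in the backend PLM system and return its identifier."""

class ArasInnovatorConnector(PLMConnector):
    """One concrete backend; another PLM system would get its own subclass."""
    def store_document(self, name, payload):
        # A real connector would call the Aras Innovator services here;
        # this stub just fabricates an identifier for illustration.
        return f"aras:{name}"

def publish(connector: PLMConnector, name: str, payload: bytes) -> str:
    """Backend-agnostic business logic: works with any PLMConnector."""
    return connector.store_document(name, payload)

print(publish(ArasInnovatorConnector(), "drawing.pdf", b"..."))  # aras:drawing.pdf
```

Supporting a new PLM system then means writing one new `PLMConnector` subclass, while `publish` and the rest of the business logic stay untouched, which is exactly the development-effort saving claimed above.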
Abstract:
A post-classification change detection technique based on a hybrid (unsupervised and supervised) classification approach was applied to Landsat Thematic Mapper (TM), Landsat Enhanced Thematic Mapper Plus (ETM+), and ASTER images acquired in 1987, 2000, and 2004 respectively to map land use/cover changes in the Pic Macaya National Park in the southern region of Haiti. Each image was classified individually into six land use/cover classes: built-up, agriculture, herbaceous, open pine forest, mixed forest, and barren land, using the unsupervised ISODATA and maximum likelihood supervised classifiers with the aid of ground truth data collected in the field. Ground truth information, collected in the field in December 2007 as equalized stratified random points that were visually interpreted, was used to assess the accuracy of the classification results. The overall accuracy of the land classification was 82% for the 1987 image, 82% for 2000, and 87% for 2004. A post-classification change detection technique was then used to produce change images for 1987 to 2000, 1987 to 2004, and 2000 to 2004. Significant changes in land use/cover occurred over the 17-year period. The results showed increases in built-up (from 10% to 17%) and herbaceous (from 5% to 14%) areas between 1987 and 2004. The increase in herbaceous cover was mostly caused by the abandonment of exhausted agricultural land. At the same time, open pine forest and mixed forest lost 75% and 83% of their area to other land use/cover types: open pine forest (from 20% to 14%) and mixed forest (from 18% to 12%) were transformed into agriculture or barren land. This study illustrated the continuing deforestation, land degradation, and soil erosion in the region, which in turn is leading to a decrease in vegetative cover.
The study also showed the value of Remote Sensing (RS) and Geographic Information System (GIS) technologies for estimating land use/cover changes in a timely manner and evaluating their causes in order to design an ecologically based management plan for the park.
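The accuracy assessment described above, comparing classified labels against visually interpreted ground-truth points, reduces to the overall-accuracy statistic: the fraction of reference points whose classified label matches. A minimal sketch (the sample labels are illustrative, not the study's data):

```python
def overall_accuracy(reference, classified):
    """Overall accuracy: fraction of ground-truth points whose classified
    label matches the reference label (the diagonal of the confusion matrix
    divided by the total number of points)."""
    if len(reference) != len(classified):
        raise ValueError("label lists must be the same length")
    hits = sum(r == c for r, c in zip(reference, classified))
    return hits / len(reference)

# Illustrative ground-truth sample of five points:
ref = ["forest", "forest", "built-up", "agriculture", "herbaceous"]
cls = ["forest", "built-up", "built-up", "agriculture", "herbaceous"]
print(overall_accuracy(ref, cls))  # 4 of 5 correct -> 0.8
```

The 82%/82%/87% figures reported for the three images are this statistic computed over the equalized stratified random points of each classification.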
Abstract:
The notion of outsourcing – making arrangements with an external entity for the provision of goods or services to supplement or replace internal efforts – has been around for centuries. The outsourcing of information systems (IS) is, however, a much newer concept, but one which has been growing dramatically. This book attempts to synthesize what is known about IS outsourcing by dividing the subject into three interrelated parts: (1) Traditional Information Technology Outsourcing, (2) Information Technology Offshoring, and (3) Business Process Outsourcing. The book should be of interest to academics and students in the field of Information Systems as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.
Abstract:
The practice of information systems (IS) outsourcing is widely established among organizations. Nonetheless, evidence suggests that organizations differ considerably in the extent to which they deploy IS outsourcing, and this variation has motivated research into the determinants of the IS outsourcing decision. Most of this research assumes that a decision on the outsourcing of a particular IS function is made independently of other IS functions. This modular view ignores the systemic nature of the IS function, which posits that IS effectiveness depends on how well the various IS functions work together. This study proposes that systemic influences are important criteria in evaluating the outsourcing option, and further proposes that the recognition of systemic influences in outsourcing decisions is culturally sensitive. Specifically, we provide evidence that systemic effects are factored into the IS outsourcing decision differently in more individualist cultures than in collectivist ones. The results of our survey of United States and German firms indicate that perceived in-house advantages in the systemic impact of an IS function are indeed a significant determinant of IS outsourcing in a moderately individualist country (i.e., Germany), but insignificant in a strongly individualist country (i.e., the United States). The country differences are even stronger with regard to perceived in-house advantages in the systemic view of IS professionals; in fact, the direction of this impact is reversed in the United States sample. Other IS outsourcing determinants included as controls, such as cost efficiency, did not show significant country differences.
Abstract:
Currently more than half of Electronic Health Record (EHR) projects fail. Most of these failures are due not to flawed technology, but to the lack of systematic consideration of human issues. Among the barriers to EHR adoption, function mismatching among users, activities, and systems is a major area that has not been systematically addressed from a human-centered perspective. A theoretical framework called the Functional Framework was developed for identifying and reducing functional discrepancies among users, activities, and systems. The Functional Framework is composed of three models – the User Model, the Designer Model, and the Activity Model. The User Model was developed by conducting a survey (N = 32) that identified the functions needed and desired from the user's perspective. The Designer Model was developed by conducting a systematic review of an Electronic Dental Record (EDR) and its functions. The Activity Model was developed using an ethnographic method called shadowing, in which EDR users (5 dentists, 5 dental assistants, 5 administrative personnel) were quietly followed and observed during their activities. These three models were combined to form a unified model, from which the work domain ontology was developed by asking users to rate the 190 functions in a survey along the dimensions of frequency and criticality. The functional discrepancies, as indicated by the regions of the Venn diagrams formed by the three models, were consistent with the survey results, especially with user satisfaction; the survey for the Functional Framework also indicated the preference of one system over the other (R = 0.895). The results of this project showed that the Functional Framework provides a systematic method for identifying, evaluating, and reducing functional discrepancies among users, systems, and activities. Limitations and generalizability of the Functional Framework are discussed.
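The Venn-diagram regions the abstract relies on are plain set operations over the three models' function sets: the region shared by all three is a match, while a function present in only one model flags a discrepancy. A minimal sketch (the function labels are hypothetical, not the study's 190 actual functions):

```python
def discrepancy_regions(user, designer, activity):
    """Partition functions into selected Venn regions of the three models:
    the triple intersection is a functional match; the single-model regions
    mark functions only one perspective recognizes (potential mismatches)."""
    return {
        "all_three": user & designer & activity,
        "user_only": user - designer - activity,
        "designer_only": designer - user - activity,
        "activity_only": activity - user - designer,
    }

# Hypothetical function labels for the three models:
regions = discrepancy_regions(
    user={"charting", "billing", "imaging"},
    designer={"charting", "billing", "audit"},
    activity={"charting", "imaging"},
)
print(regions["all_three"])      # {'charting'}
print(regions["designer_only"])  # {'audit'}: built but neither wanted nor used
```

A function in `designer_only`, for example, is implemented in the system but neither requested by users nor observed in their activities, which is exactly the kind of mismatch the framework is meant to surface.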