875 results for Data and Information Technology
Abstract:
The continental margin of southeast Brazil is elevated. Onshore Tertiary basins and Late Cretaceous/Paleogene intrusions are good evidence for post-breakup tectono-magmatic activity. To constrain the impact of post-rift reactivation on the geological history of the area, we carried out a new thermochronological study. Apatite fission track ages range from 60.7 ± 1.9 Ma to 129.3 ± 4.3 Ma, mean track lengths from 11.41 ± 0.23 µm to 14.31 ± 0.24 µm, and a subset of the (U-Th)/He ages ranges from 45.1 ± 1.5 Ma to 122.4 ± 2.5 Ma. Results of inverse thermal history modeling generally support the conclusions of an earlier study for a Late Cretaceous phase of cooling. Around the onshore Taubaté Basin, for a limited number of samples, the first detectable period of cooling occurred during the Early Tertiary. The inferred thermal histories for many samples also imply subsequent reheating followed by Neogene cooling. Given the uncertainty of the inversion results, we performed deterministic forward modeling to assess the range of possible thermal histories for this Tertiary part of the record. The evidence for reheating appears robust around the Taubaté Basin, but elsewhere the data cannot discriminate between this and a less complex thermal history. However, forward modeling results and geological information support the conclusion that the whole area underwent cooling during the Neogene. The synchronicity of these cooling phases with Andean tectonics and with those in NE Brazil leads us to infer a plate-wide compressional stress that reactivated inherited structures. The present-day topographic relief of the margin reflects a contribution from post-breakup reactivation and uplift.
Abstract:
Ontology design and population (core aspects of semantic technologies) have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a great deal of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists of modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, thus speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as named entity recognition, entity resolution, and taxonomy and relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20] or, more recently, Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7]. In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
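To make the kind of pipeline described above concrete, here is a minimal sketch (not the system developed in the thesis) of two of the linguistic steps mentioned: named entity recognition and the extraction of candidate relations between entities. It assumes spaCy with its en_core_web_sm model installed; the subject-verb-object heuristic is a deliberately naive stand-in for the deep parsing techniques the work investigates.

```python
# Minimal sketch of an ontology-population step: NER plus a naive
# subject-verb-object relation heuristic. Assumes spaCy and the
# "en_core_web_sm" model are installed; illustrative only.
import spacy

nlp = spacy.load("en_core_web_sm")

def extract_triples(text):
    """Yield (subject, relation, object) candidates from raw text."""
    doc = nlp(text)
    for sent in doc.sents:
        for token in sent:
            if token.pos_ == "VERB":
                subjects = [c for c in token.children
                            if c.dep_ in ("nsubj", "nsubjpass")]
                objects = [c for c in token.children
                           if c.dep_ in ("dobj", "attr")]
                for s in subjects:
                    for o in objects:
                        yield (s.text, token.lemma_, o.text)

text = "Tim Berners-Lee invented the World Wide Web at CERN."
print(list(extract_triples(text)))   # e.g. [('Berners-Lee', 'invent', 'Web')]
print([(e.text, e.label_) for e in nlp(text).ents])  # recognized entities
```

In a real ontology-learning setting the extracted triples would then be mapped onto schema classes and properties and serialized as linked data; this sketch only shows the raw extraction step.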
Abstract:
Throughout the twentieth century, statistical methods increasingly became part of experimental research. In particular, statistics made quantification processes meaningful in the soft sciences, which had traditionally relied on activities such as collecting and describing diversity rather than timing variation. The thesis explores this change in relation to agriculture and biology, focusing on analysis of variance and experimental design, the statistical methods developed by the mathematician and geneticist Ronald Aylmer Fisher during the 1920s. The role that Fisher's methods acquired as tools of scientific research, side by side with the laboratory equipment and the field practices adopted by research workers, is here investigated bottom-up, beginning with the computing instruments and the information technologies that were the tools of the trade for statisticians. Four case studies show from several perspectives the interaction of statistics, computing, and information technologies, giving on the one hand an overview of the main tools adopted in the period (mechanical calculators, statistical tables, punched and index cards, standardised forms, digital computers) and on the other pointing out how these tools complemented each other and were instrumental to the development and dissemination of analysis of variance and experimental design. The period considered is the half-century from the early 1920s to the late 1960s; the institutions investigated are Rothamsted Experimental Station and the Galton Laboratory; and the statisticians examined are Ronald Fisher and Frank Yates.
Abstract:
The atmosphere globally influences the movement of heat and humidity between the continents and thus significantly affects climate variability. Information about atmospheric circulation is of major importance for the understanding of different climatic conditions. Dust deposits from maar lakes and dry maars of the Eifel Volcanic Field (Germany) are therefore used as proxy data for the reconstruction of past aeolian dynamics.

In this thesis, two sediment cores from the Eifel region are examined: core SM3 from Lake Schalkenmehren and core DE3 from the Dehner dry maar. Both cores contain the tephra of the Laacher See eruption, which is dated to 12,900 years before present. Taken together, the cores cover the last 60,000 years: SM3 the Holocene, and DE3 marine isotope stages MIS-3 and MIS-2, respectively. The frequencies of glacial dust storm events and their paleo wind directions are detected by high-resolution grain size and provenance analysis of the lake sediments. To this end, two different methods are applied: geochemical measurements of the sediment using µXRF scanning, and the particle analysis method RADIUS (rapid particle analysis of digital images by ultra-high-resolution scanning of thin sections). It is shown that single dust layers in the lake sediment are characterized by an increased content of aeolian-transported carbonate particles. The limestone-bearing Eifel North-South zone is the most likely source of the carbonate-rich aeolian dust in the lake sediments of the Dehner dry maar. The dry maar is located on the western side of the Eifel North-South zone; carbonate-rich aeolian sediment is therefore most likely transported towards the Dehner dry maar by easterly winds. A methodology is developed which restricts detection to the aeolian-transported carbonate particles in the sediment: the RADIUS-carbonate module.

In summary, during marine isotope stage MIS-3 the storm frequency and the east wind frequency were both increased in comparison to MIS-2. These results lead to the suggestion that atmospheric circulation was affected by more turbulent conditions during MIS-3, in comparison to the more stable atmospheric circulation during the full glacial conditions of MIS-2. The results of the investigations of the dust records are finally evaluated in relation to a study of atmospheric general circulation models for a comprehensive interpretation. Here, AGCM experiments (ECHAM3 and ECHAM4) with different prescribed SST patterns are used to develop a synoptic interpretation of long-persisting east wind conditions and of east wind storm events, which are suggested to lead to an enhanced accumulation of sediment transported by easterly winds to the proxy site of the Dehner dry maar.

The basic observations made on the proxy record are also reflected in the 10 m wind vectors of the different model experiments under glacial conditions with different prescribed sea surface temperature patterns. Furthermore, the analysis of long-persisting east wind conditions in the AGCM data shows a stronger seasonality under glacial conditions: all the experiments are characterized by an increase in the relative importance of the LEWIC during spring and summer. The different glacial experiments consistently show a shift of a long-lasting high from over the Baltic Sea towards the northwest, directly above the Scandinavian Ice Sheet, together with a contemporaneously enhanced westerly circulation over the North Atlantic.

This thesis is a comprehensive analysis of atmospheric circulation patterns during the last glacial period. It has been possible to reconstruct important elements of the glacial paleoclimate in Central Europe. While the proxy data from the sediment cores yield only a binary signal of wind direction changes (east versus west wind), a synoptic interpretation using atmospheric circulation models shows a possible distribution of high and low pressure areas, and thus the direction and strength of the wind fields capable of transporting dust. In conclusion, combining numerical models, which enhance the understanding of processes in the climate system, with proxy data from the environmental record is the key to a comprehensive approach to paleoclimatic reconstruction.
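As an illustration of the detection principle described above (single dust layers marked by an increased content of carbonate particles), the following sketch flags positions in a core record where a carbonate proxy, such as µXRF Ca counts, rises well above a running baseline. The series, window, and threshold are assumptions for illustration; the thesis itself uses the RADIUS-carbonate module on thin-section images, not this statistical shortcut.

```python
# Illustrative dust-layer detector: flag depths where a carbonate proxy
# (e.g. XRF Ca counts) rises well above a moving-median baseline.
# All parameters are assumptions for illustration.
import numpy as np

def detect_dust_layers(ca_counts, window=51, k=3.0):
    """Return indices where counts exceed baseline + k * robust scatter."""
    n = len(ca_counts)
    half = window // 2
    baseline = np.array([np.median(ca_counts[max(0, i - half):i + half + 1])
                         for i in range(n)])
    resid = ca_counts - baseline
    mad = np.median(np.abs(resid - np.median(resid)))  # robust scatter
    return np.where(resid > k * 1.4826 * mad)[0]

# Synthetic demo: background counts with three carbonate-rich "dust layers"
rng = np.random.default_rng(0)
ca = rng.normal(100, 5, 1000)
ca[[200, 550, 901]] += 60
print(detect_dust_layers(ca))  # reports the planted spikes near 200, 550, 901
```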
Abstract:
During the past 20 years or so, more has become known about the properties of khat and its pharmacological, physiological, and psychological effects on humans. At the same time, however, its reputation for social and recreational use in traditional contexts has hindered the dissemination of knowledge about its detrimental effects in terms of mortality. This paper focuses on this particular deficit and adds to the knowledge base by reviewing the scant literature that does exist on mortality associated with the trade and use of khat. We sought all peer-reviewed papers relating to deaths associated with khat. From an initial list of 111, we identified 15 items meeting our selection criteria; examination of these revealed 61 further relevant items. These were supplemented with published reports, newspaper and other media reports. A conceptual framework was then developed for classifying mortality associated with each stage of the plant's journey, from its cultivation, transportation, and consumption to its effects on the human body. The model is demonstrated with concrete examples drawn from the above sources. These highlight a number of issues for which more substantive statistical data are needed, including population-based studies of the physiological and psychological determinants of khat-related fatalities. Khat-consuming communities, and the health professionals charged with their care, should be made more aware of the physiological and psychological effects of khat, together with the risks for morbidity and mortality associated with its use. There is also a need for information to be collected at international and national levels on other causes of death associated with khat cultivation, transportation, and trade. Both of these dimensions need to be understood.
Abstract:
There is a consensus in China that industrialization, urbanization, globalization and information technology will enhance China's urban competitiveness. We have developed a methodology for the analysis of urban competitiveness that we have applied to China's 25 principal cities during three periods from 1990 through 2009. Our model uses data for 12 variables, to which we apply appropriate statistical techniques. We are able to examine the competitiveness of inland cities and those on the coast, how this has changed during the two decades of the study, the competitiveness of Mega Cities and of administrative centres, and the importance of each variable in explaining urban competitiveness and its development over time. This analysis will be of benefit to Chinese planners as they seek to enhance the competitiveness of China and its major cities in the future.
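The abstract does not specify which statistical techniques the model applies, so the following is only a hypothetical illustration of one common approach to such an analysis: standardizing the 12 indicator variables and scoring the 25 cities on the first principal component as a composite competitiveness index. All data and parameter choices below are assumptions, not the authors' method.

```python
# Hypothetical composite competitiveness index from 12 city indicators:
# standardize, run PCA, and score cities on the first component.
# Data, variable count, and method are illustrative assumptions only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(25, 12))        # 25 cities x 12 indicators (synthetic)

X_std = StandardScaler().fit_transform(X)
pca = PCA(n_components=1)
scores = pca.fit_transform(X_std).ravel()   # first principal component

ranking = np.argsort(-scores)        # city indices, most to least competitive
print("explained variance:", pca.explained_variance_ratio_[0])
print("top five cities (by index):", ranking[:5])
```

With the component loadings in hand, one can also read off the importance of each variable for the index, which is the kind of per-variable analysis the abstract describes.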
Abstract:
The single-electron transistor (SET) is one of the best candidates for future nanoelectronic circuits because of its ultra-low power consumption, small size, and unique functionality. SET devices operate on the principle of Coulomb blockade, which is more prominent at dimensions of a few nanometers. Typically, a SET device consists of two capacitively coupled ultra-small tunnel junctions with a nano-island between them. In order to observe Coulomb blockade effects in a SET device, the charging energy of the device has to be greater than the thermal energy. This condition limits the operation of most existing SET devices to cryogenic temperatures. Room-temperature operation of SET devices requires sub-10 nm nano-islands, due to the inverse dependence of the charging energy on the radius of the conducting nano-island. Fabrication of sub-10 nm structures using lithography processes is still a technological challenge. In the present investigation, Focused Ion Beam (FIB) based etch and deposition technology is used to fabricate single-electron transistor devices operating at room temperature. The SET device incorporates an array of tungsten nano-islands with an average diameter of 8 nm. The fabricated devices are characterized at room temperature, and clear Coulomb blockade and Coulomb oscillations are observed. An improvement in the resolution limitation of the FIB etching process is demonstrated by optimizing the thickness of the active layer. SET devices with structural and topological variations are developed to explore their impact on device behavior. The threshold voltage was reduced to ~500 mV by reducing the source-drain gap of the device to 17 nm. Vertical source and drain terminals are fabricated to realize a single-dot-based SET device. A unique process flow is developed to fabricate Si-dot-based SET devices for better gate controllability of the device characteristics. The device parameters of the fabricated devices are extracted using a conductance model. Finally, the characteristics of these devices are validated against simulated data from theoretical modeling.
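The inverse dependence of the charging energy on island radius can be made concrete with a back-of-the-envelope check. Treating the nano-island as an isolated conducting sphere with self-capacitance C = 4*pi*eps0*r (a simplifying assumption; real tunnel-junction capacitances are larger), the charging energy E_C = e^2/2C for an 8 nm island comes out at roughly seven times the thermal energy at 300 K, which is the Coulomb blockade condition the abstract refers to.

```python
# Back-of-the-envelope Coulomb-blockade check for an 8 nm island.
# Assumes an isolated conducting sphere (C = 4*pi*eps0*r); real junction
# capacitances are larger, so this is an optimistic upper bound on E_C.
import math

e = 1.602176634e-19      # elementary charge, C
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
kB = 1.380649e-23        # Boltzmann constant, J/K

def charging_energy_eV(diameter_nm):
    r = diameter_nm * 1e-9 / 2
    C = 4 * math.pi * eps0 * r          # self-capacitance of a sphere
    return e**2 / (2 * C) / e           # E_C = e^2 / 2C, in eV

kT = kB * 300 / e                       # thermal energy at 300 K, in eV
for d in (8, 20, 100):
    Ec = charging_energy_eV(d)
    print(f"{d:>4} nm island: E_C = {Ec:.3f} eV, E_C / kT(300 K) = {Ec/kT:.1f}")
# An 8 nm island gives E_C ~ 0.18 eV, about 7x the room-temperature thermal
# energy (~26 meV), which is why sub-10 nm islands enable 300 K operation.
```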
Abstract:
Earth observations (EO) represent a growing and valuable resource for many scientific, research, and practical applications carried out by users around the world. Access to EO data is indispensable for the success of some applications and activities, such as climate change research or emergency response. However, EO data, or products made from them, are often (or are claimed to be) subject to intellectual property law protection and are licensed under specific conditions regarding access and use. Restrictive conditions on data use can be prohibitive for further work with the data. The Global Earth Observation System of Systems (GEOSS) is an initiative led by the Group on Earth Observations (GEO) with the aim of providing coordinated, comprehensive, and sustained EO and information for making informed decisions in various areas beneficial to societies, their functioning, and development. It seeks to share data with users world-wide with the fewest possible restrictions on their use by implementing the GEOSS Data Sharing Principles adopted by GEO. The Principles proclaim full and open exchange of data shared within GEOSS, while recognising relevant international instruments and national policies and legislation through which restrictions on the use of data may be imposed. The paper focuses on the issue of the legal interoperability of data that are shared with varying restrictions on use, with the aim of exploring the options for making data interoperable. The main question it addresses is whether the public domain or its equivalents represent the best mechanism to ensure the legal interoperability of data. To this end, the paper analyses legal protection regimes and their norms applicable to EO data. Based on the findings, it highlights the existing public law statutory, regulatory, and policy approaches, as well as private law instruments, such as waivers, licenses, and contracts, that may be used to place datasets in the public domain, or otherwise make them publicly available for use and re-use without restrictions. It uses GEOSS and its particular characteristics as a system to identify ways to reconcile the vast possibilities it provides through the sharing of data from various sources and jurisdictions on the one hand with the restrictions on the use of the shared resources on the other. On a more general level, the paper seeks to draw attention to the obstacles, and potential regulatory solutions, for sharing factual or research data for purposes that go beyond research and education.
Abstract:
Ancient Lake Ohrid is a steep-sided, oligotrophic karst lake that was most likely formed tectonically during the Pliocene and is often referred to as a hotspot of endemic biodiversity. This study aims at tracing significant lake-level fluctuations at Lake Ohrid using high-resolution acoustic data in combination with lithological, geochemical, and chronological information from two sediment cores recovered from sub-aquatic terrace levels at ca. 32 and 60 m water depth. According to our data, significant lake-level fluctuations with prominent lowstands of ca. 60 and 35 m below the present water level occurred during Marine Isotope Stage (MIS) 6 and MIS 5, respectively. The effect of these lowstands on biodiversity in most coastal parts of the lake was negligible, due to only small changes in lake surface area, coastline, and habitat. In contrast, biodiversity in shallower areas was more severely affected, due to the disconnection of today's sublacustrine springs from the main water body. Multichannel seismic data from deeper parts of the lake clearly image several clinoform structures stacked on top of each other. These stacked clinoforms indicate significantly lower lake levels prior to MIS 6 and a stepwise rise of the water level, with intermittent stillstands, since the lake's existence as a water-filled body, which might have caused enhanced expansion of endemic species within Lake Ohrid.
Abstract:
Purpose: To develop an interdisciplinary course to teach dental students about evidence-based dentistry, development of search strategies, critical appraisal of literature, and dental informatics. [See PDF for complete abstract]
Abstract:
The Business and Information Technologies (BIT) project strives to reveal new insights into how modern IT impacts organizational structures and business practices, using empirical methods. Due to its international scope, it allows for inter-country comparison of empirical results. Germany, represented by the European School of Management and Technology (ESMT) and the Institute of Information Systems at Humboldt-Universität zu Berlin, joined the BIT project in 2006. This report presents the results of the first survey conducted in Germany during November-December 2006. The key results are as follows:
• The most widely adopted technologies and systems in Germany are websites, wireless hardware and software, groupware/productivity tools, and enterprise resource planning (ERP) systems. The biggest potential for growth exists for collaboration and portal tools, content management systems, business process modelling, and business intelligence applications. A number of technological solutions have not yet been adopted by many organizations but also bear some potential, in particular identity management solutions, Radio Frequency Identification (RFID), biometrics, and third-party authentication and verification.
• IT security remains at the top of the agenda for most enterprises: budget spending has been increasing over the last three years.
• The workplace and work requirements are changing. IT is used to monitor employees' performance in Germany, but less heavily than in the United States (Karmarkar and Mangal, 2007). The demand for IT skills is increasing at all corporate levels. Executives are asking for more and better-structured information, and this, in turn, triggers the appearance of new decision-making tools and online technologies on the market.
• The internal reorganization of companies in Germany is underway: organizations are becoming flatter, even though the trend is not as pronounced as in the United States (Karmarkar and Mangal, 2007), and the geographical scope of their operations is increasing. Modern IT plays an important role in enabling this development; telecommuting, teleconferencing, and other web-based collaboration formats, for example, are becoming increasingly popular in the corporate context.
• The degree to which outsourcing is being pursued is quite limited, with little change expected. IT services, payroll, and market research are the most widely outsourced business functions. This corresponds to the results from other countries.
• Up to now, the adoption of e-business technologies has had a rather limited effect on marketing functions. Companies tend to extract synergies from traditional printed media and online advertising.
• The adoption of e-business has not yet had a major impact on marketing capabilities and strategy. Traditional methods of customer segmentation still dominate. The corporate identity of most organizations does not change significantly when going online.
• Online sales channels are mainly viewed as a complement to traditional means of distribution.
• Technology adoption has caused production and organizational costs to decrease. However, the costs of technology acquisition and maintenance, as well as consultancy and internal communication costs, have increased.
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use, and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor impacting the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure on conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.
Abstract:
Many technological developments of the past two decades come with the promise of greater IT flexibility, i.e. a greater capacity to adapt IT. These technologies are increasingly used to improve organizational routines that are not affected by large, hard-to-change IT such as ERP. Yet most findings on the interaction of routines and IT stem from contexts where IT is hard to change. Our research explores how routines and IT co-evolve when IT is flexible. We review the literature on routines to suggest that IT may act as a boundary object that mediates the learning process unfolding between the ostensive and the performative aspects of the routine. Although prior work has concluded from such conceptualizations that IT stabilizes routines, we qualify that flexible IT can also stimulate change, because it enables learning in short feedback cycles. We suggest, however, that such change might not always materialize, because it is contingent on governance choices and technical knowledge. We describe the case-study method used to explore how routines and flexible IT co-evolve and how governance and technical knowledge influence this process. We expect to contribute towards a stronger theory of routines and to develop recommendations for the effective implementation of flexible IT in loosely coupled routines.
Abstract:
BACKGROUND: Record linkage of existing individual health care data is an efficient way to answer important epidemiological research questions. However, the reuse of individual health-related data faces several problems: either a unique personal identifier, like a social security number, is not available, or non-unique person-identifiable information, like names, is privacy protected and cannot be accessed. A solution for protecting privacy in probabilistic record linkage is to encrypt this sensitive information. Unfortunately, the encrypted hash codes of two names differ completely even if the plain names differ only by a single character, so standard encryption methods cannot be applied. To overcome these challenges, we developed the Privacy Preserving Probabilistic Record Linkage (P3RL) method.

METHODS: The P3RL method applies a three-party protocol, with two sites collecting individual data and an independent trusted linkage center as the third partner. Our method consists of three main steps: pre-processing, encryption, and probabilistic record linkage. Data pre-processing and encryption are done at the sites by local personnel. To guarantee similar quality and format of variables and an identical encryption procedure at each site, the linkage center generates semi-automated pre-processing and encryption templates. To retrieve the information (i.e. the data structure) needed for the creation of these templates without ever accessing plain person-identifiable information, we introduced a novel method of data masking. Sensitive string variables are encrypted using Bloom filters, which enables the calculation of similarity coefficients. For date variables, we developed special encryption procedures to handle the most common date errors. The linkage center then performs probabilistic record linkage with the encrypted person-identifiable information and the plain non-sensitive variables.

RESULTS: In this paper we describe step by step how to link existing health-related data using encryption methods to preserve the privacy of the persons in the study.

CONCLUSION: Privacy Preserving Probabilistic Record Linkage expands record linkage facilities to settings where a unique identifier is unavailable and/or regulations restrict access to the non-unique person-identifiable information needed to link existing health-related data sets. Automated pre-processing and encryption fully protect sensitive information, ensuring participant confidentiality. The method is suitable not just for epidemiological research but for any setting with similar challenges.
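To illustrate the Bloom filter step (a sketch of the general technique, not necessarily the exact P3RL implementation): each name is split into bigrams, every bigram is hashed into a fixed-length bit set with several keyed hash functions, and the similarity of two encodings is computed with the Dice coefficient, so similar names yield similar encodings without revealing the plain text. Filter length, hash count, and keys below are illustrative assumptions.

```python
# Sketch of privacy-preserving name encoding with Bloom filters.
# Parameters (filter length, hash count, bigrams, HMAC keys) are
# illustrative; production systems tune them and keep the keys secret.
import hashlib
import hmac

M, K = 1000, 20                     # filter length in bits, hashes per q-gram
KEYS = [f"secret-{i}".encode() for i in range(K)]   # hypothetical shared keys

def bigrams(name):
    s = f"_{name.lower().strip()}_"   # pad so word edges contribute bigrams
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bloom_encode(name):
    """Encode a name as the set of Bloom filter bit positions it sets."""
    bits = set()
    for gram in bigrams(name):
        for key in KEYS:
            h = hmac.new(key, gram.encode(), hashlib.sha256).digest()
            bits.add(int.from_bytes(h, "big") % M)
    return bits

def dice(a, b):
    """Dice similarity of two Bloom encodings (1.0 = identical)."""
    return 2 * len(a & b) / (len(a) + len(b))

print(dice(bloom_encode("Smith"), bloom_encode("Smyth")))   # high: likely match
print(dice(bloom_encode("Smith"), bloom_encode("Jones")))   # low: non-match
```

Because hashing preserves bigram overlap, a typo changes only a few bit positions rather than the whole encoding, which is exactly the property that makes probabilistic linkage on encrypted identifiers possible.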
Abstract:
This book attempts to synthesize research that contributes to a better understanding of how to reach sustainable business value through information systems (IS) outsourcing. Important topics in this realm are how IS outsourcing can contribute to innovation, how it can be dynamically governed, how to cope with its increasing complexity through multi-vendor arrangements, how service quality standards can be met, how corporate social responsibility can be upheld, and how to cope with increasing demands of internationalization and new sourcing models, such as crowdsourcing and platform-based cooperation. These issues are viewed from either the client or vendor perspective, or both. The book should be of interest to all academics and students in the fields of Information Systems, Management, and Organization as well as corporate executives and professionals who seek a more profound analysis and understanding of the underlying factors and mechanisms of outsourcing.