983 results for Information Structures
Abstract:
The Business and Information Technologies (BIT) project uses empirical methods to reveal new insights into how modern IT affects organizational structures and business practices. Its international scope allows for inter-country comparison of empirical results. Germany, represented by the European School of Management and Technology (ESMT) and the Institute of Information Systems at Humboldt-Universität zu Berlin, joined the BIT project in 2006. This report presents the results of the first survey conducted in Germany during November–December 2006. The key results are as follows:
• The most widely adopted technologies and systems in Germany are websites, wireless hardware and software, groupware/productivity tools, and enterprise resource planning (ERP) systems. The greatest growth potential lies in collaboration and portal tools, content management systems, business process modelling, and business intelligence applications. A number of technological solutions have not yet been adopted by many organizations but also show promise, in particular identity management solutions, Radio Frequency Identification (RFID), biometrics, and third-party authentication and verification.
• IT security remains at the top of the agenda for most enterprises: security budgets have risen over the past three years.
• The workplace and work requirements are changing. IT is used to monitor employees' performance in Germany, though less heavily than in the United States (Karmarkar and Mangal, 2007). The demand for IT skills is increasing at all corporate levels. Executives are asking for more and better-structured information, which in turn triggers the appearance of new decision-making tools and online technologies on the market.
• The internal reorganization of companies in Germany is underway: organizations are becoming flatter, even though the trend is not as pronounced as in the United States (Karmarkar and Mangal, 2007), and the geographical scope of their operations is increasing. Modern IT plays an important role in enabling this development; telecommuting, teleconferencing, and other web-based collaboration formats are becoming increasingly popular in the corporate context.
• Outsourcing is pursued to a rather limited degree, with little change expected. IT services, payroll, and market research are the most widely outsourced business functions. This corresponds to results from other countries.
• So far, the adoption of e-business technologies has had a rather limited effect on marketing functions. Companies tend to extract synergies from traditional printed media and online advertising.
• E-business adoption has not yet had a major impact on marketing capabilities and strategy. Traditional methods of customer segmentation still dominate, and the corporate identity of most organizations does not change significantly when going online.
• Online sales channels are mainly viewed as a complement to traditional distribution channels.
• Technology adoption has reduced production and organizational costs. However, the costs of technology acquisition and maintenance, as well as consultancy and internal communication costs, have increased.
Abstract:
The importance of long-term historical information derived from paleoecological studies has long been recognized as a fundamental aspect of effective conservation. However, some uncertainty remains regarding the extent to which paleoecology can inform specific issues of high conservation priority, at the scale at which conservation policy decisions are often made. Here we review to what extent the past occurrence of three fundamental aspects of forest conservation can be assessed using paleoecological data, with a focus on northern Europe. These aspects are (1) tree species composition, (2) old/large trees and coarse woody debris, and (3) natural disturbances. We begin by evaluating the types of relevant historical information available from contemporary forests, and then assess common paleoecological techniques, namely dendrochronology and the analysis of pollen, macrofossils, charcoal, fossil insects, and fossil wood. We conclude that whereas contemporary forests can be used to estimate the historical, natural occurrence of several of the aspects addressed here (e.g. old/large trees), paleoecological techniques offer much greater temporal depth: robust quantitative data on tree species composition and fire disturbance, qualitative insights regarding old/large trees and woody debris, but only limited indications of past windstorms and insect outbreaks. We also find that studies of fossil wood and paleoentomology are perhaps the most underutilized sources of information. Not only can paleoentomology provide species-specific information, but it also enables the reconstruction of former environmental conditions that are otherwise unavailable. Despite this potential, most conservation-relevant paleoecological studies focus on describing historical forest conditions in broad terms and at large spatial scales, addressing former climate, land use, and landscape development, often in the absence of a specific conservation context.
In contrast, relatively few studies address the most pressing conservation issues in northern Europe, which often require data on the presence or quantities of dead wood, large trees, or specific tree species at the scale of the stand or reserve. Fewer examples still exist of detailed paleoecological data being used for conservation planning, or for setting operative restoration baselines at local scales. If ecologists and conservation biologists are to benefit fully from the ever-advancing techniques developed by the paleoecological sciences, further integration of these disciplines is desirable.
Abstract:
Kaposi's sarcoma-associated herpesvirus (KSHV) is a recently discovered DNA tumor virus that belongs to the gamma-herpesvirus subfamily. Although numerous studies of KSHV and other herpesviruses have revealed much about their multilayered organization and capsid structure, the herpesvirus capsid assembly and maturation pathway remains poorly understood. Structural variability or irregularity of the capsid's internal scaffolding core, and the lack of adequate tools to study such structures, have presented major hurdles to earlier investigations employing more traditional cryo-electron microscopy (cryoEM) single-particle reconstruction. In this study, we used cryo-electron tomography (cryoET) to obtain 3D reconstructions of individual KSHV capsids, allowing direct visualization of the capsid internal structures and, for the first time, systematic comparison of the scaffolding cores. We show that B-capsids are not a structurally homogeneous group; rather, they represent an ensemble of "B-capsid-like" particles whose inner scaffolding is highly variable, possibly representing different intermediates in KSHV capsid assembly and maturation. This information, taken together with previous observations, has allowed us to propose a detailed pathway of herpesvirus capsid assembly and maturation.
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies, both as an influential factor impacting the regulability of the environment and as a regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and has captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure on conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses the bold effort by Brown and Marsden.
Abstract:
In this paper, we describe dynamic unicast, an approach to increase communication efficiency in opportunistic information-centric networks (ICN). The approach uses broadcast requests to quickly find content, and then dynamically creates unicast links to content sources without the need for neighbor discovery. The links are kept only as long as they deliver content and are quickly removed otherwise. Evaluations in mobile networks show that this approach maintains ICN flexibility to support seamless mobile communication and achieves up to 56.6% shorter transmission times compared to broadcast in the case of multiple concurrent requesters. Moreover, dynamic unicast unburdens listener nodes from processing unwanted content, resulting in lower processing overhead and power consumption at these nodes. The approach can be easily incorporated into existing ICN architectures using only available data structures.
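The link lifecycle described in the abstract (created after a broadcast discovery, kept while it delivers content, removed quickly once it stops) can be sketched as follows. This is an illustrative assumption of how such state might be kept per link; the class name, the miss counter, and the threshold of three misses are not taken from the paper.

```python
class DynamicUnicastLink:
    """Hypothetical sketch of a dynamic-unicast link's lifecycle state.

    A link is created toward a content source discovered via a broadcast
    request, kept alive while it delivers content, and torn down after a
    few consecutive failed requests (threshold is an assumed parameter).
    """

    def __init__(self, source, max_misses=3):
        self.source = source        # content source found via broadcast
        self.max_misses = max_misses
        self.misses = 0             # consecutive requests without content

    def on_content(self):
        """The link delivered content: reset the failure count, keep it."""
        self.misses = 0

    def on_miss(self):
        """The link failed to deliver: count toward its removal."""
        self.misses += 1

    @property
    def expired(self):
        """True once the link should be removed (revert to broadcast)."""
        return self.misses >= self.max_misses
```

A requester would fall back to broadcast requests whenever `expired` becomes true, matching the "quickly removed otherwise" behavior described above.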
Abstract:
High-resolution structural information on optimally preserved bacterial cells can be obtained with cryo-electron microscopy of vitreous sections. With the help of this technique, the existence of a periplasmic space between the plasma membrane and the thick peptidoglycan layer of the gram-positive bacteria Bacillus subtilis and Staphylococcus aureus was recently shown. This raises questions about the mode of polymerization of peptidoglycan. In the present study, we report the structure of the cell envelope of three gram-positive bacteria (B. subtilis, Streptococcus gordonii, and Enterococcus gallinarum). In all three cases, a previously undescribed granular layer adjacent to the plasma membrane is found in the periplasmic space. To better understand how nascent peptidoglycan is incorporated into the mature peptidoglycan, we investigated cellular regions known to represent the sites of cell wall production. Each of these sites possesses a specific structure. We propose a hypothetical model of peptidoglycan polymerization that accommodates these differences: peptidoglycan precursors could be exported from the cytoplasm to the periplasmic space, where they could diffuse until they interact with the interface between the granular layer and the thick peptidoglycan layer, and then polymerize with the mature peptidoglycan. We also report cytoplasmic structures at the E. gallinarum septum that could be interpreted as cytoskeletal elements driving cell division (the FtsZ ring). Although immunoelectron microscopy and fluorescence microscopy studies have demonstrated the septal and cytoplasmic localization of FtsZ, direct visualization of in situ FtsZ filaments has not been achieved in any electron microscopy study of fixed and dehydrated bacteria.
Abstract:
Retroviruses are RNA viruses that replicate through a double-stranded DNA intermediate. The viral enzyme reverse transcriptase copies the retroviral genomic RNA into this DNA intermediate through the process of reverse transcription. Many variables can affect the fidelity of reverse transcriptase during reverse transcription, including specific sequences within the retroviral genome. Previous studies have observed that multiple cloning sites (MCS) and sequences predicted to form stable hairpin structures are hotspots for deletion during retroviral replication. The studies described in this dissertation were performed to elucidate the variables that affect the stability of MCS and hairpin structures in retroviral vectors. Two series of retroviral vectors were constructed and characterized in these studies. Spleen necrosis virus-based vectors were constructed containing separate MCS insertions of varying length, orientation, and symmetry. The only MCS that was a hotspot for deletion formed a stable hairpin structure. Upon more detailed study, the MCS previously reported as a hotspot for deletion was found to contain a tandem linker insertion that formed a hairpin structure. Murine leukemia virus-based vectors were constructed containing separate sequence insertions with either inverted-repeat symmetry (122IR), which could form a hairpin structure, or little symmetry (122c), which would form a less stable structure. These insertions were made into either the neomycin resistance marker (neo) or the hygromycin resistance marker (hyg) of the vector. 122c was stable in both neo and hyg, while 122IR was preferentially deleted in neo and was remarkably unstable in hyg. These results suggest that MCS are hotspots for deletion in retroviral vectors if they can form hairpin structures, and that hairpin structures can be highly unstable at certain locations in retroviral vectors.
This information may contribute to improved design of retroviral vectors for uses such as human gene therapy, and will contribute to a greater understanding of the basic science of retroviral reverse transcription.
Abstract:
Digital terrain models (DTMs) typically contain large numbers of postings, from hundreds of thousands to billions. Many algorithms that run on DTMs require topological knowledge of the postings, such as finding nearest neighbors or locating the posting closest to a chosen location. If the postings are arranged irregularly, topological information is costly to compute and to store. This paper offers a practical approach to organizing and searching irregularly-spaced data sets by presenting a collection of efficient algorithms (O(N), O(lg N)) that compute important topological relationships with only a simple supporting data structure. These relationships include finding the postings within a window, locating the posting nearest a point of interest, finding the neighborhood of postings nearest a point of interest, and ordering the neighborhood counter-clockwise. The algorithms depend only on two sorted arrays of two-element tuples, each holding a planimetric coordinate and an integer identification number indicating which posting the coordinate belongs to. There is one array for each planimetric coordinate (eastings and northings). These two arrays cost minimal overhead to create and store but permit the data to remain arranged irregularly.
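The supporting structure and the window query can be sketched as follows: one sorted array of (coordinate, id) tuples per planimetric axis, with a window query answered by two binary-search range scans and a set intersection. Function names and the exact tuple layout are illustrative assumptions, not the paper's code.

```python
import bisect

def build_index(postings):
    """Build the two sorted (coordinate, posting_id) arrays.

    postings: list of (easting, northing) tuples; ids are list indices.
    """
    east = sorted((e, i) for i, (e, n) in enumerate(postings))
    north = sorted((n, i) for i, (e, n) in enumerate(postings))
    return east, north

def window_query(east, north, e_min, e_max, n_min, n_max):
    """Ids of postings inside the axis-aligned window.

    Each range scan is O(lg N) to locate plus the matches it returns;
    the window result is the intersection of the two axis ranges.
    """
    def ids_in_range(arr, lo, hi):
        # (lo, -1) sorts before any real (lo, id); (hi, inf) after any (hi, id)
        a = bisect.bisect_left(arr, (lo, -1))
        b = bisect.bisect_right(arr, (hi, float('inf')))
        return {i for _, i in arr[a:b]}

    return ids_in_range(east, e_min, e_max) & ids_in_range(north, n_min, n_max)
```

Because the arrays store only a coordinate and an id, the postings themselves stay in their original, irregular arrangement, which is the property the paper emphasizes.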
Abstract:
Chromatographic fractionation of the cytotoxic n-hexane extract of Hopea odorata Roxb. leaves led to the isolation of eight lupane triterpenes, constituting the first report of lupane-type triterpenes from this plant source. Furthermore, 3,30-dioxolup-20(29)-en-28-oic acid (6) was isolated for the first time from a natural source. The structures were determined on the basis of spectroscopic methods, including 2D NMR analysis, and by comparison of the spectral data with literature values. Complete assignments of the 1H and 13C NMR data were achieved for all compounds. Finally, the cytotoxic activities of the isolated compounds against four human cell lines (PC3, MDA-MB-231, HT-29 and HCT116) were also reported.
Abstract:
We present the data structures and algorithms used in our approach for building domain ontologies from folksonomies and linked data. In this approach, we extract domain terms from folksonomies and enrich them with semantic information from the Linked Open Data cloud. As a result, we obtain a domain ontology that combines the emergent knowledge of social tagging systems with formal knowledge from ontologies.
Abstract:
This poster raises the issue of a research work oriented to the storage, retrieval, representation and analysis of dynamic geographic information (GI), taking into account its semantic, temporal, and spatiotemporal components. We intend to define a set of methods, rules, and restrictions for the adequate integration of these components into the primary elements of GI: theme, location, time [1]. We intend to establish and incorporate three new structures (layers) into the core of data storage by using mark-up languages: a semantic-temporal structure, a geosemantic structure, and an incremental spatiotemporal structure. The ultimate objective is the modelling and representation of the dynamic nature of geographic features, establishing mechanisms to store geometries enriched with a temporal structure (regardless of space) and a set of semantic descriptors detailing and clarifying the nature of the represented features and their temporality. Data would thus be provided with the capability of pinpointing and expressing their own basic and temporal characteristics, enabling them to interact with each other according to their context and to the time and meaning relationships that could eventually be established between them.
Abstract:
Once the advantages of object-based classification over pixel-based classification are admitted, the need arises for simple and affordable methods to define and characterize the objects to be classified. This paper presents a new methodology for the identification and characterization of objects at different scales, through the integration of the spectral information provided by a multispectral image and the textural information from the corresponding panchromatic image. In this way, a set of objects is defined that yields a simplified representation of the information contained in the two source images. These objects can be characterized by different attributes that allow discrimination between different spectral and textural patterns. This methodology facilitates information processing from both a conceptual and a computational point of view; the attribute vectors thus defined can be used directly as training-pattern input for certain classifiers, for example artificial neural networks. Growing Cell Structures have been used to classify the merged information.
Abstract:
Precise modeling of the program heap is fundamental for understanding the behavior of a program, and is thus of significant interest for many optimization applications. One of the fundamental heap properties that can be used in a range of optimization techniques is the sharing relationship between the elements of an array or collection. If an analysis can determine that the memory locations pointed to by different entries of an array (or collection) are disjoint, then in many cases loops that traverse the array can be vectorized or transformed into a thread-parallel version. This paper introduces several novel sharing properties over the concrete heap and corresponding abstractions to represent them. In conjunction with an existing shape analysis technique, these abstractions allow us to precisely resolve the sharing relations in a wide range of heap structures (arrays, collections, recursive data structures, composite heap structures) in a computationally efficient manner. The effectiveness of the approach is evaluated on a set of challenge problems from the JOlden and SPECjvm98 suites. Sharing information obtained from the analysis is used to achieve substantial thread-level parallel speedups.
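The disjointness property the analysis establishes statically can be illustrated with a concrete, dynamic check: walk the objects reachable from each array entry and verify that no object is reachable from two different entries. This is only an illustration of the property, not the paper's abstraction; all names here are hypothetical.

```python
def reachable(obj, seen=None):
    """Collect ids of all objects reachable from obj via attribute fields."""
    seen = set() if seen is None else seen
    if obj is None or id(obj) in seen:
        return seen
    seen.add(id(obj))
    # Follow instance attributes only; a real analysis would model all edges.
    if hasattr(obj, '__dict__'):
        for child in vars(obj).values():
            reachable(child, seen)
    return seen

def entries_disjoint(array):
    """True iff no heap object is reachable from two distinct entries.

    When this holds, a loop over the array touches disjoint heap regions,
    which is what licenses vectorization or thread-parallel execution.
    """
    claimed = set()
    for entry in array:
        region = reachable(entry)
        if region & claimed:
            return False  # two entries share a reachable object
        claimed |= region
    return True

class Node:
    """Tiny demo class: one object with a single reference field."""
    def __init__(self, child=None):
        self.child = child
```

For example, `entries_disjoint([Node(), Node()])` holds, while two `Node` entries wrapping the same shared object do not; the paper's contribution is deriving this answer from abstractions rather than by runtime traversal.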
Abstract:
Plant nonspecific lipid transfer proteins (nsLTPs) bind a wide variety of lipids, which allows them to perform disparate functions. Recent reports on their multifunctionality in plant growth processes have posed new questions about the versatile binding abilities of these proteins. The lack of binding specificity has customarily been explained in qualitative terms, on the basis of a supposed structural flexibility and the nonspecificity of hydrophobic protein-ligand interactions. We present here a computational study of protein-ligand complexes formed between five nsLTPs and seven lipids bound in two different ways in every receptor protein. After optimizing geometries in molecular dynamics calculations, we computed Poisson-Boltzmann electrostatic potentials, solvation energies, properties of the protein-ligand interfaces, and estimates of the binding free energies of the resulting complexes. Our results provide the first quantitative information on the ligand-binding abilities of nsLTPs, shed new light on protein-lipid interactions, and reveal new features that supplement commonly held assumptions about their lack of binding specificity.