994 results for Distributed Digital Preservation
Abstract:
Today, databases have become an integral part of information systems. Over the past two decades, different database systems have been developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches; however, no single methodology, generally accepted in academia or industry, has emerged to provide ubiquitous intelligent data access to heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High Performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty of resolving semantic heterogeneity, that is, of identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of this thesis include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system that uses the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for resolving semantic heterogeneity by investigating the extents of the meta-data constructs of component schemas, shown to be correct, complete, and unambiguous; (iii) a semi-automated technique for identifying semantic relations, the basis of the semantic knowledge used for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views over a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, which acts as the interface between the integration and query-processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of this work.
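As a rough illustration of the ontology-mediated identification of semantic relations described in contribution (iii): the sketch below maps each component-schema construct to a concept in a shared ontology and proposes candidate relations for an integrator to confirm. The classes, toy ontology, and matching rule are hypothetical, not the thesis's actual design.

```python
# Hypothetical sketch (not the thesis's code): propose candidate semantic
# relations between component-schema constructs via a shared ontology.
from dataclasses import dataclass

@dataclass(frozen=True)
class Construct:
    schema: str      # component schema the construct belongs to
    name: str        # local name of the category/relation
    concept: str     # concept it maps to in the shared ontology

# Toy ontology: child concept -> parent concept (None for roots).
ONTOLOGY = {"Employee": "Person", "Student": "Person", "Person": None}

def related(a: str, b: str):
    """Return 'equivalent' or 'overlapping' if two concepts are related."""
    if a == b:
        return "equivalent"
    def ancestors(c):
        while c is not None:
            yield c
            c = ONTOLOGY.get(c)
    # Overlap when one concept is an ancestor of the other.
    if a in ancestors(b) or b in ancestors(a):
        return "overlapping"
    return None

def candidate_relations(constructs):
    """Semi-automated step: propose relations for an integrator to confirm
    before they enter the knowledge base."""
    out = []
    for i, x in enumerate(constructs):
        for y in constructs[i + 1:]:
            if x.schema != y.schema and (rel := related(x.concept, y.concept)):
                out.append((x, y, rel))
    return out

if __name__ == "__main__":
    cs = [Construct("HR_DB", "EMP", "Employee"),
          Construct("REGISTRAR_DB", "STUDENT", "Student"),
          Construct("PAYROLL_DB", "PERSONNEL", "Person")]
    for x, y, rel in candidate_relations(cs):
        print(f"{x.schema}.{x.name} <-{rel}-> {y.schema}.{y.name}")
```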
Abstract:
In certain European countries and the United States of America, canines have been successfully used in human scent identification. There is, however, limited scientific knowledge of the composition of human scent and of the detection mechanism that produces an alert from canines. This lack of information has resulted in successful legal challenges to human scent evidence in courts of law. The main objective of this research was to use science to validate the current practices of using human scent evidence in criminal cases. The goals of this study were to use Headspace Solid Phase Micro-Extraction Gas Chromatography/Mass Spectrometry (HS-SPME-GC/MS) to determine the optimum collection and storage conditions for human scent samples, to investigate whether the amount of DNA deposited upon contact with an object affects the alerts produced by human scent identification canines, and to create a prototype pseudo human scent that could be used for training purposes. Hand odor samples collected on different sorbent materials and exposed to various environmental conditions showed that human scent samples should be stored without prolonged exposure to UVA/UVB light to minimize changes to the overall scent profile. Various methods of collecting human scent from objects were also investigated, and it was determined that passive collection methods yield ten times more volatile organic compounds (VOCs) by mass than active collection methods. Using the polymerase chain reaction (PCR), no correlation was found between the amount of DNA deposited upon contact with an object and the alerts produced by human scent identification canines. Preliminary studies conducted to create a prototype pseudo human scent showed that it is possible to produce fractions of a human scent sample that can be presented to the canines to determine whether specific fractions or the entire sample is needed to produce alerts.
Abstract:
MEDEIROS, Rildeci; MELO, Erica S. F.; NASCIMENTO, M. S. Hemeroteca digital temática: socialização da informação em cinema. In: SEMINÁRIO NACIONAL DE BIBLIOTECAS UNIVERSITÁRIAS, 15., 2008, São Paulo. Anais eletrônicos... São Paulo: CRUESP, 2008. Available at: http://www.sbu.unicamp.br/snbu2008/anais/site/pdfs/3018.pdf
Abstract:
Presentation at the IIPC General Assembly, Reykjavik, 12 April 2016.
Abstract:
The cytokine hormone leptin is a key signalling molecule in many pathways that control physiological functions. Although leptin demonstrates structural conservation in mammals, there is evidence of positive selection in primates, lagomorphs, and chiropterans. We previously reported that the leptin genes of the grey and harbour seals (phocids) have significantly diverged from those of other mammals. We therefore further investigated the diversification of leptin in phocids, other marine mammals, and terrestrial taxa by sequencing the leptin genes of representative species. Phylogenetic reconstruction revealed that leptin diversification was pronounced within the phocid seals, with a high dN/dS ratio of 2.8, indicating positive selection. We found significant evidence of positive selection along the branch leading to the phocids and within the phocid clade, but not over the dataset as a whole. Structural predictions indicate that the individual residues under selection lie away from the leptin receptor (LEPR) binding site. Predictions of the surface electrostatic potential indicate that phocid seal leptin is notably different from other mammalian leptins, including those of the otariids. Cloning the leptin-binding domain of grey seal LEPR confirmed that it is structurally conserved. These data, viewed in toto, support the hypothesis that phocid leptin divergence is unlikely to have arisen by random mutation. Based upon these phylogenetic and structural assessments, and considering the comparative physiology and varying life histories among species, we postulate that the unique diving behaviour of phocids has produced this selection pressure. The Phocidae includes some of the deepest-diving species, yet phocids have the least-modified lung structure for coping with the pressure and volume changes experienced at depth. Greater surfactant production is therefore required to facilitate rapid lung re-inflation upon surfacing while maintaining patent airways. We suggest that this additional surfactant requirement is met by the leptin pulmonary surfactant production pathway, which normally appears to function only in the mammalian foetus.
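For context (standard background, not taken from the abstract): in its simplest counting form, ignoring multiple-hit corrections, the dN/dS statistic, often written \(\omega\), compares the rate of nonsynonymous substitutions per nonsynonymous site with the rate of synonymous substitutions per synonymous site,

\[
\omega \;=\; \frac{d_N}{d_S} \;=\; \frac{N_d / N}{S_d / S},
\]

where \(N\) and \(S\) are the numbers of nonsynonymous and synonymous sites in the alignment and \(N_d\) and \(S_d\) are the observed nonsynonymous and synonymous substitutions. Neutral evolution gives \(\omega \approx 1\) and purifying selection \(\omega < 1\), while \(\omega > 1\) signals positive selection; this is why the reported \(\omega = 2.8\) within the phocid clade is read as evidence that the divergence is adaptive rather than random.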
Abstract:
Master's final project submitted to the faculty of the Historic Preservation Program of the School of Architecture, Planning, and Preservation of the University of Maryland, College Park, in partial fulfillment of the requirements for the degree of Master of Historic Preservation, 2013.
Abstract:
We propose three research problems exploring the relations between trust and security in the setting of distributed computation. In the first problem, we study trust-based adversary detection in distributed consensus computation. The adversaries we consider behave arbitrarily, disobeying the consensus protocol. We propose a trust-based consensus algorithm with local and global trust evaluations. The algorithm can be abstracted as a two-layer structure, with the top layer running a trust-based consensus algorithm and the bottom layer, as a subroutine, executing a global trust update scheme. We utilize a set of pre-trusted nodes, called headers, to propagate local trust opinions throughout the network. This two-layer framework is flexible in that it can easily be extended with more complicated decision rules and global trust schemes.

The first problem assumes that normal nodes are homogeneous, i.e., a normal node is guaranteed to always behave as programmed. In the second and third problems, however, we assume that nodes are heterogeneous, i.e., given a task, the probability that a node generates a correct answer varies from node to node. The adversaries considered in these two problems are workers from the open crowd who either invest little effort in the tasks assigned to them or intentionally give wrong answers.

In the second part of the thesis, we consider a typical crowdsourcing task that aggregates input from multiple workers as a problem in information fusion. To cope with noisy and sometimes malicious input from workers, trust is used to model workers' expertise. In a multi-domain knowledge learning task, however, a scalar-valued trust is not sufficient to reflect a worker's trustworthiness in each domain. To address this issue, we propose a probabilistic model that jointly infers the multi-dimensional trust of workers, the multi-domain properties of questions, and the true labels of questions. Our model is flexible and can be extended to incorporate metadata associated with questions; to show this, we propose two extended models, one handling tasks with real-valued features and the other handling tasks with text features by incorporating topic models. Our models can effectively recover the trust vectors of workers, which can be very useful for future task assignment that adapts to workers' trust. These results can be applied to the fusion of information from multiple data sources such as sensors, human input, machine-learning results, or a hybrid of them.

In the second subproblem, we address crowdsourcing with adversaries under logical constraints. We observe that in real-life applications questions are often not independent; instead, there are logical relations between them. Similarly, the workers that provide answers are not independent of each other: answers given by workers with similar attributes tend to be correlated. We therefore propose a novel unified graphical model consisting of two layers. The top layer encodes domain knowledge, allowing users to express logical relations with first-order logic rules, and the bottom layer encodes a traditional crowdsourcing graphical model. Our model can be seen as a generalized probabilistic soft logic framework that encodes both logical relations and probabilistic dependencies. To solve the collective inference problem efficiently, we have devised a scalable joint inference algorithm based on the alternating direction method of multipliers.
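As a much simplified illustration of the multi-dimensional trust idea in the second part: the thesis infers trust jointly with labels in a probabilistic model, whereas the sketch below merely applies fixed per-domain trust as vote weights. All names and values are hypothetical.

```python
# Illustrative sketch only (not the thesis's model): aggregate crowd answers
# with per-domain trust, so a worker's vote counts more on questions from
# domains where that worker is trusted.
from collections import defaultdict

def fuse_answers(answers, trust):
    """answers: list of (worker, question, domain, label);
    trust: {worker: {domain: weight in [0, 1]}}.
    Returns {question: label} by trust-weighted vote."""
    scores = defaultdict(lambda: defaultdict(float))
    for worker, question, domain, label in answers:
        w = trust.get(worker, {}).get(domain, 0.5)  # 0.5 = unknown worker
        scores[question][label] += w
    return {q: max(votes, key=votes.get) for q, votes in scores.items()}

if __name__ == "__main__":
    trust = {"alice": {"biology": 0.9, "physics": 0.2},
             "bob":   {"biology": 0.3, "physics": 0.8}}
    answers = [("alice", "q1", "biology", "yes"),
               ("bob",   "q1", "biology", "no"),
               ("alice", "q2", "physics", "no"),
               ("bob",   "q2", "physics", "yes")]
    print(fuse_answers(answers, trust))  # {'q1': 'yes', 'q2': 'yes'}
```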
The third part of the thesis considers optimal assignment under budget constraints when workers are unreliable and sometimes malicious. In a real crowdsourcing market, each answer obtained from a worker incurs a cost, associated with both the worker's level of trustworthiness and the difficulty of the task. Typically, access to expert-level (more trustworthy) workers is more expensive than access to the average crowd, and completing a challenging task costs more than a click-away question. We address the problem of optimally assigning heterogeneous tasks to workers of varying trust levels under budget constraints. Specifically, we design a trust-aware task allocation algorithm that takes as input the estimated trust of workers and a pre-set budget, and outputs the optimal assignment of tasks to workers. We derive a bound on the total error probability that relates naturally to the budget, the trustworthiness of the crowd, and the costs of obtaining labels: a higher budget, a more trustworthy crowd, and less costly jobs all lower the theoretical bound. Our allocation scheme does not depend on the specific design of the trust evaluation component and can therefore be combined with generic trust evaluation algorithms.
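A toy greedy sketch of budget-constrained, trust-aware allocation follows; the dissertation's algorithm and its error-probability bound are not reproduced here, and the accuracy model, names, and parameters below are assumptions.

```python
# Toy sketch (not the dissertation's algorithm): greedily buy the label with
# the best expected-accuracy gain per unit cost, under a total budget.
import itertools

def trust_aware_allocation(tasks, workers, budget):
    """tasks: {task: difficulty in (0,1)}; workers: {worker: (trust, cost)}.
    Returns (assignments, amount spent); at most one label per pair."""
    labels = {t: [] for t in tasks}          # workers already asked, per task
    spent, plan = 0.0, []
    candidates = set(itertools.product(tasks, workers))
    while candidates:
        def gain(tw):
            t, w = tw
            trust, cost = workers[w]
            p = 0.5 + 0.5 * trust * (1.0 - tasks[t])  # P(correct), toy model
            redundancy = 1.0 / (1 + len(labels[t]))   # favor unlabeled tasks
            return (p - 0.5) * redundancy / cost
        best = max(candidates, key=gain)
        candidates.discard(best)
        t, w = best
        cost = workers[w][1]
        if spent + cost > budget:
            continue                          # cannot afford this label
        plan.append(best); labels[t].append(w); spent += cost
    return plan, spent

if __name__ == "__main__":
    tasks = {"easy_q": 0.2, "hard_q": 0.8}
    workers = {"expert": (0.9, 5.0), "crowd1": (0.4, 1.0), "crowd2": (0.4, 1.0)}
    plan, spent = trust_aware_allocation(tasks, workers, budget=8.0)
    print(plan, spent)   # cheap crowd labels first, expert where affordable
```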
Abstract:
Advances in digital photography and distribution technologies enable many people to produce and distribute images of their sex acts. When teenagers do this, the photos and videos they create can be legally classified as child pornography, since the law makes no exception for youth who create sexually explicit images of themselves. The dominant discussions about teenage girls producing sexually explicit media (including sexting) are profoundly unproductive: (1) they blame teenage girls for creating private images that another person later maliciously distributed, and (2) they fail to respect, or even discuss, teenagers' rights to freedom of expression. Cell phones and the internet make producing and distributing images extremely easy, providing widely accessible venues both for consensual sexual expression between partners and for sexual harassment. Dominant understandings view sexting as a troubling teenage trend created by the combination of camera phones and adolescent hormones and impulsivity, but this view often conflates consensual sexting between partners with the malicious distribution of a person's private image, treating them as essentially equivalent behaviors. In this project, I ask: what is the role of assumptions about teen girls' sexual agency in these problematic understandings of sexting that blame victims and deny teenagers' rights? In contrast to the popular media panic about online predators and the familiar accusation that youth waste their leisure time on digital media, some people champion the internet as a democratic space that offers young people the opportunity to explore identities and develop social and communication skills. Yet when teen girls' sexuality enters this conversation, all this debate and discussion narrows to a problematic consensus. The optimists about adolescents and technology fall silent, and the argument that media production is inherently empowering for girls does not seem to apply to a girl who produces a sexually explicit image of herself. Instead, feminist, popular, and legal commentaries assert that she is necessarily a victim: of a "sexualized" mass media, pressure from her male peers, digital technology, her brain structures or hormones, or her own low self-esteem and misplaced desire for attention. Why and how are teenage girls' sexual choices produced as evidence of their failure or success in achieving Western liberal ideals of self-esteem, resistance, and agency? Since mass media and policy reactions to sexting have so far been overwhelmingly sexist and counter-productive, it is crucial to interrogate the concepts and assumptions that characterize mainstream understandings of sexting. I argue that the common sense co-produced by law and mass media underlies the problematic legal and policy responses to sexting. Analyzing a range of nonfiction texts, including newspaper articles, talk shows, press releases, public service announcements, websites, legislative debates, and legal documents, I investigate gendered, racialized, age-based, and technologically determinist common-sense assumptions about teenage girls' sexual agency. I examine the consensus and continuities between news, nonfiction mass media, policy, institutions, and law, and describe the limits of their debates. I find that this early-21st-century post-feminist girl-power moment not only demands that girls live up to gendered sexual ideals but also insists that actively choosing to follow these norms is the only way to exercise sexual agency.
This is the first study to date to examine how conventional wisdom about digital media and teenage girls' sexuality relates to both policy and mass media.
Abstract:
Presentation from the MARAC conference in Roanoke, VA on October 7–10, 2015. S8 - Minimal Processing and Preservation: Friends or Foes?
Abstract:
Presented to the Preservation Section, Society of American Archivists, 2016 Annual Conference, Atlanta, Georgia.
Abstract:
“Breaking through the Margins: Pushing Sociopolitical Boundaries Through Historic Preservation” explores the ways in which contemporary grassroots organizations are adapting historic preservation methods to protect African American heritage in communities on the brink of erasure. This project emerges from an eighteen-month longitudinal study of three African American preservation organizations, one in College Park, Maryland and two in the Houston, Texas area, where gentrification or suburban sprawl has all but decimated the physical landscape of their communities. Grassroots preservationists in Lakeland (College Park, Maryland), St. John Baptist Church (Missouri City, Texas), and Freedmen’s Town (Houston, Texas) are pushing back against preservation practices that do not, or tend not to, take into consideration the narratives of African American communities. I argue that these organizations practice a form of preservation that provides immediate and lasting effects for communities hovering at the margins. This dissertation seeks to outline some of the major methodological approaches taken by Lakeland, St. John, and Freedmen’s Town. The preservation efforts put forth by the grassroots organizations in these communities faithfully work to remind us that history without preservation is lost. In taking on the critical work of pursuing social justice, these grassroots organizations are breaking through the margins of society using historic preservation as their medium.
Abstract:
Nitric oxide (NO) has long been known to regulate vessel tone. However, the close proximity of the site of NO production to “sinks” of NO, such as hemoglobin (Hb) in blood, suggests that blood will scavenge most of the NO produced, so it is unclear how NO is able to play its physiological roles. The current study examines mechanisms by which NO availability might nonetheless be preserved. To study the role of nitrosothiols and nitrite in preserving NO availability, the kinetics of glutathione (GSH) nitrosation by NO donors in aerated buffered solutions was investigated first. The results show an increase in the rate of formation of the corresponding nitrosothiol (GSNO) with increasing GSH, with a half-maximum constant (EC50) that depends on NO concentration, indicating a significant contribution of •NO2-mediated nitrosation to GSNO production. Next, the ability of nitrite to be reduced to NO in smooth muscle cells was evaluated. NO formation was inhibited by soluble guanylyl cyclase (sGC) inhibitors, accelerated by sGC activators, and independent of O2 concentration. Nitrite transport mechanisms and the effects of exogenous nitrate on nitrite transport and reduction were also examined. The results showed that sGC can mediate nitrite reduction to NO and that nitrite is transported across the smooth muscle cell membrane via anion channels, both of which can be attenuated by nitrate. Finally, a 2-D axisymmetric diffusion model was constructed to test the accumulation of NO in the smooth muscle layer from nitrite reduction. The simulations showed that, with physiological nitrite concentrations in the smooth muscle cells (SMC), the low sustained NO generated from nitrite reduction could maintain significant sGC activity and might affect vessel tone. The major nitrosating mechanism in the circulation at reduced O2 levels was found to be anaerobic, and a Cu+-dependent GSNO reduction activity was found to deliver minor amounts of NO from physiological GSNO levels in the tissue.
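A minimal numerical sketch of the kind of calculation such a model performs: this is a 1-D radial diffusion-reaction toy, not the study's 2-D axisymmetric model, and every parameter value below is an assumption chosen only for illustration.

```python
# Illustrative sketch only: NO accumulation in a smooth muscle layer from a
# constant nitrite-reduction source, explicit finite differences in radius.
import numpy as np

# All parameters are assumed, not taken from the study.
D = 3300.0            # NO diffusivity in tissue, um^2/s
k = 0.01              # first-order NO consumption rate, 1/s
q = 1e-3              # NO source from nitrite reduction in SMC, uM/s
r0, r1 = 50.0, 100.0  # inner/outer radius of the SMC layer, um
n = 100
dt, t_end = 2e-5, 2.0  # explicit scheme: dt must stay below dr^2 / (2 D)

r = np.linspace(r0, r1, n)
dr = r[1] - r[0]
c = np.zeros(n)        # NO concentration profile, uM

for _ in range(int(t_end / dt)):
    lap = np.zeros(n)
    # Radial Laplacian in cylindrical coordinates: c'' + c'/r
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dr**2 \
                + (c[2:] - c[:-2]) / (2 * dr * r[1:-1])
    c += dt * (D * lap + q - k * c)
    c[0] = 0.0         # lumen side: blood scavenges NO (perfect sink)
    c[-1] = c[-2]      # outer edge: zero-flux boundary
print(f"peak NO after {t_end:.0f} s: {c.max():.4f} uM")
```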
Abstract:
Efficient and reliable techniques for power delivery and utilization are needed to accommodate the increased penetration of renewable energy sources in electric power systems, as well as the current and future demands of plug-in electric vehicles and high-power electronic loads. Distributed control and optimal power network architectures will lead to viable solutions to the energy management problem with a high level of reliability and security. This dissertation develops and verifies new techniques for distributed control by deploying DC microgrids, involving distributed renewable generation and energy storage, within the operating AC power system. To this end, an energy system architecture was developed involving AC and DC networks, both with distributed generation and demands. The various components of the DC microgrid were designed and built, including DC-DC converters, voltage source inverters (VSI), and AC-DC rectifiers featuring novel designs developed by the candidate. New control techniques were developed and implemented to maximize the operating range of the power conditioning units used for integrating renewable energy into the DC bus. The control and operation of the DC microgrids in the hybrid AC/DC system involve intelligent energy management. Real-time energy management algorithms, based on intelligent decision-making elements and an optimization process, were developed and experimentally verified; they aim to enhance the overall performance of the power system and to mitigate the effect of heavy non-linear loads of variable intensity and duration. The developed algorithms were also used to manage the charging/discharging process of plug-in electric vehicle emulators. The protection of the proposed hybrid AC/DC power system was studied: fault analysis, protection schemes and coordination, and ideas for retrofitting protection concepts and devices currently available for AC systems to a DC network were presented. A study was also conducted on how changing the distribution architecture and distributing the storage assets across the zones of the network affect the system's dynamic security and stability. A practical shipboard power system was studied as an example of a hybrid AC/DC power system involving pulsed loads. The proposed hybrid AC/DC power system, along with most of the ideas, controls, and algorithms presented in this dissertation, was experimentally verified at the Smart Grid Testbed, Energy Systems Research Laboratory.
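For flavor, a minimal rule-based sketch of a single real-time energy management step in a DC microgrid; the dissertation's algorithms combine decision-making elements with an optimization process and are more sophisticated, and all names and limits below are hypothetical.

```python
# Minimal rule-based EMS sketch: serve load from PV first, then battery,
# then the AC grid. Illustrative only; not the dissertation's algorithm.
from dataclasses import dataclass

@dataclass
class Battery:
    soc: float          # state of charge, kWh
    capacity: float     # kWh
    max_rate: float     # kW (charge or discharge limit)

def ems_step(pv_kw: float, load_kw: float, batt: Battery, dt_h: float = 0.25):
    """One dispatch step. Returns (battery_power_kw, grid_power_kw);
    positive battery power = discharging, positive grid power = importing."""
    net = load_kw - pv_kw                      # residual demand (+) or surplus (-)
    if net >= 0:                               # discharge battery toward demand
        p_batt = min(net, batt.max_rate, batt.soc / dt_h)
    else:                                      # charge battery from PV surplus
        headroom = (batt.capacity - batt.soc) / dt_h
        p_batt = -min(-net, batt.max_rate, headroom)
    batt.soc -= p_batt * dt_h                  # update stored energy
    p_grid = net - p_batt                      # grid covers the remainder
    return p_batt, p_grid

if __name__ == "__main__":
    batt = Battery(soc=5.0, capacity=10.0, max_rate=3.0)
    for pv, load in [(6.0, 2.0), (1.0, 4.0), (0.0, 5.0)]:
        p_b, p_g = ems_step(pv, load, batt)
        print(f"pv={pv} load={load} -> batt={p_b:+.2f} kW "
              f"grid={p_g:+.2f} kW soc={batt.soc:.2f} kWh")
```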