918 results for entity


Relevance:

10.00%

Publisher:

Abstract:

Postgraduate project/dissertation submitted to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Pharmaceutical Sciences.

Relevance:

10.00%

Publisher:

Abstract:

Prefetching has been shown to be an effective technique for reducing user perceived latency in distributed systems. In this paper we show that even when prefetching adds no extra traffic to the network, it can have serious negative performance effects. Straightforward approaches to prefetching increase the burstiness of individual sources, leading to increased average queue sizes in network switches. However, we also show that applications can avoid the undesirable queueing effects of prefetching. In fact, we show that applications employing prefetching can significantly improve network performance, to a level much better than that obtained without any prefetching at all. This is because prefetching offers increased opportunities for traffic shaping that are not available in the absence of prefetching. Using a simple transport rate control mechanism, a prefetching application can modify its behavior from a distinctly ON/OFF entity to one whose data transfer rate changes less abruptly, while still delivering all data in advance of the user's actual requests.
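
To make the traffic-shaping point concrete, the following toy single-queue simulation (a sketch only, not the paper's model; all names and parameter values are invented) contrasts an ON/OFF burst source with a rate-paced prefetcher delivering the same amount of data:

# Toy illustration: compare an ON/OFF burst source against a rate-paced
# prefetching source feeding a single queue drained at a fixed service rate.
# All parameters are hypothetical and chosen only to show the effect.

def simulate(arrivals, service_rate):
    """Return the peak queue length seen for a sequence of per-tick arrivals."""
    queue, peak = 0.0, 0.0
    for a in arrivals:
        queue = max(0.0, queue + a - service_rate)
        peak = max(peak, queue)
    return peak

ticks = 100           # length of the prefetching window
demand = 200          # packets the user will eventually request
service_rate = 3      # packets the switch can forward per tick

# ON/OFF source: push the whole prefetch as one burst at tick 0.
burst = [demand] + [0] * (ticks - 1)

# Rate-controlled prefetcher: spread the same data evenly, ahead of need.
paced = [demand / ticks] * ticks

print("peak queue, burst :", simulate(burst, service_rate))
print("peak queue, paced :", simulate(paced, service_rate))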

Relevance:

10.00%

Publisher:

Abstract:

Growing interest in inference and prediction of network characteristics is justified by its importance for a variety of network-aware applications. One widely adopted strategy to characterize network conditions relies on active, end-to-end probing of the network. Active end-to-end probing techniques differ in (1) the structural composition of the probes they use (e.g., number and size of packets, the destination of various packets, the protocols used, etc.), (2) the entity making the measurements (e.g., sender vs. receiver), and (3) the techniques used to combine measurements in order to infer specific metrics of interest. In this paper, we present Periscope: a Linux API that enables the definition of new probing structures and inference techniques from user space through a flexible interface. Periscope requires no support from clients beyond the ability to respond to ICMP ECHO REQUESTs and is designed to minimize user/kernel crossings and to ensure various constraints (e.g., back-to-back packet transmissions, fine-grained timing measurements). We show how to use Periscope for two different probing purposes, namely the measurement of shared packet losses between pairs of endpoints and the measurement of subpath bandwidth. Results from Internet experiments for both of these goals are also presented.
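
As a rough illustration of the first probing goal (detecting shared packet losses), the sketch below compares the observed joint loss rate of probe pairs against the rate expected if the two receivers lost packets independently; this is not Periscope code, and the probe outcomes are invented:

# Hedged sketch: infer whether two receivers share a lossy path segment from
# the outcomes of back-to-back probe pairs. Each record is
# (received_by_A, received_by_B); the data below is made up for illustration.

outcomes = [
    (True, True), (False, False), (True, True), (False, False),
    (True, False), (True, True), (False, False), (True, True),
]

loss_a = sum(1 for a, _ in outcomes if not a) / len(outcomes)
loss_b = sum(1 for _, b in outcomes if not b) / len(outcomes)
joint = sum(1 for a, b in outcomes if not a and not b) / len(outcomes)

# Under independent (non-shared) losses, joint ~= loss_a * loss_b; a joint
# loss rate well above that product suggests a shared bottleneck.
print(f"P(loss A)={loss_a:.2f}  P(loss B)={loss_b:.2f}  "
      f"P(both)={joint:.2f}  independence estimate={loss_a * loss_b:.2f}")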

Relevance:

10.00%

Publisher:

Abstract:

The observations of Hooke (1665), Schleiden & Schwann (1839) and Virchow (1855) led to the identification of the cell as the basic structural unit of living material. In the intervening years, it has been firmly established that the chemical processes which underlie the proper functioning, development and reproduction of the organism are cellular activities. The development of the electron microscope has enabled cell structure to be studied in detail. A picture of the cell as an entity with a complex and highly organised internal structure has emerged from the work of Palade, Porter, Fernandez-Moran and many others. Although cells from different tissues and organisms differ in aspects of their structure and consequently in function, they have several features in common. A retentive membrane encloses a number of cell constituents, which include membrane-enclosed subcellular structures known as organelles. The cells of most tissues also contain a reticulum or system of branching tubules. The interplay of the biochemical activities of these structures enables the cell to function. Almost thirty years ago, Claude, Palade, Schneider, Hogeboom, de Duve and others set out to analytically fractionate the subcellular components obtained after the fragmentation of liver cells. This approach has become known as subcellular fractionation, and signalled a major conceptual breakthrough in biochemistry (reviewed by de Duve, 1964, 1967, 1971). The significance of this breakthrough has been underlined by the award of the 1974 Nobel Prize in Medicine to de Duve, Palade and Claude. This thesis is concerned with the application of subcellular fractionation techniques to the separation and characterisation of the membrane systems of the rabbit skeletal muscle cell.

Relevance:

10.00%

Publisher:

Abstract:

Comfort is, in essence, satisfaction with the environment, and with respect to the indoor environment it is primarily satisfaction with the thermal conditions and air quality. Improving comfort has social, health and economic benefits, and is more financially significant than any other building cost. Despite this, comfort is not strictly managed throughout the building life-cycle. This is mainly due to the lack of an appropriate system to adequately manage comfort knowledge through the construction process into operation. Previous proposals to improve knowledge management have not been successfully adopted by the construction industry. To address this, the BabySteps approach was devised. BabySteps is an approach, proposed by this research, which states that for an innovation to be adopted into the industry it must be implementable through a number of small changes. This research proposes that improving the management of comfort knowledge will improve comfort. ComMet is a new methodology proposed by this research that manages comfort knowledge. It enables comfort knowledge to be captured, stored and accessed throughout the building life-cycle, so that it can be re-used in future stages of the building project and in future projects. It does this using the following: Comfort Performances, which are simplified numerical representations of the comfort of the indoor environment; they quantify the comfort at each stage of the building life-cycle using standard comfort metrics. Comfort Ratings, which are a means of classifying the comfort conditions of the indoor environment according to an appropriate standard; they are generated by comparing different Comfort Performances and provide additional information relating to the comfort conditions of the indoor environment that is not readily determined from the individual Comfort Performances. Comfort History, which is a continuous descriptive record of the comfort throughout the project, with a focus on documenting the items and activities, proposed and implemented, that could potentially affect comfort; each aspect of the Comfort History is linked to the relevant comfort entity it references. These three components create a comprehensive record of the comfort throughout the building life-cycle. They are then stored and made available in a common format in a central location, which allows them to be re-used ad infinitum. The LCMS system was developed to implement the ComMet methodology. It uses current and emerging technologies to capture, store and allow easy access to comfort knowledge as specified by ComMet. LCMS is an IT system that combines the following six components: Building Standards; Modelling & Simulation; Physical Measurement through the specially developed Egg-Whisk (Wireless Sensor) Network; Data Manipulation; Information Recording; and Knowledge Storage and Access. Results from a test-case application of the LCMS system (an existing office room at a research facility) highlighted that, while some aspects of comfort were being maintained, the building's environment was not in compliance with the acceptable levels stipulated by the relevant building standards. The implementation of ComMet, through LCMS, demonstrates how comfort, typically only considered during early design, can be measured and managed appropriately through systematic application of the methodology as a means of ensuring a healthy internal environment in the building.
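
As a minimal sketch of the Comfort Performance / Comfort Rating idea (the comfort band and rating thresholds below are assumptions for illustration, not values taken from ComMet or any standard):

# Illustrative sketch only: derive a simple "Comfort Performance" as the
# fraction of readings inside a comfort band, then map it to a "Comfort
# Rating". The band and thresholds are hypothetical.

hourly_temps_c = [20.5, 21.0, 23.8, 25.2, 22.1, 19.4, 21.7, 22.9]
comfort_band = (20.0, 24.0)   # assumed acceptable operative temperature range

in_band = [comfort_band[0] <= t <= comfort_band[1] for t in hourly_temps_c]
performance = sum(in_band) / len(in_band)   # fraction of time in comfort, 0.0-1.0

def rating(p):
    """Classify a Comfort Performance against hypothetical thresholds."""
    if p >= 0.95:
        return "A"
    if p >= 0.80:
        return "B"
    return "C"

print(f"Comfort Performance = {performance:.2f}, Comfort Rating = {rating(performance)}")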

Relevance:

10.00%

Publisher:

Abstract:

This portfolio consists of 15 original musical works. Taking the form of electronic and acousmatic music, multimedia, and scores, these chamber works are the result of experimentation and improvisation with individually built computer interfaces. The accompanying commentary provides discourse on the conceptual practice of these interfaces becoming a compositional entity that presents a multi-interpretative opportunity to explore, engage, and personalise. Following this, the commentary examines the path of creative decisions and musical choices that formed both these interfaces and the resulting musical and visual works. The portfolio is accompanied by the interfaces used, transcoded interfacing behavioural information, and documented improvisational findings.

Relevance:

10.00%

Publisher:

Abstract:

Open environments involve distributed entities interacting with each other in an open manner. Many distributed entities are unknown to each other but need to collaborate and share resources in a secure fashion. Usually resource owners alone decide who is trusted to access their resources. Since resource owners in open environments do not have a complete picture of all trusted entities, trust management frameworks are used to ensure that only authorized entities will access requested resources. Every trust management system has limitations, and these limitations can be exploited by malicious entities. One vulnerability is due to the lack of a globally unique interpretation for permission specifications. This limitation means that a malicious entity which receives a permission in one domain may misuse the permission in another domain via some deceptive but apparently authorized route; this malicious behaviour is called subterfuge. This thesis develops a secure approach, Subterfuge Safe Trust Management (SSTM), that prevents subterfuge by malicious entities. SSTM employs the Subterfuge Safe Authorization Language (SSAL), which uses the idea of a local permission with a globally unique interpretation (localPermission) to resolve the misinterpretation of permissions. We model and implement SSAL with an ontology-based approach, SSALO, which provides a generic representation for knowledge related to the SSAL-based security policy. SSALO enables the integration of heterogeneous security policies, which is useful for secure cooperation among principals in open environments where each principal may have a different security policy with a different implementation. Another advantage of an ontology-based approach is the Open World Assumption, whereby reasoning over an existing security policy is easily extended to include further security policies that might be discovered in an open distributed environment. We add two extra SSAL rules to support dynamic coalition formation and secure cooperation among coalitions. Secure federation of cloud computing platforms and secure federation of XMPP servers are presented as case studies of SSTM. The results show that SSTM provides robust accountability for the use of permissions in federation. It is also shown that SSAL is a suitable policy language to express subterfuge-safe policy statements due to its well-defined semantics, ease of use, and integrability.
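
The localPermission idea can be sketched as an issuer-qualified permission, so that the same local name issued in different domains can never be confused; this is only an illustration of the concept, not SSAL/SSALO syntax or an implementation of SSTM:

# Conceptual sketch: a permission carries the identity of the domain that
# defines its meaning, so "read" issued by domainA cannot be presented as if
# it were domainB's "read". Identifiers below are hypothetical.

from dataclasses import dataclass

@dataclass(frozen=True)
class LocalPermission:
    issuer: str      # globally unique issuer identifier, e.g. a domain URI
    action: str      # the issuer-local permission name

def authorize(granted: set, requested: LocalPermission) -> bool:
    """Honour a request only for the exact issuer-qualified permission."""
    return requested in granted

granted = {LocalPermission("https://domainA.example", "read")}

print(authorize(granted, LocalPermission("https://domainA.example", "read")))   # True
print(authorize(granted, LocalPermission("https://domainB.example", "read")))   # False: no cross-domain reuse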

Relevance:

10.00%

Publisher:

Abstract:

We report the case of a 49-year-old woman with idiopathic pulmonary fibrosis (IPF) initially diagnosed as systemic lupus erythematosus. IPF is an uncommon clinical entity with an estimated prevalence of 3 to 6 cases per 100,000 in the general population of the United States. This disease is characterised by an insidious onset, an unfavourable course and a poor survival prognosis (median survival: 2.8 years). The diagnosis is often difficult and depends on the exclusion of other diseases associated with interstitial lung injury. It is generally established only after collegial coordination between the clinician, the radiologist and the pathologist. New consensus statements have now been published to establish a clear and explicit classification of IPF. Moreover, because of the poor results obtained with conventional immunosuppressive drugs, new treatments are being proposed.

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: To define the biology driving the aggressive nature of breast cancer arising in young women. EXPERIMENTAL DESIGN: Among 784 patients with early-stage breast cancer, using prospectively defined, age-specific cohorts (young vs. older, ≥65 years), 411 eligible patients (n = 200 young; n = 211 older) with clinically annotated Affymetrix microarray data were identified. GSEA, signatures of oncogenic pathway deregulation and predictors of chemotherapy sensitivity were evaluated within the two age-defined cohorts. RESULTS: In comparing deregulation of oncogenic pathways between age groups, a higher probability of PI3K (p = 0.006) and Myc (p = 0.03) pathway deregulation was observed in breast tumors arising in younger women. When evaluating unique patterns of pathway deregulation, a low probability of Src and E2F deregulation in tumors of younger women, concurrent with a higher probability of PI3K, Myc, and beta-catenin deregulation, conferred a worse prognosis (HR = 4.15). In contrast, a higher probability of Src and E2F pathway activation in tumors of older women, with a concurrent low probability of PI3K, Myc and beta-catenin deregulation, was associated with poorer outcome (HR = 2.7). In multivariate analyses, genomic clusters of pathway deregulation demonstrated prognostic value. CONCLUSION: These results demonstrate that breast cancer arising in young women represents a distinct biologic entity characterized by unique patterns of deregulated signaling pathways that are prognostic, independent of currently available clinicopathologic variables. These results should enable refinement of targeted treatment strategies in this clinically challenging situation.

Relevance:

10.00%

Publisher:

Abstract:

Our ability to track an object as the same persisting entity over time and motion may primarily rely on spatiotemporal representations which encode some, but not all, of an object's features. Previous researchers using the 'object reviewing' paradigm have demonstrated that such representations can store featural information of well-learned stimuli such as letters and words at a highly abstract level. However, it is unknown whether these representations can also store purely episodic information (i.e. information obtained from a single, novel encounter) that does not correspond to pre-existing type-representations in long-term memory. Here, in an object-reviewing experiment with novel face images as stimuli, observers still produced reliable object-specific preview benefits in dynamic displays: a preview of a novel face on a specific object speeded the recognition of that particular face at a later point when it appeared again on the same object compared to when it reappeared on a different object (beyond display-wide priming), even when all objects moved to new positions in the intervening delay. This case study demonstrates that the mid-level visual representations which keep track of persisting identity over time ('object files', in one popular framework) can store not only abstract types from long-term memory, but also specific tokens from online visual experience.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: Phenotypic differences among species have long been systematically itemized and described by biologists in the process of investigating phylogenetic relationships and trait evolution. Traditionally, these descriptions have been expressed in natural language within the context of individual journal publications or monographs. As such, this rich store of phenotype data has been largely unavailable for statistical and computational comparisons across studies or integration with other biological knowledge. METHODOLOGY/PRINCIPAL FINDINGS: Here we describe Phenex, a platform-independent desktop application designed to facilitate efficient and consistent annotation of phenotypic similarities and differences using Entity-Quality syntax, drawing on terms from community ontologies for anatomical entities, phenotypic qualities, and taxonomic names. Phenex can be configured to load only those ontologies pertinent to a taxonomic group of interest. The graphical user interface was optimized for evolutionary biologists accustomed to working with lists of taxa, characters, character states, and character-by-taxon matrices. CONCLUSIONS/SIGNIFICANCE: Annotation of phenotypic data using ontologies and globally unique taxonomic identifiers will allow biologists to integrate phenotypic data from different organisms and studies, leveraging decades of work in systematics and comparative morphology.

Relevance:

10.00%

Publisher:

Abstract:

BACKGROUND: The wealth of phenotypic descriptions documented in the published articles, monographs, and dissertations of phylogenetic systematics is traditionally reported in a free-text format, and it is therefore largely inaccessible for linkage to biological databases for genetics, development, and phenotypes, and difficult to manage for large-scale integrative work. The Phenoscape project aims to represent these complex and detailed descriptions with rich and formal semantics that are amenable to computation and integration with phenotype data from other fields of biology. This entails reconceptualizing the traditional free-text characters into the computable Entity-Quality (EQ) formalism using ontologies. METHODOLOGY/PRINCIPAL FINDINGS: We used ontologies and the EQ formalism to curate a collection of 47 phylogenetic studies on ostariophysan fishes (including catfishes, characins, minnows, and knifefishes) and their relatives, with the goal of integrating these complex phenotype descriptions with information from an existing model organism database (zebrafish, http://zfin.org). We developed a curation workflow for the collection of character, taxonomic and specimen data from these publications. A total of 4,617 phenotypic characters (10,512 states) for 3,449 taxa, primarily species, were curated into EQ formalism (for a total of 12,861 EQ statements) using anatomical and taxonomic terms from teleost-specific ontologies (Teleost Anatomy Ontology and Teleost Taxonomy Ontology) in combination with terms from a quality ontology (Phenotype and Trait Ontology). Standards and guidelines for consistently and accurately representing phenotypes were developed in response to the challenges that were evident from two annotation experiments and from feedback from curators. CONCLUSIONS/SIGNIFICANCE: The challenges we encountered, and many of the curation standards and methods for improving consistency that we developed, are generally applicable to any effort to represent phenotypes using ontologies. This is because an ontological representation of the detailed variations in phenotype, whether between mutant and wild type, among individual humans, or across the diversity of species, requires a process by which a precise combination of terms from domain ontologies is selected and organized according to logical relations. The efficiencies that we have developed in this process will be useful for any attempt to annotate complex phenotypic descriptions using ontologies. We also discuss some ramifications of EQ representation for the domain of systematics.
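
As a minimal sketch of what a single EQ statement might look like as a data record (the taxon, entity and quality labels below are placeholders, not entries from the curated dataset):

# Conceptual sketch of an Entity-Quality (EQ) annotation: an anatomical
# entity term paired with a quality term for a given taxon. In practice the
# entity, quality and taxon would be identifiers from the Teleost Anatomy
# Ontology, the Phenotype and Trait Ontology and the Teleost Taxonomy
# Ontology; the values below are illustrative only.

from dataclasses import dataclass

@dataclass
class EQAnnotation:
    taxon: str     # taxonomic term
    entity: str    # anatomical entity term
    quality: str   # phenotypic quality term

example = EQAnnotation(taxon="an ostariophysan species",
                       entity="dorsal fin",
                       quality="reduced in size")

print(f"{example.taxon}: {example.entity} -> {example.quality}")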

Relevance:

10.00%

Publisher:

Abstract:

Much of science progresses within the tight boundaries of what is often seen as a "black box". Though familiar to funding agencies, researchers and the academic journals they publish in, it is an entity that outsiders rarely get to peek into. Crowdfunding is a novel means that allows the public to participate in, as well as to support and witness advancements in science. Here we describe our recent crowdfunding efforts to sequence the Azolla genome, a little fern with massive green potential. Crowdfunding is a worthy platform not only for obtaining seed money for exploratory research, but also for engaging directly with the general public as a rewarding form of outreach.

Relevance:

10.00%

Publisher:

Abstract:

New product design challenges, related to customer needs, product usage and environments, face companies when they expand their product offerings to new markets. Some of the main challenges are the lack of quantifiable information, product experience and field data. Designing reliable products under such challenges requires flexible reliability assessment processes that can capture the variables and parameters affecting the product's overall reliability and allow different design scenarios to be assessed. These challenges also suggest that a mechanistic (Physics of Failure, PoF) reliability approach would be a suitable framework for reliability assessment. Mechanistic reliability recognizes the primary factors affecting design reliability. This research views the designed entity as a "system of components required to deliver specific operations". It addresses the above-mentioned challenges by: firstly, developing a design synthesis that allows descriptive operations/system-component relationships to be realized; secondly, developing components' mathematical damage models that evaluate components' Time to Failure (TTF) distributions given (1) the descriptive design model, (2) customer usage knowledge and (3) design material properties; and lastly, developing a procedure that integrates the components' damage models to assess the mechanical system's reliability over time. Analytical and numerical simulation models were developed to capture the relationships between operations and components, the mathematical damage models and the assessment of the system's reliability. The process was able to affect the design form during the conceptual design phase by providing stress goals to meet components' reliability targets. It was also able to numerically assess the reliability of a system based on the components' mechanistic TTF distributions, as well as affecting the design of the components during the design embodiment phase. The process was used to assess the reliability of an internal combustion engine manifold during the design phase; the results were compared to field reliability data and were found to be conservative. The research focused on mechanical systems affected by independent mechanical failure mechanisms that are influenced by the design process. The influences of assembly and manufacturing stresses and defects are not a focus of this research.
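
As a minimal sketch of the final integration step, assuming independent failure mechanisms acting in series (so the system reliability at time t is the product of the component reliabilities); the component names and Weibull parameters below are invented, and this is not the thesis's actual procedure:

# Minimal sketch: system reliability over time from per-component Weibull
# time-to-failure models, assuming independent failure mechanisms in series.
# Component names and parameters are hypothetical.

import math

def weibull_reliability(t, shape, scale):
    """R(t) = exp(-(t/scale)^shape) for a Weibull TTF distribution."""
    return math.exp(-((t / scale) ** shape))

components = [                     # (name, shape beta, scale eta in hours)
    ("gasket", 2.0, 8000.0),
    ("flange", 1.5, 12000.0),
    ("fastener", 3.0, 15000.0),
]

for t in (1000, 5000, 10000):
    r_sys = 1.0
    for _, beta, eta in components:
        r_sys *= weibull_reliability(t, beta, eta)
    print(f"t = {t:>5} h  ->  R_system = {r_sys:.3f}")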

Relevance:

10.00%

Publisher:

Abstract:

Paper presented at the Cloud Forward Conference 2015, October 6th-8th, Pisa