866 results for Imaginary and Real
Abstract:
While ATM bandwidth-reservation techniques are able to offer the guarantees necessary for the delivery of real-time streams in many applications (e.g. live audio and video), they suffer from many disadvantages that make them unattractive (or impractical) for many others. These limitations, coupled with the flexibility and popularity of TCP/IP as a best-effort transport protocol, have prompted the network research community to propose and implement a number of techniques that adapt TCP/IP to the Available Bit Rate (ABR) and Unspecified Bit Rate (UBR) services in ATM network environments. This allows these environments to smoothly integrate (and make use of) currently available TCP-based applications and services without much (if any) modification. However, recent studies have shown that TCP/IP, when implemented over ATM networks, is susceptible to serious performance limitations. In a recently completed study, we unveiled a new transport protocol, TCP Boston, that turns ATM's 53-byte cell-oriented switching architecture into an advantage for TCP/IP. In this paper, we demonstrate the real-time features of TCP Boston that allow communication bandwidth to be traded off for timeliness. We start with an overview of the protocol. Next, we analytically characterize the dynamic redundancy control features of TCP Boston. We then present detailed simulation results that show the superiority of our protocol when compared to other adaptations of TCP/IP over ATMs. In particular, we show that TCP Boston improves TCP/IP's performance over ATMs for both network-centric metrics (e.g., effective throughput and percent of missed deadlines) and real-time application-centric metrics (e.g., response time and jitter).
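The abstract does not spell out the redundancy scheme, but the bandwidth-for-timeliness trade can be illustrated with a minimal parity sketch (the function names and the single-parity design are ours, not TCP Boston's): spending one extra XOR parity cell per group lets a receiver rebuild any single lost cell without waiting for a retransmission.

```python
def split_into_cells(payload: bytes, cell_size: int = 48) -> list:
    """Split a payload into fixed-size cells (an ATM cell carries 48 payload bytes)."""
    padded = payload + b"\x00" * (-len(payload) % cell_size)
    return [padded[i:i + cell_size] for i in range(0, len(padded), cell_size)]

def xor_cells(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def add_parity(cells: list) -> list:
    """Append one XOR parity cell: ~1/len(cells) extra bandwidth buys
    recovery of any single lost cell without retransmission."""
    parity = cells[0]
    for c in cells[1:]:
        parity = xor_cells(parity, c)
    return cells + [parity]

def recover(cells_with_loss: list, lost: int) -> bytes:
    """XOR of all surviving cells (including the parity cell) reconstructs the lost one."""
    surviving = [c for i, c in enumerate(cells_with_loss) if i != lost]
    rebuilt = surviving[0]
    for c in surviving[1:]:
        rebuilt = xor_cells(rebuilt, c)
    return rebuilt

# Demo: lose cell 2 in transit and rebuild it from the survivors.
cells = split_into_cells(b"hello world, this is a test payload!", 8)
coded = add_parity(cells)
rebuilt = recover(coded, 2)
```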
Abstract:
A fundamental task of vision systems is to infer the state of the world given some form of visual observations. From a computational perspective, this often involves facing an ill-posed problem; e.g., information is lost via projection of the 3D world into a 2D image. Solution of an ill-posed problem requires additional information, usually provided as a model of the underlying process. It is important that the model be both computationally feasible and theoretically well-founded. In this thesis, a probabilistic, nonlinear supervised computational learning model is proposed: the Specialized Mappings Architecture (SMA). The SMA framework is demonstrated in a computer vision system that can estimate the articulated pose parameters of a human body or human hands, given images obtained via one or more uncalibrated cameras. The SMA consists of several specialized forward mapping functions that are estimated automatically from training data, and a possibly known feedback function. Each specialized function maps certain domains of the input space (e.g., image features) onto the output space (e.g., articulated body parameters). A probabilistic model for the architecture is first formalized. Solutions to key algorithmic problems are then derived: simultaneous learning of the specialized domains along with the mapping functions, as well as performing inference given inputs and a feedback function. The SMA employs a variant of the Expectation-Maximization algorithm and approximate inference. The approach allows the use of alternative conditional independence assumptions for learning and inference, which are derived from a forward model and a feedback model. Experimental validation of the proposed approach is conducted in the task of estimating articulated body pose from image silhouettes. Accuracy and stability of the SMA framework are tested using artificial data sets, as well as synthetic and real video sequences of human bodies and hands.
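As a toy illustration of the learning problem described above (not the SMA algorithm itself), the sketch below runs EM on a mixture of two linear forward mappings: the E-step softly assigns each sample to the expert that explains it best, and the M-step refits each expert by weighted least squares, so the specialized domains and the mappings are learned simultaneously. The 1-D setting, the noise scale `sigma`, and all names are assumptions.

```python
import math

def fit_specialized_mappings(xs, ys, init_params, iters=20, sigma=0.05):
    """Toy EM for a mixture of linear forward mappings y ~ a_j*x + b_j."""
    params = list(init_params)
    for _ in range(iters):
        # E-step: Gaussian responsibilities of each expert for each sample.
        resp = []
        for x, y in zip(xs, ys):
            w = [math.exp(-((y - (a * x + b)) ** 2) / (2 * sigma ** 2))
                 for a, b in params]
            s = sum(w) or 1.0
            resp.append([wi / s for wi in w])
        # M-step: weighted least squares per expert.
        new_params = []
        for j, (a_old, b_old) in enumerate(params):
            W = sum(r[j] for r in resp)
            if W < 1e-12:                      # expert owns nothing; keep it
                new_params.append((a_old, b_old))
                continue
            mx = sum(r[j] * x for r, x in zip(resp, xs)) / W
            my = sum(r[j] * y for r, y in zip(resp, ys)) / W
            var = sum(r[j] * (x - mx) ** 2 for r, x in zip(resp, xs)) or 1e-12
            cov = sum(r[j] * (x - mx) * (y - my) for r, x, y in zip(resp, xs, ys))
            a = cov / var
            new_params.append((a, my - a * mx))
        params = new_params
    return params

# Noiseless toy data: one mapping is specialized to x < 0, the other to x > 0.
xs = [-0.9, -0.7, -0.5, -0.3, -0.1, 0.1, 0.3, 0.5, 0.7, 0.9]
ys = [x + 2 if x < 0 else x for x in xs]
experts = sorted(fit_specialized_mappings(xs, ys, [(0.0, 2.0), (0.0, 0.0)]),
                 key=lambda p: p[1])
```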
Abstract:
Scene flow methods estimate the three-dimensional motion field for points in the world, using multi-camera video data. Such methods combine multi-view reconstruction with motion estimation. This paper describes an alternative formulation for dense scene flow estimation that provides reliable results using only two cameras by fusing stereo and optical flow estimation into a single coherent framework. Internally, the proposed algorithm generates probability distributions for optical flow and disparity. Taking into account the uncertainty in the intermediate stages allows for more reliable estimation of the 3D scene flow than previous methods. To handle the aperture problems inherent in the estimation of optical flow and disparity, a multi-scale method along with a novel region-based technique is used within a regularized solution. This combined approach both preserves discontinuities and prevents over-regularization, two problems commonly associated with basic multi-scale approaches. Experiments with synthetic and real test data demonstrate the strength of the proposed approach.
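For the Gaussian special case, precision-weighted fusion of two uncertain estimates captures why carrying uncertainty through the intermediate stages helps: the fused variance is always smaller than either input's. This is an illustrative sketch, not the paper's algorithm; the function name and numbers are invented.

```python
def fuse_estimates(mean_a, var_a, mean_b, var_b):
    """Precision-weighted fusion of two uncertain measurements of the same
    quantity (e.g. a flow-based and a disparity-based estimate)."""
    prec_a, prec_b = 1.0 / var_a, 1.0 / var_b
    fused_var = 1.0 / (prec_a + prec_b)
    fused_mean = fused_var * (prec_a * mean_a + prec_b * mean_b)
    return fused_mean, fused_var

# A confident estimate (variance 0.25) dominates a vague one (variance 4.0).
mean, var = fuse_estimates(0.0, 4.0, 1.0, 0.25)
```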
Abstract:
Localization is an essential feature for many mobile wireless applications. Data collected from applications such as environmental monitoring, package tracking or position tracking is meaningless without knowledge of where it was gathered. Other applications use location information as a building block, for example geographic routing protocols, data dissemination protocols and location-based services such as sensing coverage. Most localization techniques trade off among features such as the deployment of special hardware, the level of accuracy and the computational power required. In this paper, we present an algorithm that extracts location constraints from connectivity information. Our solution, which requires no special hardware and only a small number of landmark nodes, uses two types of location constraints. Spatial constraints derive estimated locations by observing which nodes are within communication range of each other. Temporal constraints refine the areas computed by the spatial constraints, using properties of time and space extracted from a contact trace. The intuition behind the temporal constraints is to limit the possible locations of a node using its previous and future locations. To quantify the improvement gained by refining the nodes' estimated areas with temporal information, we performed simulations using synthetic and real contact traces. The results show this improvement and also the difficulties of using real traces.
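The two constraint types can be sketched on a coarse grid, assuming a known communication radius and maximum node speed (both values, the grid discretisation, and the landmark layout below are invented for illustration): the spatial constraint keeps points consistent with which landmarks were heard, and the temporal constraint discards points unreachable from the previous feasible area.

```python
import math

COMM_RANGE = 10.0   # assumed communication radius
MAX_SPEED = 1.0     # assumed maximum node speed (units per time step)

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def spatial_region(landmarks, heard, grid):
    """Spatial constraint: feasible points lie within range of every heard
    landmark and out of range of every unheard one."""
    region = set()
    for p in grid:
        in_heard = all(dist(p, pos) <= COMM_RANGE
                       for lid, pos in landmarks.items() if lid in heard)
        out_unheard = all(dist(p, pos) > COMM_RANGE
                          for lid, pos in landmarks.items() if lid not in heard)
        if in_heard and out_unheard:
            region.add(p)
    return region

def temporal_refine(region_now, region_prev, dt):
    """Temporal constraint: a node can only be somewhere reachable from a
    previously feasible location within dt at MAX_SPEED."""
    return {p for p in region_now
            if any(dist(p, q) <= MAX_SPEED * dt for q in region_prev)}

# Demo: landmark "A" heard first; one time step later both "A" and "B" heard.
grid = [(x, y) for x in range(21) for y in range(21)]
landmarks = {"A": (5, 5), "B": (15, 5)}
prev_area = spatial_region(landmarks, {"A"}, grid)
now_area = spatial_region(landmarks, {"A", "B"}, grid)
refined = temporal_refine(now_area, prev_area, dt=1.0)
```

Adding the temporal constraint strictly shrinks the feasible area in this example, which is exactly the refinement the abstract quantifies.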
Abstract:
A novel method that combines shape-based object recognition and image segmentation is proposed for shape retrieval from images. Given a shape prior represented in a multi-scale curvature form, the proposed method identifies the target objects in images by grouping oversegmented image regions. The problem is formulated in a unified probabilistic framework and solved by a stochastic Markov Chain Monte Carlo (MCMC) mechanism. By this means, object segmentation and recognition are accomplished simultaneously. Within each sampling move during the simulation process, probabilistic region grouping operations are influenced by both the image information and the shape similarity constraint. The latter constraint is measured by a partial shape matching process. A generalized parallel algorithm by Barbu and Zhu, combined with a large sampling jump and other implementation improvements, greatly speeds up the overall stochastic process. The proposed method supports the segmentation and recognition of multiple occluded objects in images. Experimental results are provided for both synthetic and real images.
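The accept/reject rule at the heart of any such MCMC simulation can be sketched generically. Here the states are plain labels rather than region groupings, and the weights are invented; in the paper's setting a state would be a grouping of regions and the weight would combine image likelihood and shape similarity.

```python
import random

def metropolis_counts(weights, steps=20000, seed=7):
    """Metropolis sampling over discrete states with unnormalised weights:
    propose a move, accept with probability min(1, w_new / w_old)."""
    rng = random.Random(seed)
    states = list(weights)
    current = states[0]
    counts = {s: 0 for s in states}
    for _ in range(steps):
        proposal = rng.choice(states)               # symmetric proposal
        if rng.random() < min(1.0, weights[proposal] / weights[current]):
            current = proposal                      # accept the move
        counts[current] += 1
    return counts

# Visit frequencies should track the target weights 1 : 2 : 4.
counts = metropolis_counts({"a": 1.0, "b": 2.0, "c": 4.0})
```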
Abstract:
The effectiveness of service provisioning in large-scale networks is highly dependent on the number and location of service facilities deployed at various hosts. The classical, centralized approach to determining the latter would amount to formulating and solving the uncapacitated k-median (UKM) problem (if the requested number of facilities is fixed), or the uncapacitated facility location (UFL) problem (if the number of facilities is also to be optimized). Clearly, such centralized approaches require knowledge of global topological and demand information, and thus do not scale and are not practical for large networks. The key question posed and answered in this paper is the following: "How can we determine in a distributed and scalable manner the number and location of service facilities?" We propose an innovative approach in which topology and demand information is limited to neighborhoods, or balls of small radius around selected facilities, whereas demand information is captured implicitly for the remaining (remote) clients outside these neighborhoods, by mapping them to clients on the edge of the neighborhood; the ball radius regulates the trade-off between scalability and performance. We develop a scalable, distributed approach that answers our key question through an iterative reoptimization of the location and the number of facilities within such balls. We show that even for small values of the radius (1 or 2), our distributed approach achieves performance under various synthetic and real Internet topologies that is comparable to that of optimal, centralized approaches requiring full topology and demand information.
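A toy version of the iterative reoptimization, assuming hop-count distances and unit demand: each facility may relocate within its radius-ball if that lowers the service cost. Unlike the paper's method, this sketch scores candidates with the global cost rather than ball-local demand with remote clients mapped to the ball edge; the graph and names are invented.

```python
from collections import deque

def bfs_dist(graph, src):
    """Hop distances from src in an unweighted graph (adjacency dict)."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in graph[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def service_cost(graph, facilities):
    """Each node attaches to its nearest facility; cost = total hop distance."""
    dists = [bfs_dist(graph, f) for f in facilities]
    return sum(min(d[v] for d in dists) for v in graph)

def reoptimize_in_balls(graph, facilities, radius=1):
    """One iteration: each facility relocates to the best node inside its
    radius-ball (the current location is always a candidate, so cost never rises)."""
    facs = list(facilities)
    for i, f in enumerate(facs):
        ball = [v for v, d in bfs_dist(graph, f).items() if d <= radius]
        facs[i] = min(ball, key=lambda c: service_cost(graph, facs[:i] + [c] + facs[i + 1:]))
    return facs

# Demo: one facility at the end of a 6-node path drifts toward the middle.
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
before = [0]
after = reoptimize_in_balls(graph, before, radius=1)
```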
Abstract:
Acute myeloid leukaemia refers to cancer of the blood and bone marrow characterised by the rapid expansion of immature blasts of the myeloid lineage. The aberrant proliferation of these blasts interferes with normal haematopoiesis, resulting in symptoms such as anaemia, poor coagulation and infections. The molecular mechanisms underpinning acute myeloid leukaemia are multi-faceted and complex, with a range of diverse genetic and cytogenetic abnormalities giving rise to the acute myeloid leukaemia phenotype. Amongst the most common causative factors are mutations of the FLT3 gene, which codes for a growth factor receptor tyrosine kinase required by developing haematopoietic cells. Disruptions to this gene can result in constitutively active FLT3, driving the de-regulated proliferation of undifferentiated precursor blasts. FLT3-targeted drugs provide the opportunity to inhibit this oncogenic receptor, but over time can give rise to resistance within the blast population. The identification of targetable components of the FLT3 signalling pathway may allow for combination therapies to be used to impede the emergence of resistance. However, the intracellular signal transduction pathway of FLT3 is relatively obscure. The objective of this study is to further elucidate this pathway, with particular focus on the redox signalling element which is thought to be involved. Signalling via reactive oxygen species is becoming increasingly recognised as a crucial aspect of physiological and pathological processes within the cell. The first part of this study examined the effects of NADPH oxidase-derived reactive oxygen species on the tyrosine phosphorylation levels of acute myeloid leukaemia cell lines. Using two-dimensional phosphotyrosine immunoblotting, a range of proteins were identified as undergoing tyrosine phosphorylation in response to NADPH oxidase activity. Ezrin, a cytoskeletal regulatory protein and substrate of Src kinase, was selected for further study. 
The next part of this study established that NADPH oxidase is subject to regulation by FLT3. Both wild type and oncogenic FLT3 signalling were shown to affect the expression of a key NADPH oxidase subunit, p22phox, and FLT3 was also demonstrated to drive intracellular reactive oxygen species production. The NADPH oxidase target protein, Ezrin, undergoes phosphorylation on two tyrosine residues downstream of FLT3 signalling, an effect which was shown to be p22phox-dependent and which was attributed to the redox regulation of Src. The cytoskeletal associations of Ezrin and its established role in metastasis prompted the investigation of the effects of FLT3 and NADPH oxidase activity on the migration of acute myeloid leukaemia cell lines. It was found that inhibition of either FLT3 or NADPH oxidase negatively impacted the motility of acute myeloid leukaemia cells. The final part of this study focused on the relationship between FLT3 signalling and phosphatase activity. It was determined, using phosphatase expression profiling and real-time PCR, that several phosphatases are subject to regulation at the levels of transcription and post-translational modification downstream of oncogenic FLT3 activity. In summary, this study demonstrates that FLT3 signal transduction utilises a NADPH oxidase-dependent redox element, which affects Src kinase, and modulates leukaemic cell migration through Ezrin. Furthermore, the expression and activity of several phosphatases are tightly linked to FLT3 signalling. This work reveals novel components of the FLT3 signalling cascade and indicates a range of potential therapeutic targets.
Cost savings from relaxation of operational constraints on a power system with high wind penetration
Abstract:
Wind energy is predominantly a nonsynchronous generation source. Large-scale integration of wind generation with existing electricity systems, therefore, presents challenges in maintaining system frequency stability and local voltage stability. Transmission system operators have implemented system operational constraints (SOCs) in order to maintain stability with high wind generation, but imposition of these constraints results in higher operating costs. A mixed integer programming tool was used to simulate generator dispatch in order to assess the impact of various SOCs on generation costs. Interleaved day-ahead scheduling and real-time dispatch models were developed to allow accurate representation of forced outages and wind forecast errors, and were applied to the proposed Irish power system of 2020 with a wind penetration of 32%. Savings of at least 7.8% in generation costs and reductions in wind curtailment of 50% were identified when the most influential SOCs were relaxed. The results also illustrate the need to relax local SOCs together with the system-wide nonsynchronous penetration limit SOC, as savings from increasing the nonsynchronous limit beyond 70% were restricted without relaxation of local SOCs. The methodology and results allow for quantification of the costs of SOCs, allowing the optimal upgrade path for generation and transmission infrastructure to be determined.
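The cost effect of relaxing the nonsynchronous penetration limit can be caricatured in a single-hour merit-order toy (the demand, wind and thermal-price figures are invented, and the paper's tool is a full mixed integer program, not this sketch): wind, with zero marginal cost, serves load up to the SNSP cap, and thermal plant covers the remainder.

```python
def hourly_cost(demand_mw, wind_mw, snsp_limit, thermal_price=80.0):
    """Single-hour dispatch toy: wind serves load up to snsp_limit * demand;
    any surplus wind is curtailed and thermal plant meets the residual load."""
    wind_used = min(wind_mw, snsp_limit * demand_mw)
    curtailed = wind_mw - wind_used
    thermal_cost = (demand_mw - wind_used) * thermal_price
    return thermal_cost, curtailed

# Relaxing the SNSP limit from 70% to 85% cuts both cost and curtailment.
cost_tight, curt_tight = hourly_cost(5000.0, 4000.0, 0.70)
cost_relaxed, curt_relaxed = hourly_cost(5000.0, 4000.0, 0.85)
```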
Abstract:
Although many feature selection methods for classification have been developed, there is a need to identify genes in high-dimensional data with censored survival outcomes. Traditional methods for gene selection in classification problems have several drawbacks. First, the majority of the gene selection approaches for classification are single-gene based. Second, many of the gene selection procedures are not embedded within the algorithm itself. The technique of random forests has been found to perform well in high-dimensional data settings with survival outcomes. It also has an embedded feature to identify variables of importance. Therefore, it is an ideal candidate for gene selection in high-dimensional data with survival outcomes. In this paper, we develop a novel method based on random forests to identify a set of prognostic genes. We compare our method with several machine learning methods and various node split criteria using several real data sets. Our method performed well in both simulations and real data analysis. Additionally, we have shown the advantages of our approach over single-gene-based approaches. Our method incorporates multivariate correlations in microarray data for survival outcomes. The described method allows us to better utilize the information available from microarray data with survival outcomes.
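The paper relies on the importance measure embedded in random forests; the related, model-agnostic idea of permutation importance can be sketched for any fitted classifier (the synthetic data and the stand-in classifier below are invented for illustration): a feature matters to the extent that shuffling its column hurts accuracy.

```python
import random

def accuracy(y_true, y_pred):
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def permutation_importance(predict, X, y, col, trials=20, seed=0):
    """Importance of one feature = average accuracy drop when that feature's
    column is shuffled, breaking its link to the outcome."""
    rng = random.Random(seed)
    base = accuracy(y, [predict(row) for row in X])
    drop = 0.0
    for _ in range(trials):
        shuffled = [row[col] for row in X]
        rng.shuffle(shuffled)
        Xp = [row[:col] + [v] + row[col + 1:] for row, v in zip(X, shuffled)]
        drop += base - accuracy(y, [predict(row) for row in Xp])
    return drop / trials

# Demo: labels depend only on "gene 0", so only gene 0 should matter.
_rng = random.Random(42)
X = [[_rng.random(), _rng.random()] for _ in range(200)]
y = [1 if row[0] > 0.5 else 0 for row in X]
classifier = lambda row: 1 if row[0] > 0.5 else 0   # stand-in for a fitted model
imp_gene0 = permutation_importance(classifier, X, y, 0)
imp_gene1 = permutation_importance(classifier, X, y, 1)
```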
Abstract:
Mitochondria are responsible for producing the vast majority of cellular ATP, and are therefore critical to organismal health [1]. They contain their own genomes (mtDNA), which encode 13 proteins that are all subunits of the mitochondrial respiratory chain (MRC) and are essential for oxidative phosphorylation [2]. mtDNA is present in multiple copies per cell, usually between 10³ and 10⁴, though this number is reduced during certain developmental stages [3, 4]. The health of the mitochondrial genome is also important to the health of the organism, as mutations in mtDNA lead to human diseases that collectively affect approximately 1 in 4000 people [5, 6]. mtDNA is more susceptible than nuclear DNA (nucDNA) to damage by many environmental pollutants, for reasons including the absence of Nucleotide Excision Repair (NER) in the mitochondria [7]. NER is a highly functionally conserved DNA repair pathway that removes bulky, helix-distorting lesions such as those caused by ultraviolet C (UVC) radiation and also many environmental toxicants, including benzo[a]pyrene (BaP) [8]. While these lesions cannot be repaired, they are slowly removed through a process that involves mitochondrial dynamics and autophagy [9, 10]. However, when present during development in C. elegans, this damage reduces mtDNA copy number and ATP levels [11]. We hypothesize that this damage, when present during development, will result in mitochondrial dysfunction and increase the potential for adverse outcomes later in life.
To test this hypothesis, 1st larval stage (L1) C. elegans are exposed to 3 doses of 7.5 J/m² ultraviolet C radiation 24 hours apart, leading to the accumulation of mtDNA damage [9, 11]. After exposure, many mitochondrial endpoints are assessed at multiple time points later in life. mtDNA and nucDNA damage levels and genome copy numbers are measured via QPCR and real-time PCR, respectively, every 2 days for 10 days. Steady-state ATP levels are measured via luciferase-expressing reporter strains and traditional ATP extraction methods. Oxygen consumption is measured using a Seahorse XFe24 extracellular flux analyzer. Gene expression changes are measured via real-time PCR, and targeted metabolomics via LC-MS is used to investigate changes in organic acid, amino acid and acyl-carnitine levels. Lastly, nematode developmental delay is assessed as growth, measured via imaging and COPAS biosort.
I have found that, despite being slowly removed, UVC-induced mtDNA damage during development leads to persistent deficits in energy production later in life. mtDNA copy number is permanently reduced, as are ATP levels, though oxygen consumption is increased, indicating inefficient or uncoupled respiration. Metabolomic data and mutant sensitivity indicate a role for NADPH and oxidative stress in these results, and exposed nematodes are more sensitive to the mitochondrial poison rotenone later in life. These results fit with the developmental origins of health and disease hypothesis, and show the potential for environmental exposures to have lasting effects on mitochondrial function.
Lastly, we are currently working to investigate the potential for irreparable mtDNA lesions to drive mutagenesis in mtDNA. Mutations in mtDNA lead to a wide range of diseases, yet we currently do not understand the environmental component of what causes them. In vitro evidence suggests that UVC-induced thymine dimers can be mutagenic [12]. We are using duplex sequencing of C. elegans mtDNA to determine mutation rates in nematodes exposed to our serial UVC protocol. Furthermore, by including mutant strains deficient in mitochondrial fission and mitophagy, we hope to determine whether deficiencies in these processes further increase mtDNA mutation rates, as they are implicated in human diseases.
Abstract:
This paper describes work towards the deployment of flexible self-management into real-time embedded systems. A challenging project which focuses specifically on the development of a dynamic, adaptive automotive middleware is described, and the specific self-management requirements of this project are discussed. These requirements have been identified through the refinement of a wide-ranging set of use cases requiring context-sensitive behaviours. A sample of these use cases is presented to illustrate the extent of the demands for self-management. The strategy that has been adopted to achieve self-management, based on the use of policies, is presented. The embedded and real-time nature of the target system brings the constraints that dynamic adaptation capabilities must not require changes to the run-time code (except during hot update of complete binary modules), that adaptation decisions must have low latency, and, because the target platforms are resource-constrained, that the self-management mechanism must have low resource requirements (especially in terms of processing and memory). Policy-based computing is thus an ideal candidate for achieving self-management, because the policy itself is loaded at run-time and can be replaced or changed in the future in the same way that a data file is loaded. Policies represent a relatively low-complexity and low-risk means of achieving self-management, with low run-time costs. Policies can be stored internally in ROM (such as default policies) as well as externally to the system. The architecture of a designed-for-purpose powerful yet lightweight policy library is described. A suitable evaluation platform, supporting the whole life-cycle of feasibility analysis, concept evaluation, development, rigorous testing and behavioural validation, has been devised and is described.
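A minimal sketch of the policy-as-data idea described above, with invented signal names, thresholds and actions (this is not the project's actual policy language): behaviour changes by swapping the policy document, not the deployed code.

```python
import json

# A policy is plain data: it can sit in ROM as a default or be hot-swapped
# at run-time like a data file, with no change to the run-time code.
# (Signal names, thresholds and actions are invented for illustration.)
POLICY = json.loads("""
{"rules": [
   {"signal": "engine_temp", "above": 110, "action": "limit_power"},
   {"signal": "engine_temp", "above": 95,  "action": "boost_cooling"}],
 "default": "normal"}
""")

def evaluate(policy, signals):
    """First matching rule wins: a single linear scan gives low-latency,
    low-footprint context-sensitive decisions."""
    for rule in policy["rules"]:
        if signals.get(rule["signal"], float("-inf")) > rule["above"]:
            return rule["action"]
    return policy["default"]
```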
Abstract:
In terms of a general time theory which addresses time-elements as typed point-based intervals, a formal characterization of time-series and state-sequences is introduced. Based on this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance of inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for larger time-series databases and real-life applications such as Content-based Video Retrieval (CBVR).
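A brute-force sketch of the bipartite formulation, with an assumed two-term hybrid similarity over (value, timestamp) states (the weights and the similarity form are invented): because this is a matching rather than an ordered alignment, inverted or crossed-over state pairs can still be matched.

```python
from itertools import permutations

def state_similarity(a, b, w_value=0.7, w_time=0.3):
    """Hybrid similarity: a non-temporal (value) term and a temporal term,
    with assumed mixing weights. States are (value, timestamp) pairs."""
    value_sim = 1.0 / (1.0 + abs(a[0] - b[0]))
    time_sim = 1.0 / (1.0 + abs(a[1] - b[1]))
    return w_value * value_sim + w_time * time_sim

def best_matching(seq_a, seq_b):
    """Max-weight bipartite matching by brute force over permutations
    (adequate for the short subsequences compared here)."""
    n = len(seq_a)
    score = lambda p: sum(state_similarity(seq_a[i], seq_b[p[i]]) for i in range(n))
    best = max(permutations(range(n)), key=score)
    return list(best), score(best)

# Two sequences whose states appear in inverted order still match well.
match, total = best_matching([(1.0, 0), (2.0, 1)], [(2.0, 0), (1.0, 1)])
```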
Abstract:
A video annotation system includes clip organization, feature description and pattern determination. This paper presents a system for basketball zone-defence detection. In particular, a character-angle-based descriptor for feature description is proposed. Experimental results in basketball zone-defence detection demonstrate that the descriptor is robust for both simulations and real-life cases, with low sensitivity to the disturbance caused by local translation of subprime defenders. The framework can be easily applied to other team sports.
Abstract:
In Spain, during the recent housing bubble, purchasing a home seemed the most advantageous strategy to access housing, and there was a wide social consensus about the unavoidability of mortgage indebtedness. However, such consensus has been challenged by the financial and real-estate crisis. The victims of home repossessions have been affected by the transgression of several principles, such as the fair compensation for effort and sacrifice, the prioritisation of basic needs over financial commitments, the possibility of a second chance for over-indebted people, or the State's responsibility to guarantee its citizens' livelihood. Such principles may be understood as part of a moral economy, and their transgression has resulted in the emergence of a social movement, the Plataforma de Afectados por la Hipoteca (PAH), that is questioning the legitimacy of mortgage debts. The article reflects on the extent to which the perception of over-indebtedness and evictions as unfair situations can have an effect on the reproduction of the political-economic system, insofar as the latter is perceived as able or unable to repair injustice.
Abstract:
BACKGROUND AND PURPOSE: To describe the clinical implementation of dynamic multileaf collimation (DMLC). Custom compensated four-field treatments of carcinoma of the bladder have been used as a simple test site for the introduction of intensity modulated radiotherapy. MATERIALS AND METHODS: Compensating intensity modulations are calculated from computed tomography (CT) data, accounting for scattered, as well as primary, radiation. Modulations are converted to multileaf collimator (MLC) leaf and jaw settings for dynamic delivery on a linear accelerator. A full dose calculation is carried out, accounting for dynamic leaf and jaw motion and transmission through these components. Before treatment, a test run of the delivery is performed and an absolute dose measurement made in a water or solid-water phantom. Treatments are verified by in vivo diode measurements and real-time electronic portal imaging. RESULTS: Seven patients have been treated using DMLC. The technique improves dose homogeneity within the target volume, reducing high-dose areas and compensating for loss of scatter at the beam edge. A typical total treatment time is 20 min. CONCLUSIONS: Compensated bladder treatments have proven an effective test site for DMLC in an extremely busy clinic.