901 results for "Page name matching"
Abstract:
The evolution of pharmaceutical competition since Congress passed the Hatch-Waxman Act in 1984 raises questions about whether the act's intended balance of incentives for cost savings and continued innovation has been achieved. Generic drug usage and challenges to brand-name drugs' patents have increased markedly, resulting in greatly increased cost savings but also potentially reduced incentives for innovators. Congress should review whether Hatch-Waxman is achieving its intended purpose of balancing incentives for generics and innovation. It should also consider whether the law should be amended so that some of its provisions are brought more in line with recently enacted legislation governing approval of so-called biosimilars, the biologics counterpart of generic competition for small-molecule drugs.
Abstract:
Telecentric optical computed tomography (optical-CT) is a state-of-the-art method for visualizing and quantifying 3-dimensional dose distributions in radiochromic dosimeters. In this work a prototype telecentric system (DFOS-Duke Fresnel Optical-CT Scanner) is evaluated which incorporates two substantial design changes: the use of Fresnel lenses (reducing lens costs from $10-30K to $1-3K) and the use of a 'solid tank' (which reduces noise and the volume of refractively matched fluid from 1 L to 10 cc). The efficacy of DFOS was evaluated by direct comparison against commissioned scanners in our lab. Measured dose distributions from all systems were compared against the predicted dose distributions from a commissioned treatment planning system (TPS). Three treatment plans were investigated: a simple four-field box treatment, a multiple small-field delivery, and a complex IMRT treatment. Dosimeters were imaged within 2 h post irradiation, using consistent scanning techniques (360 projections acquired at 1 degree intervals, reconstruction at 2 mm). DFOS efficacy was evaluated through inspection of dose line-profiles, and 2D and 3D dose and gamma maps. DFOS/TPS gamma pass rates with 3%/3mm dose-difference/distance-to-agreement criteria ranged from 89.3% to 92.2%, compared with 95.6% to 99.0% obtained with the commissioned system. The 3D gamma pass rate between the commissioned system and DFOS was 98.2%. Typical noise rates in DFOS reconstructions were up to 3%, compared with under 2% for the commissioned system. In conclusion, while the introduction of a solid tank proved advantageous with regard to cost and convenience, further work is required to improve the image quality and dose reconstruction accuracy of the new DFOS optical-CT system.
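The 3%/3mm gamma evaluation used above can be made concrete with a toy 1-D computation. The sketch below is a minimal, globally normalized gamma index in pure Python; the function name, parameters and example profile are illustrative and are not part of the DFOS or TPS software:

```python
import math

def gamma_1d(measured, reference, spacing_mm, dose_crit=0.03, dist_crit_mm=3.0):
    """Simplified 1-D gamma index with global dose normalization.

    measured/reference: dose samples on the same uniform grid.
    Returns the fraction of points with gamma <= 1 (the pass rate).
    """
    d_max = max(reference)  # global normalization dose
    passed = 0
    for i, dm in enumerate(measured):
        best = float("inf")
        for j, dr in enumerate(reference):
            dd = (dm - dr) / (dose_crit * d_max)      # dose-difference term
            dx = (i - j) * spacing_mm / dist_crit_mm  # distance-to-agreement term
            best = min(best, math.hypot(dd, dx))
        passed += best <= 1.0
    return passed / len(measured)

profile = [0.0, 10.0, 50.0, 100.0, 50.0, 10.0, 0.0]
print(gamma_1d(profile, profile, spacing_mm=2.0))  # 1.0 - identical profiles pass everywhere
```

A real implementation searches in 3-D and interpolates between grid points; the brute-force scan here only illustrates how the 3%/3mm criterion combines dose difference and distance-to-agreement.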
Abstract:
Fractal video compression is a relatively new video compression method. Its attraction lies in its high compression ratio and simple decompression algorithm, but its computational complexity is high, so parallel algorithms on high-performance machines are one way out. In this study we partition the matching search, which occupies the majority of the work in fractal video compression, into small tasks and implement them in two distributed computing environments, one using DCOM and the other using .NET Remoting technology, based on a local area network consisting of loosely coupled PCs. Experimental results show that the parallel algorithm achieves a high speedup in these distributed environments.
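The partitioning idea is easy to sketch: each range block's search for its best-matching domain block is independent, so blocks can be farmed out to workers. The toy 1-D version below uses Python threads as a stand-in for the DCOM/.NET Remoting workers described in the abstract; the names and the squared-error criterion are illustrative assumptions:

```python
from concurrent.futures import ThreadPoolExecutor

def best_match(range_block, domain_blocks):
    """Exhaustive matching search for one range block (toy 1-D version):
    return the index of the domain block with the smallest squared error."""
    def err(d):
        return sum((r - x) ** 2 for r, x in zip(range_block, d))
    return min(range(len(domain_blocks)), key=lambda i: err(domain_blocks[i]))

def parallel_matching(range_blocks, domain_blocks, workers=4):
    # Each range block is an independent task, so the search partitions cleanly.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda rb: best_match(rb, domain_blocks), range_blocks))

domains = [[0, 0, 0, 0], [1, 2, 3, 4], [9, 9, 9, 9]]
ranges_ = [[1, 2, 3, 5], [8, 9, 9, 8]]
print(parallel_matching(ranges_, domains))  # [1, 2]
```

In a real codec the blocks are 2-D image regions and the search also optimizes contrast/brightness parameters, but the task decomposition is the same.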
Abstract:
This paper introduces a mechanism for representing and recognizing case history patterns with rich internal temporal aspects. A case history is characterized as a collection of elemental cases, as in conventional case-based reasoning systems, together with corresponding temporal constraints that may be relative and/or carry absolute values. A graphical representation for case histories is proposed as a directed, partially weighted and labeled simple graph. In terms of this graphical representation, an eigen-decomposition graph matching algorithm is proposed for recognizing case history patterns.
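A minimal sketch of such a representation using plain dictionaries: elemental cases are nodes, and directed edges carry relative temporal constraints (here a (min, max) delay in hours). The field names, example cases and consistency check are invented for illustration and are not the paper's notation:

```python
# Elemental cases are nodes; directed edges carry relative temporal
# constraints as a (min, max) delay in hours. Absolute times can be
# assigned separately. All names here are illustrative.
case_history = {
    "fever":    {"severity": "high"},
    "rash":     {"site": "torso"},
    "recovery": {},
}
constraints = {
    ("fever", "rash"): (12, 48),       # rash 12-48 h after fever onset
    ("rash", "recovery"): (72, 240),   # recovery 3-10 days after rash
}

def consistent(times, constraints):
    """Check absolute-time assignments (hours) against the relative
    constraints; nodes without an assigned time are skipped."""
    for (a, b), (lo, hi) in constraints.items():
        if a in times and b in times:
            if not lo <= times[b] - times[a] <= hi:
                return False
    return True

print(consistent({"fever": 0, "rash": 24, "recovery": 120}, constraints))  # True
print(consistent({"fever": 0, "rash": 6}, constraints))                    # False
```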
Abstract:
In this paper, we critically examine a special class of graph matching algorithms that follow the approach of node-similarity measurement. A high-level algorithmic framework, the node-similarity graph matching framework (NSGM framework), is proposed, from which many existing graph matching algorithms can be subsumed, including the eigen-decomposition method of Umeyama, the polynomial-transformation method of Almohamad, the hubs-and-authorities method of Kleinberg, and the Kronecker-product successive projection methods of Wyk, among others. In addition, improved algorithms can be developed from the NSGM framework with respect to the corresponding results in graph theory. We also observe that, in general, any algorithm that can be subsumed under the NSGM framework fails to work well for graphs with non-trivial auto-isomorphism structure.
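The node-similarity idea common to these algorithms can be sketched with a generic similarity-propagation iteration: two nodes score as similar when their in- and out-neighbourhoods are similar. The pure-Python sketch below is not any one of the cited algorithms, only a minimal instance of the pattern the framework captures:

```python
def node_similarity(adj_a, adj_b, iters=50):
    """Iterative node-similarity scores between two digraphs (0/1 adjacency
    matrices as lists of lists): s[i][j] grows when the in- and
    out-neighbourhoods of node i in A and node j in B are similar."""
    na, nb = len(adj_a), len(adj_b)
    s = [[1.0] * nb for _ in range(na)]
    for _ in range(iters):
        t = [[0.0] * nb for _ in range(na)]
        for i in range(na):
            for j in range(nb):
                acc = 0.0
                for ii in range(na):
                    for jj in range(nb):
                        if adj_a[i][ii] and adj_b[j][jj]:  # matched successors
                            acc += s[ii][jj]
                        if adj_a[ii][i] and adj_b[jj][j]:  # matched predecessors
                            acc += s[ii][jj]
                t[i][j] = acc
        norm = max(max(row) for row in t) or 1.0  # keep scores bounded
        s = [[v / norm for v in row] for row in t]
    return s

path = [[0, 1, 0], [0, 0, 1], [0, 0, 0]]  # the path 0 -> 1 -> 2
sim = node_similarity(path, path)
match = [max(range(3), key=lambda j: sim[i][j]) for i in range(3)]
print(match)  # [0, 1, 2] - each node is most similar to its counterpart
```

Note how the failure mode the abstract points out shows up here: on a graph with a non-trivial automorphism (e.g. an undirected cycle), several nodes receive identical score rows and the matching becomes ambiguous.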
Abstract:
This paper examines different ways of measuring similarity between software design models for Case-Based Reasoning (CBR) to facilitate reuse of software design and code. The paper considers structural and behavioural aspects of similarity between software design models. Similarity metrics for comparing static class structures are defined and discussed. A graph representation of UML class diagrams and corresponding similarity measures for UML class diagrams are defined. A full-search graph matching algorithm for measuring structural similarity between diagrams, based on identification of the Maximum Common Sub-graph (MCS), is presented. Finally, a simple evaluation of the approach is presented and discussed.
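MCS-based structural similarity can be sketched by brute force for the small graphs that class diagrams typically yield: find the largest common induced subgraph and normalise its size by the larger graph. The graph encoding and normalisation choice below are illustrative assumptions, not the paper's definitions:

```python
from itertools import combinations, permutations

def mcs_size(g1, g2):
    """Node count of a maximum common induced subgraph of two undirected
    graphs, each given as (node-tuple, edge-set). Exponential brute force,
    acceptable only for small class-structure graphs."""
    (n1, e1), (n2, e2) = g1, g2
    def linked(e, a, b):
        return (a, b) in e or (b, a) in e
    for k in range(min(len(n1), len(n2)), 0, -1):
        for sub in combinations(n1, k):
            for img in permutations(n2, k):
                m = dict(zip(sub, img))
                if all(linked(e1, a, b) == linked(e2, m[a], m[b])
                       for a, b in combinations(sub, 2)):
                    return k  # first hit at the largest k is maximal
    return 0

# Two small "class diagrams" as association graphs (illustrative)
g1 = (("A", "B", "C"), {("A", "B"), ("B", "C")})
g2 = (("X", "Y", "Z", "W"), {("X", "Y"), ("Y", "Z"), ("Z", "W")})
similarity = mcs_size(g1, g2) / max(len(g1[0]), len(g2[0]))
print(similarity)  # 0.75 - a 3-node path is common to both graphs
```

Production systems use branch-and-bound or clique-detection formulations of MCS rather than this enumeration, since MCS is NP-hard.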
Abstract:
In an analysis of President Obama's Acceptance Speech, this article argues that postcolonial theory is now being re-formulated for a global, transnational sensibility.
Abstract:
In terms of a general time theory that addresses time-elements as typed point-based intervals, a formal characterization of time-series and state-sequences is introduced. Within this framework, the subsequence matching problem is tackled by transforming it into a bipartite graph matching problem. A hybrid similarity model with high tolerance of inversion, crossover and noise is then proposed for matching the corresponding bipartite graphs, involving both temporal and non-temporal measurements. Experimental results on reconstructed time-series data from the UCI KDD Archive demonstrate that this approach is more effective than traditional similarity-model-based algorithms, promising robust techniques for larger time-series databases and real-life applications such as Content-based Video Retrieval (CBVR).
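A hybrid similarity over a bipartite graph can be sketched as follows: each pair of states gets a weight mixing temporal and non-temporal closeness, and sequence similarity is the weight of a maximum-weight matching. The closeness functions, weights and brute-force matching below are illustrative stand-ins for the paper's model:

```python
from itertools import permutations

def state_sim(a, b, w_time=0.5, w_value=0.5):
    """Similarity of two (time, value) states; each term lies in (0, 1]."""
    temporal = 1.0 / (1.0 + abs(a[0] - b[0]))
    non_temporal = 1.0 / (1.0 + abs(a[1] - b[1]))
    return w_time * temporal + w_value * non_temporal

def sequence_sim(seq_a, seq_b):
    """Similarity of two state-sequences via maximum-weight bipartite
    matching, brute-forced over permutations (fine for short subsequences)."""
    if len(seq_a) > len(seq_b):
        seq_a, seq_b = seq_b, seq_a
    best = max(
        sum(state_sim(a, b) for a, b in zip(seq_a, perm))
        for perm in permutations(seq_b, len(seq_a))
    )
    return best / len(seq_a)  # normalize to (0, 1]

s1 = [(0, 1.0), (1, 2.0), (2, 3.0)]
print(sequence_sim(s1, s1))  # 1.0 - a sequence matches itself perfectly
```

Because the matching is over pairs rather than aligned positions, inversions and crossovers in the sequence only reduce the temporal term instead of breaking the comparison outright; at scale one would replace the permutation search with the Hungarian algorithm.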
Abstract:
Based on a literature review, this paper presents a framework of dimensions and indicators highlighting the underpinning aspects and values of social learning within teacher groups. Notions of social networks, communities of practice and learning teams were taken as the main perspectives informing this social learning framework. The review resulted in four dimensions: (1) practice, (2) domain and value creation, (3) collective identity and (4) organization. The indicators corresponding to these dimensions serve as the foundation for understanding social learning in practice. The framework of dimensions and indicators can assist researchers as well as teacher groups that aim to assess their views on social learning and analyse whether these views fit the learning goals of the group or whether adjustments are required. In this way, learning processes within groups of teachers can be improved.
Abstract:
We examined the taxonomic resolution of zooplankton data required to identify ocean basin scale biogeographic zonation in the Southern Ocean. A 2,154 km transect was completed south of Australia. Sea surface temperature (SST) measured at 1 min intervals showed that seven physical zones were sampled. Zooplankton were collected at a spatial resolution of ~9.2 km with a continuous plankton recorder, identified to the highest possible taxonomic resolution and enumerated. Zooplankton assemblage similarity between samples was calculated using the Bray-Curtis index for the taxonomic levels of species, genus, family, order and class after first log10(x + 1) (LA) and then presence/absence (PA) transformation of abundance data. Although within and between zone sample similarity increased with decreasing taxonomic resolution, for both data transformations, cluster analysis demonstrated that the biogeographic separation of zones remained at all taxonomic levels when using LA data. ANOSIM confirmed this, detecting significant differences in zooplankton assemblage structure between all seven a priori determined physical zones for all taxonomic levels when using the LA data. In the case of the PA data for the complete data set, and both LA and PA data for a crustacean only data set, no significant differences were detected between zooplankton assemblages in the Polar frontal zone (PFZ) and inter-PFZ at any taxonomic level. Loss of information at resolutions below the species level, particularly in the PA data, prevented the separation of some zones. However, the majority of physical zones were biogeographically distinct from species level to class using both LA and PA transformations. Significant relationships between SST and zooplankton community structure, summarised as NMDS scores, at all taxonomic levels, for both LA and PA transformations, and complete and crustacean only data sets, highlighted the biogeographic relevance of low resolution taxonomic data.
The retention of biogeographic information in low taxonomic resolution data shows that data sets collected with different taxonomic resolutions may be meaningfully merged for the post hoc generation of Southern Ocean time series.
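The two abundance transformations and the Bray-Curtis comparison can be sketched directly (the abundance vectors are invented for illustration; the formulas are the standard ones named in the abstract):

```python
import math

def bray_curtis_sim(x, y):
    """Bray-Curtis similarity (1 minus the dissimilarity) of two
    non-negative abundance vectors over the same taxa."""
    num = sum(min(a, b) for a, b in zip(x, y))
    den = sum(x) + sum(y)
    return 2.0 * num / den if den else 1.0

def la_transform(x):
    """log10(x + 1) abundance transformation (LA)."""
    return [math.log10(v + 1.0) for v in x]

def pa_transform(x):
    """Presence/absence transformation (PA)."""
    return [1.0 if v > 0 else 0.0 for v in x]

a = [120, 0, 3, 45]  # counts per taxon at two stations (illustrative)
b = [80, 5, 0, 60]
print(round(bray_curtis_sim(la_transform(a), la_transform(b)), 3))
print(round(bray_curtis_sim(pa_transform(a), pa_transform(b)), 3))  # 0.667
```

The LA transform damps the dominance of very abundant taxa while retaining abundance information; the PA transform discards it entirely, which is why the abstract finds PA data less able to separate some zones.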
Abstract:
On Aeonio-Euphorbion canariensis Sundíng 1972, the correct alliance name as against Kleinio-Euphorbion canariense Rivas Goday & Esteve 1965.