997 results for basis-set
Abstract:
We propose a compressive sensing algorithm that exploits geometric properties of images to recover high-quality images from few measurements. The image reconstruction iterates the following two steps: 1) estimation of the normal vectors of the image level curves, and 2) reconstruction of an image fitting the normal vectors, the compressed sensing measurements, and the sparsity constraint. The proposed technique extends naturally to nonlocal operators and graphs, exploiting the repetitive nature of textured images to recover fine detail structures. In both cases, the problem is reduced to a series of convex minimization problems that can be efficiently solved with a combination of variable splitting and augmented Lagrangian methods, leading to fast and easy-to-code algorithms. Extensive experiments show a clear improvement over related state-of-the-art algorithms in the quality of the reconstructed images and in the robustness of the proposed method to noise, different kinds of images, and reduced measurements.
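A minimal sketch of the two-step iteration, under stated assumptions: `A`/`At` are hypothetical stand-ins for the sensing operator and its adjoint, and plain gradient descent replaces the variable-splitting/augmented Lagrangian solver the abstract describes.

```python
import numpy as np

def grad(u):
    # forward differences with periodic boundaries
    return np.roll(u, -1, 0) - u, np.roll(u, -1, 1) - u

def div(px, py):
    # discrete divergence, the negative adjoint of grad
    return (px - np.roll(px, 1, 0)) + (py - np.roll(py, 1, 1))

def reconstruct(A, At, y, shape, outer=20, inner=200, lam=0.1, tau=0.1, eps=1e-8):
    x = At(y).reshape(shape)
    for _ in range(outer):
        # Step 1: estimate unit normals of the level curves of the current image.
        gx, gy = grad(x)
        mag = np.hypot(gx, gy) + eps
        nx, ny = gx / mag, gy / mag
        # Step 2: fit an image to the normals and the measurements by descending
        # 0.5*||Ax - y||^2 + lam * sum(|grad x| - grad x . n); the second term
        # vanishes exactly where the image gradients align with the normals.
        for _ in range(inner):
            gx, gy = grad(x)
            m = np.hypot(gx, gy) + eps
            data_grad = At(A(x.ravel()) - y).reshape(shape)
            reg_grad = -div(gx / m - nx, gy / m - ny)
            x -= tau * (data_grad + lam * reg_grad)
    return x

# Toy usage: 4x undersampled Gaussian measurements of a 32x32 blocky image.
rng = np.random.default_rng(0)
n = 32 * 32
M = rng.standard_normal((n // 4, n)) / np.sqrt(n)
A, At = (lambda v: M @ v), (lambda v: M.T @ v)
truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0
x_hat = reconstruct(A, At, A(truth.ravel()), truth.shape)
```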
Abstract:
This final report establishes an evaluation system for the State of Iowa Merit Employment System classifications on the basis of comparable worth. Included in the report are summaries of the project's objectives, methods, analyses, findings, and recommendations.
Abstract:
This is the statistical supplement to the final report of the study to establish an evaluation system for State of Iowa Merit Employment System classifications on the basis of comparable worth.
Abstract:
The so-called "enchondromatoses" are skeletal disorders defined by the presence of ectopic cartilaginous tissue within bone tissue. The clinical and radiographic features of the different enchondromatoses are distinct, and grouping them reflects not a common pathogenesis but simply a similar radiographic appearance and thus the need for a differential diagnosis. Recent advances in the understanding of their molecular and cellular bases confirm the heterogeneous nature of the different enchondromatoses. Some, like Ollier disease, Maffucci disease, metaphyseal chondromatosis with hydroxyglutaric aciduria, and metachondromatosis, are produced by a dysregulation of chondrocyte proliferation, while others (such as spondyloenchondrodysplasia or dysspondyloenchondromatosis) are caused by defects in the structure or metabolism of cartilage or bone matrix. In other forms (e.g., the dominantly inherited genochondromatoses), the basic defect remains to be determined. The classification proposed by Spranger and associates in 1978, and tentatively revised twice, was based on the radiographic appearance, the anatomic sites involved, and the mode of inheritance. The new classification proposed here integrates the molecular genetic advances and delineates phenotypic families based on the molecular defects. Reference radiographs are provided to help in the diagnosis of the well-defined forms. In spite of these advances, many cases remain difficult to diagnose and classify, implying that more variants remain to be defined at both the clinical and molecular levels.
Abstract:
OBJECTIVE: To establish the genetic basis of Landau-Kleffner syndrome (LKS) in a cohort of two discordant monozygotic (MZ) twin pairs and 11 isolated cases. METHODS: We used a multifaceted approach to identify genetic risk factors for LKS. Array comparative genomic hybridization (CGH) was performed using the Agilent 180K array. Whole-genome methylation profiling was undertaken in the two discordant twin pairs, three isolated LKS cases, and 12 control samples using the Illumina 27K array. Exome sequencing was undertaken in 13 patients with LKS, including the two sets of discordant MZ twins. Data were analyzed with respect to novel and rare variants, overlapping genes, variants in reported epilepsy genes, and pathway enrichment. RESULTS: A variant (c.1553G>A) was found in a single patient in the GRIN2A gene, causing an arginine-to-histidine change at residue 518, a predicted glutamate binding site. Following copy number variation (CNV), methylation, and exome sequencing analysis, no single candidate gene was identified to cause LKS in the remaining cohort. However, a number of interesting additional candidate variants were identified, including variants in RELN, BSN, EPHB2, and NID2. SIGNIFICANCE: A single mutation was identified in the GRIN2A gene. This study has identified a number of additional candidate genes, including RELN, BSN, EPHB2, and NID2. A PowerPoint slide summarizing this article is available for download in the Supporting Information section.
Abstract:
The major objective of this research project was to use thermal analysis techniques in conjunction with X-ray analysis methods to identify and explain chemical reactions that promote aggregate-related deterioration in portland cement concrete. Twenty-two different carbonate aggregate samples were subjected to a chemical testing scheme that included:
• bulk chemistry (major, minor, and selected trace elements)
• bulk mineralogy (minor phases concentrated by acid extraction)
• solid solution in the major carbonate phases
• crystallite size determinations for the major carbonate phases
• a salt treatment study to evaluate the impact of deicer salts
Test results from these studies were then compared to information obtained using thermogravimetric analysis techniques. Since many of the limestones and dolomites used in the study had extensive field service records, it was possible to correlate many of the variables with service life. The results of this study indicate that thermogravimetric analysis can play an important role in categorizing carbonate aggregates; in fact, with modern automated thermal analysis systems it should be possible to utilize such methods on a quality-control basis. Strong correlations were found between several of the variables monitored in this study, and several of the variables exhibited significant correlations to concrete service life. When the full data set was utilized (n = 18), the significant correlations to service life (α = 5% level) can be summarized as follows:
• r = -0.73 for premature TG loss versus service life
• r = 0.74 for relative crystallite size versus service life
• r = 0.53 for ASTM C666 durability factor versus service life
• r = -0.52 for acid-insoluble residue versus service life
Separation of the carbonate aggregates into their mineralogical categories (i.e., calcites and dolomites) tended to increase the correlation coefficients for some specific variables (r sometimes approached 0.90); however, the reliability of such correlations was questionable because of the small number of samples present in this study.
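The r values above are ordinary Pearson correlation coefficients; a minimal illustration of the computation follows (the paired observations are hypothetical, not the study's data):

```python
import numpy as np

# Hypothetical paired observations, for illustration only (not the study's data).
premature_tg_loss = np.array([0.8, 1.2, 0.5, 2.1, 1.7, 0.9])  # e.g. % mass loss
service_life      = np.array([45., 30., 60., 15., 22., 40.])  # years

r = np.corrcoef(premature_tg_loss, service_life)[0, 1]
print(f"r = {r:.2f}")  # strongly negative, matching the sign of the reported r = -0.73
```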
Abstract:
In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used; they aren't oriented towards helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what's available rather than on determining how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools aren't very scalable: a system-level method of analysis seldom works at the project level, and vice versa.

In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded, and in what sequence? Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective?

In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, a good approach appeared to be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) roads are but one sub-system of a much larger 'Road Based Transportation System'; 2) the size and activity level of the overall system are determined by market forces; 3) the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation; and 4) the economic purpose of making road improvements is to minimize that total cost.

To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the system's size, characteristics, activity levels, and the rates at which activities take place; developing a companion economic cost model; and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used at full-system, partial-system, single-segment, project, and general design guide levels of analysis. The method appeared capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
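A minimal sketch of the total-cost comparison the method implies; every cost category, rate, and number below is a hypothetical placeholder, not a value from the Iowa database or spreadsheet model:

```python
def total_system_cost(agency_cost, vehicle_miles, crash_cost,
                      veh_op_cost_per_mile=0.60,  # hypothetical $/mile
                      time_cost_per_mile=0.45):   # hypothetical $/mile
    """Annualized total cost: road agency + vehicle operation + travel time + crashes."""
    user_cost = vehicle_miles * (veh_op_cost_per_mile + time_cost_per_mile)
    return agency_cost + user_cost + crash_cost

# Two hypothetical alternatives for one county road segment.
do_nothing = total_system_cost(agency_cost=20_000, vehicle_miles=900_000, crash_cost=50_000)
pave       = total_system_cost(agency_cost=95_000, vehicle_miles=870_000, crash_cost=35_000)

# Principle 4: pick the alternative that minimizes the total cost of the whole system.
print(min([("do nothing", do_nothing), ("pave", pave)], key=lambda t: t[1]))
```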
Abstract:
Currently, hydraulic cement grouts are approved for Iowa Department of Transportation projects on the basis of a pullout test. However, other properties of the grouts should be evaluated. Therefore, this research was initiated to develop criteria to better evaluate hydraulic cement grouts. Fourteen grouts were tested for compressive strength, time of set, durability, consistency, and shrinkage. All tested grouts yielded compressive strengths higher than 3000 psi at 7 days, and durability factors were well above 70. Time of set and consistency were adequate. The testing showed that most grouts shrank, even though they were labeled non-shrink grouts. For many grout applications, such as setting anchor bolts and use as a filler, minor shrinkage is not a problem. However, for some critical applications, shrinkage cannot be tolerated. The proposed Instructional Memorandum will identify those grouts that do not excessively shrink or expand in the tests used. Based on the test results, criteria for the evaluation of hydraulic cement grouts have been recommended. Evaluation consists of tests for compressive strength, time of set, durability, consistency, shrinkage, and pullout.
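A minimal sketch of the kind of acceptance check the recommended criteria imply. The 3000 psi and durability factor 70 thresholds are stated in the abstract; the shrinkage limit and function name are hypothetical placeholders:

```python
def grout_acceptable(compressive_psi_7day: float, durability_factor: float,
                     shrinkage_pct: float, max_shrinkage_pct: float = 0.05) -> bool:
    # 3000 psi and DF 70 come from the abstract; the shrinkage limit is hypothetical.
    return (compressive_psi_7day >= 3000
            and durability_factor > 70
            and abs(shrinkage_pct) <= max_shrinkage_pct)

print(grout_acceptable(3500, 85, 0.02))  # True: meets all three criteria
```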
Abstract:
Context: Ovarian tumor (OT) typing is a competency expected of pathologists, with significant clinical implications. OTs, however, come in numerous types, some rather rare, with the consequence that some departments have few opportunities for practice. Aim: Our aim was to design a tool for pathologists to train in typing less common OTs. Method and Results: Representative slides of 20 less common OTs were scanned (Hamamatsu® NanoZoomer Digital Pathology), and the diagnostic algorithm proposed by Young and Scully was applied to each case (Young RH and Scully RE, Seminars in Diagnostic Pathology 2001, 18:161-235), to include: recognition of morphological pattern(s); shortlisting of differential diagnoses; and proposition of relevant immunohistochemical markers. The next steps of this project will be: evaluation of the tool in several post-graduate training centers in Europe and Québec; improvement of its design based on the evaluation results; and diffusion to a larger public. Discussion: In clinical medicine, solving many cases is recognized as being of utmost importance for a novice to become an expert. This project relies on virtual slide technology to provide pathologists with a learning tool aimed at increasing their skills in OT typing. After due evaluation, this model might be extended to other uncommon tumors.
Abstract:
In contrast to the low frequency of most single-epitope-reactive T cells in the preimmune repertoire, up to 1 in 1,000 naive CD8(+) T cells from A2(+) individuals specifically bind fluorescent A2/peptide multimers incorporating the A27L analogue of the immunodominant 26-35 peptide from the melanocyte differentiation and melanoma-associated antigen Melan-A. This represents the only naive antigen-specific T cell repertoire accessible to direct analysis in humans to date. To gain insight into the molecular basis for the selection and maintenance of such an abundant repertoire, we analyzed ex vivo, at the clonal level, the functional diversity of the T cells composing this repertoire. Surprisingly, we found a significant proportion of multimer(+) clonotypes that failed to recognize both the Melan-A analogue and parental peptides in a functional assay but efficiently recognized peptides from proteins of self or pathogen origin selected for their potential functional cross-reactivity with Melan-A. Consistent with these data, multimers incorporating some of the most frequently recognized peptides specifically stained a proportion of naive CD8(+) T cells similar to that observed with Melan-A multimers. Altogether, these results indicate that the high frequency of Melan-A multimer(+) T cells can be explained by the existence of largely cross-reactive subsets of naive CD8(+) T cells displaying multiple specificities.
Abstract:
A workshop recently held at the École Polytechnique Fédérale de Lausanne (EPFL, Switzerland) was dedicated to understanding the genetic basis of adaptive change, taking stock of the different approaches developed in theoretical population genetics and landscape genomics and bringing together knowledge accumulated in both research fields. An important challenge in theoretical population genetics is to incorporate the effects of demographic history and population structure, and important design problems (e.g., a focus on populations as units, a focus on hard selective sweeps, and the lack of a hypothesis-based framework in the design of the statistical tests) reduce the capability of these approaches to detect adaptive genetic variation. In parallel, landscape genomics offers a solution to several of these problems and provides a number of advantages (e.g., fast computation, integration of landscape heterogeneity), but the approach makes several implicit assumptions that should be carefully considered (e.g., that selection has had enough time to create a functional relationship between the allele distribution and the environmental variable, or that this functional relationship is constant). To address the respective strengths and weaknesses mentioned above, the workshop brought together a panel of experts from both disciplines to present their work and discuss the relevance of combining these approaches, possibly resulting in a joint software solution in the future.
Abstract:
Is it possible to perfectly simulate a signature, in the particular and challenging case where the signature is simple? A set of signatures from six writers, considered simple on the basis of highlighted criteria, was sampled. These signatures were given to forgers, who were asked to produce freehand simulations. Among these simulations, those that reproduced the features of the reference signatures were submitted for evaluation to forensic document experts through proficiency testing. The results suggest that there is no perfect simulation. With the supplementary aim of assessing the influence of the forger's skill on the results, forgers were selected from three distinct populations differing according to professional criteria. The results indicate some differences in graphical capabilities between individuals; however, no trend could be established regarding age, degrees, years of practice, or time dedicated to the exercise. The findings show that simulation is easier when a graphical compatibility exists between the forger's own writing and the signature to be reproduced. Moreover, a general difficulty in preserving proportions and slant, as well as the shape of capital letters and initials, was noticed.
Abstract:
Purpose: Diagnostic radiology involving ionizing radiation often provides crucial information but also involves risk. Estimated cancer risks associated with CT range between 1 in 1,000 and 1 in 10,000, depending on age and exposure settings. The aim of this contribution is to provide radiologists with a way to inform patients about these risks on both a collective and an individual basis. Materials and methods: After a brief review of the effects of ionizing radiation, conversions from dose indicators into effective dose will be presented for radiography, fluoroscopy, and CT. The Diagnostic Reference Level (DRL) concept will then be introduced to enable the reader to compare the levels of exposure of various examinations. Finally, the limits of the effective dose concept will be explained, and risk projections after various radiological procedures for adults and children will be presented. Results: From an individual standpoint, the benefit of a well-justified and optimized CT examination clearly outweighs its risk of inducing a fatal cancer. The uncertainties associated with the effective dose concept should be kept in mind in order to avoid cancer risk projections after an examination on an individual basis. Conclusion: Risk factors and effective dose are not the simplest tools for communicating radiological risks. Thus, a set of risk categories should be preferred, as proposed in ICRP (International Commission on Radiological Protection) report 99.
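As a rough worked illustration of the conversion chain described above (dose indicator to effective dose to a nominal risk figure): the k-factors below are commonly cited adult coefficients and 5%/Sv is the ICRP nominal risk coefficient; the calculation is meant for collective communication, not for individual risk projection:

```python
# DLP (mGy*cm) -> effective dose (mSv) -> nominal lifetime cancer risk.
# k-factors are commonly cited adult values; 5e-5 per mSv is ~5% per Sv.
K_FACTOR = {"head": 0.0021, "chest": 0.014, "abdomen_pelvis": 0.015}
NOMINAL_RISK_PER_MSV = 5e-5

def ct_risk(region: str, dlp_mgy_cm: float):
    e_msv = K_FACTOR[region] * dlp_mgy_cm        # effective dose in mSv
    return e_msv, e_msv * NOMINAL_RISK_PER_MSV   # (dose, nominal risk)

dose, risk = ct_risk("chest", 400.0)  # hypothetical chest CT, DLP = 400 mGy*cm
print(f"E = {dose:.1f} mSv, nominal risk = 1 in {1 / risk:,.0f}")
```

With these figures, a 400 mGy*cm chest CT gives roughly 5.6 mSv and a nominal risk near 1 in 3,600, inside the 1 in 1,000 to 1 in 10,000 range quoted above.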