977 results for approximate KNN query
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. Since indoor radon is a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but was not proved to be the sole determinant for spatial modeling. The statistical analysis of the data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. Existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase was accompanied by the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed more robustly. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation, whereas support vector machines (SVM) performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient indoor radon decision making.
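A minimal sketch of the classical quadrat-based Morisita index may help fix ideas: points are binned into quadrats, and the index compares the observed co-occurrence of points with that expected under randomness. This assumes square quadrats on a 2-D pattern; the QMI variant used in the thesis, evaluated as a function of the radon level, may differ in detail.

```python
import numpy as np

def morisita_index(points, n_cells):
    """Classical Morisita index over an n_cells x n_cells grid of quadrats.

    I > 1 suggests clustering, I ~ 1 a random pattern and I < 1 a
    regular (dispersed) pattern.
    """
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=n_cells)
    n = counts.ravel()
    N, Q = n.sum(), n.size
    return Q * np.sum(n * (n - 1)) / (N * (N - 1))

# Toy check: a clustered pattern scores well above a uniform one.
rng = np.random.default_rng(0)
uniform = rng.uniform(0, 1, size=(500, 2))
clustered = rng.normal(0.5, 0.05, size=(500, 2))
print(morisita_index(uniform, 10), morisita_index(clustered, 10))
```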
Abstract:
The Phase I research, Iowa Department of Transportation (IDOT) Project HR-214, "Feasibility Study of Strengthening Existing Single Span Steel Beam Concrete Deck Bridges," verified that post-tensioning can be used to provide strengthening of the composite bridges under investigation. Phase II research, reported here, involved the strengthening of two full-scale prototype bridges: one a prototype of the model bridge tested during Phase I, and the other larger and skewed. In addition to the field work, Phase II also involved a considerable amount of laboratory work. A literature search revealed that only minimal data existed on the angle-plus-bar shear connectors. Thus, several specimens utilizing angle-plus-bar, as well as channels, studs and high-strength bolts as shear connectors were fabricated and tested. To obtain additional shear connector information, the bridge model of Phase I was sawed into four composite concrete slab and steel beam specimens. Two of the resulting specimens were tested with the original shear connection, while the other two specimens had additional shear connectors added before testing. Although orthotropic plate theory was shown in Phase I to predict vertical load distribution in bridge decks and to predict approximate distribution of post-tensioning for right-angle bridges, it was questioned whether the theory could also be used on skewed bridges. Thus, a small Plexiglas model was constructed and used in vertical load distribution tests and post-tensioning force distribution tests for verification of the theory. Conclusions of this research are as follows: (1) The capacity of existing shear connectors must be checked as part of a bridge strengthening program. Determination of the concrete deck strength in advance of bridge strengthening is also recommended. (2) The ultimate capacity of angle-plus-bar shear connectors can be computed on the basis of a modified AASHTO channel connector formula and an angle-to-beam weld capacity check. (3) Existing shear connector capacity can be augmented by means of double-nut high-strength bolt connectors. (4) Post-tensioning did not significantly affect truck load distribution for right-angle or skewed bridges. (5) Approximate post-tensioning and truck load distribution for actual bridges can be predicted by orthotropic plate theory for vertical load; however, the agreement between actual distribution and theoretical distribution is not as close as that measured for the laboratory model in Phase I. (6) The right-angle bridge exhibited considerable end restraint at what would be assumed to be simple supports. The construction details at the bridge abutments seem to be the reason for the restraint. (7) The skewed bridge exhibited more end restraint than the right-angle bridge. Both skew effects and construction details at the abutments accounted for the restraint. (8) End restraint in the right-angle and skewed bridges reduced tension strains in the steel bridge beams due to truck loading, but also reduced the compression strains caused by post-tensioning.
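For orientation on conclusion (2): the channel-connector strength formula in older AASHTO specifications is commonly given as Su = 550(h + t/2)W√f'c (pounds, with dimensions in inches and f'c in psi). The abstract does not detail the modification for angle-plus-bar connectors, so the sketch below implements only the classical channel form, with hypothetical inputs.

```python
import math

def aashto_channel_capacity(h_in, t_in, w_in, fc_psi):
    """Ultimate strength (lb) of a channel shear connector, classical
    AASHTO form: Su = 550 * (h + t/2) * W * sqrt(f'c).

    h: average flange thickness (in), t: web thickness (in),
    W: channel length (in), f'c: concrete strength (psi).
    """
    return 550.0 * (h_in + t_in / 2.0) * w_in * math.sqrt(fc_psi)

# Hypothetical connector: 6 in channel, h = 0.30 in, t = 0.20 in, f'c = 4000 psi.
print(f"Su = {aashto_channel_capacity(0.30, 0.20, 6.0, 4000):,.0f} lb")
```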
Abstract:
The purpose of this research is to assess the vulnerabilities of a high-resolution fingerprint sensor when confronted with fake fingerprints. The study did not focus on the decision outcome of the biometric device, but essentially on the scores obtained from the comparison between a query (genuine or fake) and a template using an AFIS system. To do this, fake fingerprints of 12 subjects were produced with and without their cooperation. These fake fingerprints were used alongside real fingers. The study led to three major observations. First, genuine fingerprints produced higher scores than fake fingers (indicating a closer proximity), and this tendency is observed for each subject considered separately. Second, scores are nevertheless not sufficient as a single measure to differentiate these samples (fake from genuine), given the variation due to the donors themselves; this explains why fingerprint readers without vitality detection can be fooled. Third, production methods and subjects greatly influence the scores obtained for fake fingerprints.
Abstract:
Radioiodinated recombinant human interferon-gamma (IFN gamma) bound to human monocytes, U937, and HL60 cells in a specific, saturable, and reversible manner. At 4°C, the different cell types bound 3,000–7,000 molecules of IFN gamma, and binding was of comparable affinity (Ka = 4–12 × 10^8 M^-1). No change in the receptor was observed after monocytes differentiated to macrophages or when the cell lines were pharmacologically induced to differentiate. The functional relevance of the receptor was validated by the demonstration that receptor occupancy correlated with induction of Fc receptors on U937. Binding studies using U937 permeabilized with digitonin showed that only 46% of the total receptor pool was expressed at the cell surface. The receptor appears to be a protein, since treatment of U937 with trypsin or pronase reduced 125I-IFN gamma binding by 87 and 95%, respectively. At 37°C, ligand was internalized, since 32% of the cell-associated IFN gamma became resistant to trypsin stripping. Monocytes degraded 125I-IFN gamma into trichloroacetic acid-soluble counts at 37°C but not at 4°C, at an approximate rate of 5,000 molecules/cell per h. The receptor was partially characterized by SDS-polyacrylamide gel electrophoresis analysis of purified U937 membranes that had been incubated with 125I-IFN gamma. After cross-linking, the receptor-ligand complex migrated as a broad band that displayed an Mr of 104,000 ± 18,000 at the top and 84,000 ± 6,000 at the bottom. These results thereby define and partially characterize the IFN gamma receptor of human mononuclear phagocytes.
Abstract:
The multiscale finite-volume (MSFV) method has been derived to efficiently solve large problems with spatially varying coefficients. The fine-scale problem is subdivided into local problems that can be solved separately and are coupled by a global problem. This algorithm, in consequence, shares some characteristics with two-level domain decomposition (DD) methods. However, the MSFV algorithm is different in that it incorporates a flux reconstruction step, which delivers a fine-scale mass conservative flux field without the need for iterating. This is achieved by the use of two overlapping coarse grids. The recently introduced correction function allows for a consistent handling of source terms, which makes the MSFV method a flexible algorithm that is applicable to a wide spectrum of problems. It is demonstrated that the MSFV operator, used to compute an approximate pressure solution, can be equivalently constructed by writing the Schur complement with a tangential approximation of a single-cell overlapping grid and incorporation of appropriate coarse-scale mass-balance equations.
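The Schur-complement construction mentioned in the last sentence can be illustrated with dense linear algebra. The sketch below eliminates interior unknowns exactly; the MSFV operator instead replaces the exact interior solve with localized problems on two overlapping coarse grids (the tangential approximation), so this is a conceptual analogue rather than the MSFV algorithm itself.

```python
import numpy as np

def schur_solve(A, b, interior, interface):
    """Solve A x = b by eliminating interior unknowns via the
    Schur complement S = A_BB - A_BI A_II^{-1} A_IB."""
    II = np.ix_(interior, interior)
    IB = np.ix_(interior, interface)
    BI = np.ix_(interface, interior)
    BB = np.ix_(interface, interface)
    A_II_inv_AIB = np.linalg.solve(A[II], A[IB])
    A_II_inv_bI = np.linalg.solve(A[II], b[interior])
    S = A[BB] - A[BI] @ A_II_inv_AIB                      # Schur complement
    xB = np.linalg.solve(S, b[interface] - A[BI] @ A_II_inv_bI)
    xI = A_II_inv_bI - A_II_inv_AIB @ xB                  # back-substitution
    x = np.empty_like(b)
    x[interior], x[interface] = xI, xB
    return x

# 1-D Poisson toy problem with one interface node in the middle.
n = 7
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = schur_solve(A, b, interior=[0, 1, 2, 4, 5, 6], interface=[3])
assert np.allclose(x, np.linalg.solve(A, b))
```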
Abstract:
The results of Ar-40/Ar-39 dating, integrated with calcareous plankton biostratigraphical data, performed on two volcaniclastic layers (VLs) interbedded in Burdigalian to Lower Langhian outer-shelf carbonate sediments cropping out in Monferrato (NW Italy) are presented. The investigated VLs, named Villadeati and Varengo, are thick sedimentary bodies with scarce lateral continuity. They are composed of predominantly volcanogenic material (about 87 to 90% by volume) consisting of glass shards and volcanic phenocrysts (plagioclase, biotite, quartz, amphibole, sanidine and magnetite) and minor extrabasinal and intrabasinal components. On the basis of their composition and sedimentological features, the VLs have been interpreted as distal shelf turbidites deposited below storm wave base. However, compositional characteristics give evidence of the rapid resedimentation of the volcanic detritus after its primary deposition, and hence the VL sediments can be considered penecontemporaneous with the encasing deposits. Biostratigraphical analyses were carried out on the basis of a quantitative study of calcareous nannofossil and planktonic foraminifer associations, whilst Ar-40/Ar-39 dating was performed on biotite at Villadeati and on hornblende at Varengo. The data from the Villadeati section permitted an estimate of 18.7 ± 0.1 Ma for the age of the last common occurrence (LCO) of Sphenolithus belemnos, whereas those from Varengo allowed the extrapolation of an age of 16.4 ± 0.1 Ma for the first occurrence (FO) of Praeorbulina sicana. This latter bioevent is commonly used to approximate the base of the Langhian stage, which corresponds to the Early-Middle Miocene boundary.
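The age of 18.7 Ma for the S. belemnos LCO is the kind of estimate usually obtained by assuming a constant sedimentation rate between dated horizons. The abstract does not spell out the age model, so the sketch below, with hypothetical stratigraphic heights and tie-point ages, only illustrates that standard approach.

```python
def event_age(h_event, h1, age1, h2, age2):
    """Age (Ma) of a bioevent horizon under a constant-sedimentation-rate
    model, by linear interpolation/extrapolation between two dated
    tie points (height in m, age in Ma)."""
    rate = (age2 - age1) / (h2 - h1)   # Ma per metre (negative upward)
    return age1 + rate * (h_event - h1)

# Hypothetical: tie points at 10 m (18.9 Ma) and 30 m (18.5 Ma),
# bioevent observed at 20 m -> 18.7 Ma.
print(event_age(20.0, 10.0, 18.9, 30.0, 18.5))
```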
Abstract:
OBJECTIVES: To evaluate the combination of ultrasound (US) + fine-needle aspiration (FNA) in the assessment of salivary gland tumours in the hands of the otolaryngologist. DESIGN: A retrospective review of case notes was performed. SETTING: Two university teaching hospitals in Switzerland. PARTICIPANTS: One hundred and three patients with a total of 106 focal masses of the salivary glands were included. Clinician-operated US + FNA was the first line of investigation for these lesions. All patients underwent surgical excision of the lesion, which allowed for confirmation of diagnosis by histopathology in 104 lesions and by laboratory testing in two lesions. MAIN OUTCOME MEASURES: Primary: diagnostic accuracy in identifying true salivary gland neoplasms and detecting malignancy. Secondary: predicting an approximate and a specific diagnosis in these tumours. RESULTS: The combination of US + FNA achieved a diagnostic accuracy of 99% in identifying and differentiating true salivary gland neoplasms from tumour-like lesions. In detecting malignancy, this combination permitted an accuracy of 98%. An approximate diagnosis was possible in 89%, and a specific diagnosis in 69%, of our patients. CONCLUSIONS: Owing to economic factors and a high diagnostic accuracy, the combination of US + FNA represents the investigation method of choice for most salivary gland tumours. We suggest that the otolaryngologist be employed in carrying out these procedures, as is already the rule in other medical specialties, while computed tomography and magnetic resonance imaging should be reserved for those few lesions that cannot be delineated completely by sonography.
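The accuracy figures above follow from a standard 2x2 contingency table. A minimal sketch, using hypothetical counts rather than the study's actual table:

```python
def diagnostic_metrics(tp, tn, fp, fn):
    """Standard test summaries from a 2x2 table of true/false
    positives and negatives."""
    total = tp + tn + fp + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

# Hypothetical split of the 106 lesions (not the published data):
print(diagnostic_metrics(tp=20, tn=84, fp=1, fn=1))   # accuracy ~ 0.98
```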
Abstract:
Normal and abnormal brains can be segmented by registering the target image with an atlas. Here, an atlas is defined as the combination of an intensity image (template) and its segmented image (the atlas labels). After registering the atlas template and the target image, the atlas labels are propagated to the target image. We define this process as atlas-based segmentation. In recent years, researchers have investigated registration algorithms to match atlases to query subjects and also strategies for atlas construction. In this paper we present a review of the automated approaches for atlas-based segmentation of magnetic resonance brain images. We aim to point out the strengths and weaknesses of atlas-based methods and suggest new research directions. We use two different criteria to present the methods. First, we refer to the algorithms according to their atlas-based strategy: label propagation, multi-atlas methods, and probabilistic techniques. Subsequently, we classify the methods according to their medical target: the brain and its internal structures, tissue segmentation in healthy subjects, tissue segmentation in fetuses, neonates and elderly subjects, and segmentation of damaged brains. A quantitative comparison of the results reported in the literature is also presented.
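Among the strategies reviewed, the simplest multi-atlas rule is per-voxel majority voting over the propagated label maps. A minimal sketch, assuming the atlas-to-target registrations have already been computed with an external tool:

```python
import numpy as np

def majority_vote_fusion(propagated_labels):
    """Fuse label maps propagated from several registered atlases.

    propagated_labels: int array of shape (n_atlases, *image_shape).
    Each voxel receives the most frequent label across atlases.
    """
    labels = np.asarray(propagated_labels)
    n_classes = labels.max() + 1
    votes = np.stack([(labels == c).sum(axis=0) for c in range(n_classes)])
    return votes.argmax(axis=0)

# Three toy 2x2 "segmentations" from three registered atlases.
atlases = np.array([[[0, 1], [1, 2]],
                    [[0, 1], [2, 2]],
                    [[0, 0], [1, 2]]])
print(majority_vote_fusion(atlases))   # -> [[0 1] [1 2]]
```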
Abstract:
The initial question of this research is whether local heritage is a resource that facilitates children's learning of the social sciences. To answer this question, the relationship between local heritage and the interest it generates was studied. To do so, a set of items was specified and analyzed through a case study. The participants in this research were a group of primary school pupils, who carried out a didactic proposal in which local heritage was used as the backbone. Teachers from schools in Sant Celoni also took part in this research, describing the experiences they had with their pupils in activities involving local heritage. Finally, the information provided by all participants was analyzed and conclusions were drawn.
Abstract:
The development of shear instabilities of a wave-driven alongshore current is investigated. In particular, we use weakly nonlinear theory to investigate the possibility that such instabilities, which have been observed at various sites on the U.S. coast and in the laboratory, can grow in linearly stable flows as a subcritical bifurcation by resonant triad interaction, as first suggested by Shrira et al. [1997]. We examine a realistic longshore current profile and include the effects of eddy viscosity and bottom friction. We show that according to the weakly nonlinear theory, resonance is possible and that these linearly stable flows may exhibit explosive instabilities. We show that this phenomenon may occur also when there is only approximate resonance, which is more likely in nature. Furthermore, the size of the perturbation that is required to trigger the instability is shown in some circumstances to be consistent with the size of naturally occurring perturbations. Finally, we consider the differences between the present case examined and the more idealized case of Shrira et al. [1997]. It is shown that there is a possibility of coupling between triads, due to the richer modal structure in more realistic flows, which may stabilize the flow and work against the development of subcritical bifurcations. Extensive numerical tests are called for.
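The weakly nonlinear framework referred to leads to coupled amplitude equations for the triad members, with a detuning term when the resonance is only approximate. The coefficients below are generic placeholders, not the values derived in the paper:

```latex
\begin{aligned}
\frac{dA_1}{dt} &= -\gamma_1 A_1 + s_1 \,\bar{A}_2 \bar{A}_3 \, e^{i\Delta\omega t},\\
\frac{dA_2}{dt} &= -\gamma_2 A_2 + s_2 \,\bar{A}_3 \bar{A}_1 \, e^{i\Delta\omega t},\\
\frac{dA_3}{dt} &= -\gamma_3 A_3 + s_3 \,\bar{A}_1 \bar{A}_2 \, e^{i\Delta\omega t},
\end{aligned}
\qquad
k_1 + k_2 + k_3 = 0,\quad
\omega_1 + \omega_2 + \omega_3 = \Delta\omega .
```

Explosive (finite-time) growth, and hence a subcritical bifurcation, becomes possible when the signs of the interaction coefficients s_i permit mutual amplification and the initial amplitudes exceed a threshold set by the damping rates γ_i.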
Abstract:
The main goal of CleanEx is to provide access to public gene expression data via unique gene names. A second objective is to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-data set comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used. These targets are then mapped at regular intervals to the growing and evolving catalogues of human genes and genes from model organisms. The completely automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resource, such as cDNA clones or Affymetrix probe sets. The web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria. CleanEx is accessible at: http://www.cleanex.isb-sib.ch/.
Abstract:
This paper highlights the role of non-functional information when reusing from a component library. We describe a method for selecting appropriate implementations of Ada packages taking non-functional constraints into account; these constraints model the context of reuse. Constraints take the form of queries using an interface description language called NoFun, which is also used to state non-functional information in Ada packages; query results are trees of implementations, following the import relationships between components. We define two different situations when reusing components, depending on whether we take the library being searched as closed or extendible. The resulting tree of implementations can be manipulated by the user to solve ambiguities, to state default behaviours, and the like. As part of the proposal, we face the problem of computing from code the non-functional information that determines the selection process.
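NoFun's concrete syntax is not shown in the abstract, so the following Python stand-in (all names hypothetical) only illustrates the selection idea: implementations annotated with non-functional properties are filtered by predicates that model the reuse context.

```python
from dataclasses import dataclass, field

@dataclass
class Implementation:
    """An implementation of a component, annotated with the kind of
    non-functional facts a NoFun description would carry."""
    name: str
    properties: dict                               # e.g. {"space_kb": 64}
    imports: list = field(default_factory=list)    # imported components

def select(candidates, constraints):
    """Keep implementations whose properties satisfy every constraint;
    constraints map a property name to a predicate (a stand-in for a
    NoFun query)."""
    return [impl for impl in candidates
            if all(p in impl.properties and ok(impl.properties[p])
                   for p, ok in constraints.items())]

# Hypothetical implementations of a "Set" Ada package.
sets = [
    Implementation("Set.HashImpl", {"time": "O(1)", "space_kb": 256}),
    Implementation("Set.TreeImpl", {"time": "O(log n)", "space_kb": 64}),
]
# Reuse context with a tight memory budget.
print([i.name for i in select(sets, {"space_kb": lambda v: v <= 128})])
```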
Abstract:
When individuals learn by trial-and-error, they perform randomly chosen actions and then reinforce those actions that led to a high payoff. However, individuals do not always have to physically perform an action in order to evaluate its consequences. Rather, they may be able to mentally simulate actions and their consequences without actually performing them. Such fictitious learners can select actions with high payoffs without making long chains of trial-and-error learning. Here, we analyze the evolution of an n-dimensional cultural trait (or artifact) by learning, in a payoff landscape with a single optimum. We derive the stochastic learning dynamics of the distance to the optimum in trait space when choice between alternative artifacts follows the standard logit choice rule. We show that for both trial-and-error and fictitious learners, the learning dynamics stabilize at an approximate distance of √n/(2λ_e) away from the optimum, where λ_e is an effective learning-performance parameter that depends on the learning rule under scrutiny. Individual learners are thus unlikely to reach the optimum when traits are complex (n large), and so face a barrier to further improvement of the artifact. We show, however, that this barrier can be significantly reduced in a large population of learners performing payoff-biased social learning, in which case λ_e becomes proportional to population size. Overall, our results illustrate the effects of errors in learning, levels of cognition, and population size for the evolution of complex cultural traits.
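A small simulation makes the scaling concrete. The sketch below implements a learner that compares its current artifact with a perturbed candidate under the two-alternative logit (softmax) rule; lam is a stand-in for the paper's effective parameter λ_e, and the single-optimum payoff landscape is an assumed choice.

```python
import numpy as np

rng = np.random.default_rng(1)

def learn(n=10, lam=5.0, steps=20000, step_size=0.05):
    """Learner improving an n-dimensional artifact toward the optimum
    at the origin, choosing between current and candidate artifacts
    with logit (softmax) probability."""
    x = rng.normal(0, 1, n)                      # start away from optimum
    for _ in range(steps):
        y = x + rng.normal(0, step_size, n)      # candidate artifact
        d_payoff = -np.linalg.norm(y) + np.linalg.norm(x)
        p_y = 1 / (1 + np.exp(-lam * d_payoff))  # logit choice rule
        if rng.random() < p_y:
            x = y
    return np.linalg.norm(x)                     # distance to optimum

# The equilibrium distance grows with trait dimension n, qualitatively
# in line with the sqrt(n)/(2*lambda_e) result derived in the paper.
print(learn(n=4), learn(n=16))
```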