196 results for DECOMPOSITIONS


Relevance: 20.00%

Abstract:

This thesis analyzes the Chow motives of three types of smooth projective varieties: the desingularized elliptic self fiber product, the Fano surface of lines on a cubic threefold, and an ample hypersurface of an Abelian variety. For the desingularized elliptic self fiber product, we use an isotypic decomposition of the motive to deduce the Murre conjectures. We also prove a result about the intersection product. For the Fano surface of lines, we prove the finite-dimensionality of the Chow motive. Finally, we prove that an ample hypersurface on an Abelian variety possesses a Chow–Künneth decomposition for which a motivic version of the Lefschetz hyperplane theorem holds.
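For context, a Chow–Künneth decomposition of a smooth projective d-dimensional variety X (the standard notion, stated here for reference with generic notation, not wording from the thesis) is a set of mutually orthogonal idempotent correspondences π_0, …, π_{2d} in CH^d(X × X) with rational coefficients that sum to the diagonal and act on cohomology as the Künneth projectors:

\[
\pi_i \circ \pi_j = \delta_{ij}\,\pi_i, \qquad \sum_{i=0}^{2d} \pi_i = \Delta_X, \qquad (\pi_i)_* H^{*}(X) = H^{i}(X), \qquad h(X) \cong \bigoplus_{i=0}^{2d} (X, \pi_i).
\]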

Relevance: 20.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 20.00%

Abstract:

Developmental gaps between children from different socioeconomic backgrounds emerge early and persist over time. Cognitive skill formation is a cumulative process, so every relevant influence occurring up to the time skill is measured can play a role in shaping these gaps. Linear decompositions based on the Oaxaca-Blinder technique are a fairly common way to estimate the contribution of two or more categories of variables to these differences in cognitive achievement. Two prominent examples of such categories are family and school influences. In this respect, the literature exhibits no consensus on the decomposition strategy or on the interpretation of its components, as well as a tendency to separate home and school influences by assigning all observed household, family and child characteristics to the former category. This can lead to misleading policy implications and to biases in the estimated contributions of the categories. This analysis seeks to contribute to the literature in two ways. First, it formally explores the potential for bias in the decomposition exercises attempted so far. Second, it offers an alternative decomposition strategy consistent with explicit behavioural assumptions regarding the determination of skill inputs. This prevents arbitrary choices in terms of decomposition technique, components and interpretation, and also makes the analysis less prone to bias. I illustrate the main points of the analysis empirically, using a dataset containing longitudinal information on test scores and family and school characteristics to decompose the cognitive skill gap observed at age 8 between urban and rural children in Peru.
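For reference, the standard two-fold Oaxaca-Blinder decomposition of a mean outcome gap (generic notation; not necessarily the author's preferred formulation) is

\[
\bar{Y}_U - \bar{Y}_R = (\bar{X}_U - \bar{X}_R)'\hat{\beta}_U + \bar{X}_R'(\hat{\beta}_U - \hat{\beta}_R),
\]

where \(\bar{Y}_g\) and \(\bar{X}_g\) are group means of the outcome and of the covariates (e.g. family and school inputs) for the urban (U) and rural (R) groups, and \(\hat{\beta}_g\) are group-specific regression coefficients. The first term is the part of the gap explained by covariate differences; the second is the "unexplained" coefficients component, whose attribution the abstract argues is sensitive to how inputs are assigned to categories.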

Relevance: 10.00%

Abstract:

Mainstream business process modelling techniques promote a design paradigm wherein the activities to be performed within a case, together with their usual execution order, form the backbone of a process model, on top of which other aspects are anchored. This paradigm, while effective in standardised and production-oriented domains, shows some limitations when confronted with processes where case-by-case variations and exceptions are the norm. In this thesis we develop the idea that the effective design of flexible process models calls for an alternative modelling paradigm, one in which process models are modularised along key business objects, rather than along activity decompositions. The research follows a design science method, starting from the formulation of a research problem expressed in terms of requirements, and culminating in a set of artifacts that have been devised to satisfy these requirements. The main contributions of the thesis are: (i) a meta-model for object-centric process modelling incorporating constructs for capturing flexible processes; (ii) a transformation from this meta-model to an existing activity-centric process modelling language, namely YAWL, showing the relation between object-centric and activity-centric process modelling approaches; and (iii) a Coloured Petri Net that captures the semantics of the proposed meta-model. The meta-model has been evaluated using a framework consisting of a set of workflow patterns. Moreover, the meta-model has been embodied in a modelling tool that has been used to capture two industrial scenarios.
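To make the contrast concrete, here is a minimal, hypothetical sketch (not the thesis meta-model, YAWL, or the Coloured Petri Net semantics) of what modularising a process along business objects can look like: each object carries its own lifecycle, and inter-object synchronisation constraints replace a single global activity flow. All class and object names below are invented for illustration.

```python
# Hedged illustration of object-centric process modelling (hypothetical structures).
from dataclasses import dataclass, field

@dataclass
class ObjectLifecycle:
    name: str
    states: set[str]
    # (from_state, activity) -> to_state: the object's own behaviour, independent of other objects.
    transitions: dict[tuple[str, str], str]

@dataclass
class Synchronisation:
    """Inter-object constraint: `activity` on `source` requires `target` to be in `required_state`."""
    source: str
    activity: str
    target: str
    required_state: str

@dataclass
class ObjectCentricModel:
    lifecycles: dict[str, ObjectLifecycle] = field(default_factory=dict)
    constraints: list[Synchronisation] = field(default_factory=list)

# Hypothetical example: an Order and an Invoice object, loosely coupled.
order = ObjectLifecycle(
    "Order", {"created", "approved", "shipped"},
    {("created", "approve"): "approved", ("approved", "ship"): "shipped"},
)
invoice = ObjectLifecycle(
    "Invoice", {"drafted", "sent", "paid"},
    {("drafted", "send"): "sent", ("sent", "pay"): "paid"},
)
model = ObjectCentricModel(
    lifecycles={"Order": order, "Invoice": invoice},
    constraints=[Synchronisation("Invoice", "send", "Order", "approved")],
)
```

Case-level variation is then accommodated per object lifecycle rather than by enumerating every permitted activity ordering up front.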

Relevance: 10.00%

Abstract:

Purpose: To ascertain the effectiveness of object-centered three-dimensional representations for the modeling of corneal surfaces. Methods: Three-dimensional (3D) surface decompositions into series of basis functions, including (i) spherical harmonics, (ii) hemispherical harmonics, and (iii) 3D Zernike polynomials, were considered and compared to the traditional viewer-centered representation of the two-dimensional (2D) Zernike polynomial expansion for a range of retrospective videokeratoscopic height data from three clinical groups. The data were collected using the Medmont E300 videokeratoscope. The groups included 10 normal corneas with corneal astigmatism less than −0.75 D, 10 astigmatic corneas with corneal astigmatism between −1.07 D and 3.34 D (Mean = −1.83 D, SD = ±0.75 D), and 10 keratoconic corneas. Only data from the right eyes of the subjects were considered. Results: All object-centered decompositions led to significantly better fits to the corneal surfaces (in terms of RMS error values) than the corresponding 2D Zernike polynomial expansions with the same number of coefficients, for all considered corneal surfaces, corneal diameters (2, 4, 6, and 8 mm), and model orders (4th to 10th radial orders). The best results (smallest RMS fit error) were obtained with the spherical harmonics decomposition, which led to about a 22% reduction in the RMS fit error compared with the traditional 2D Zernike polynomials. Hemispherical harmonics and the 3D Zernike polynomials reduced the RMS fit error by about 15% and 12%, respectively. Larger reductions in RMS fit error were achieved for smaller corneal diameters and lower-order fits. Conclusions: Object-centered 3D decompositions provide viable alternatives to the traditional viewer-centered 2D Zernike polynomial expansion of a corneal surface. They achieve better fits to videokeratoscopic height data and could be particularly suited to the analysis of multiple corneal measurements, where there can be slight variations in the position of the cornea from one map acquisition to the next.
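As a concrete illustration of object-centred fitting (a minimal sketch, not the study's implementation: the sampling grid, maximum degree, and synthetic height data are placeholders), a least-squares spherical-harmonics fit and its RMS error can be computed as follows.

```python
# Hedged sketch: least-squares fit of height data with real spherical harmonics,
# reporting the RMS fit error used as the comparison criterion above.
import numpy as np
from scipy.special import sph_harm

def real_sph_harm(m, n, theta, phi):
    """Real-valued spherical harmonic of degree n, order m (theta: azimuth, phi: polar angle)."""
    if m > 0:
        return np.sqrt(2.0) * sph_harm(m, n, theta, phi).real
    if m < 0:
        return np.sqrt(2.0) * sph_harm(-m, n, theta, phi).imag
    return sph_harm(0, n, theta, phi).real

def fit_spherical_harmonics(theta, phi, height, n_max):
    """Return the coefficient dictionary and RMS fit error of a degree-n_max expansion."""
    columns, labels = [], []
    for n in range(n_max + 1):
        for m in range(-n, n + 1):
            columns.append(real_sph_harm(m, n, theta, phi))
            labels.append((n, m))
    A = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(A, height, rcond=None)
    rms = np.sqrt(np.mean((A @ coeffs - height) ** 2))
    return dict(zip(labels, coeffs)), rms

# Synthetic heights on an anterior cap (placeholder for videokeratoscopic data).
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 2000)   # azimuth
phi = rng.uniform(0.0, np.pi / 3.0, 2000)     # polar angle, anterior cap only
height = 7.8 * np.cos(phi) + 0.01 * rng.standard_normal(2000)

coeffs, rms = fit_spherical_harmonics(theta, phi, height, n_max=6)
print(f"RMS fit error: {rms:.4e}")
```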

Relevance: 10.00%

Abstract:

The main goal of this research is to design an efficient compression algorithm for fingerprint images. The wavelet transform technique is the principal tool used to reduce interpixel redundancies and to obtain a parsimonious representation for these images. A specific fixed decomposition structure is designed to be used by the wavelet packet in order to save on computation, transmission, and storage costs. This decomposition structure is based on an analysis of the information-packing performance of several decompositions, the two-dimensional power spectral density, the effect of each frequency band on the reconstructed image, and human visual sensitivities. This fixed structure is found to provide the "most" suitable representation for fingerprints, according to the chosen criteria. Different compression techniques are used for different subbands, based on their observed statistics. The decision is based on the effect of each subband on the reconstructed image according to the mean square criterion as well as the sensitivities of human vision.

To design an efficient quantization algorithm, a precise model for the distribution of the wavelet coefficients is developed. The model is based on the generalized Gaussian distribution. A least squares algorithm on a nonlinear function of the distribution model shape parameter is formulated to estimate the model parameters. A noise-shaping bit allocation procedure is then used to assign the bit rate among subbands. To obtain high compression ratios, vector quantization is used. In this work, lattice vector quantization (LVQ) is chosen because of its superior performance over other types of vector quantizers. The structure of a lattice quantizer is determined by its parameters, known as the truncation level and the scaling factor. In the lattice-based compression algorithms reported in the literature, the lattice structure is commonly predetermined, leading to a non-optimized quantization approach. In this research, a new technique for determining the lattice parameters is proposed. In the lattice structure design, no assumption about the lattice parameters is made, and no training or multi-quantizing is required. The design is based on minimizing the quantization distortion by adapting to the statistical characteristics of the source in each subimage. Since LVQ is a multidimensional generalization of uniform quantizers, it produces minimum distortion for inputs with uniform distributions. In order to take advantage of the properties of LVQ and its fast implementation, while considering the i.i.d. nonuniform distribution of wavelet coefficients, the piecewise-uniform pyramid LVQ algorithm is proposed. The proposed algorithm quantizes almost all source vectors without the need to project them onto the lattice's outermost shell, while properly maintaining a small codebook size. It also resolves the wedge-region problem commonly encountered with sharply distributed random sources. These represent some of the drawbacks of the algorithm proposed by Barlaud [26]. The proposed algorithm handles all types of lattices, not only cubic lattices, as opposed to the algorithms developed by Fischer [29] and Jeong [42]. Furthermore, no training or multi-quantizing (to determine lattice parameters) is required, as opposed to Powell's algorithm [78]. For coefficients with high-frequency content, the positive-negative mean algorithm is proposed to improve the resolution of reconstructed images. For coefficients with low-frequency content, a lossless predictive compression scheme is used to preserve the quality of reconstructed images. A method to reduce the bit requirements of the necessary side information is also introduced. Lossless entropy coding techniques are subsequently used to remove coding redundancy. The algorithms result in high-quality reconstructed images with better compression ratios than other available algorithms. To evaluate the proposed algorithms, objective and subjective performance comparisons with other available techniques are presented.

The quality of the reconstructed images is important for reliable identification. Enhancement and feature extraction on the reconstructed images are also investigated in this research. A structure-based feature extraction algorithm is proposed in which the unique properties of fingerprint textures are used to enhance the images and improve the fidelity of their characteristic features. The ridges are extracted from enhanced grey-level foreground areas based on the local dominant ridge directions. The proposed ridge extraction algorithm properly preserves the natural shape of grey-level ridges as well as the precise locations of the features, as opposed to the ridge extraction algorithm in [81]. Furthermore, it is fast and operates only on foreground regions, as opposed to the adaptive floating average thresholding process in [68]. Spurious features are subsequently eliminated using the proposed post-processing scheme.
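A minimal sketch of the two statistical ingredients described above, using a standard dyadic wavelet decomposition and a maximum-likelihood generalized Gaussian fit as stand-ins for the thesis's fixed wavelet-packet structure and least-squares shape-parameter estimator; the image, wavelet, and level are hypothetical placeholders.

```python
# Hedged sketch, not the thesis algorithm: decompose an image with a plain dyadic
# 2-D wavelet transform and fit a generalized Gaussian model to one detail subband.
import numpy as np
import pywt
from scipy.stats import gennorm

# Placeholder "fingerprint": any 2-D grayscale array would do here.
rng = np.random.default_rng(0)
image = rng.normal(size=(256, 256))

# Three-level dyadic decomposition (the thesis uses a custom fixed wavelet-packet
# structure rather than this plain pyramid).
coeffs = pywt.wavedec2(image, wavelet="bior4.4", level=3)
cH1, cV1, cD1 = coeffs[-1]   # finest-level detail subbands

# Maximum-likelihood fit of a generalized Gaussian (shape, location, scale);
# the thesis instead estimates the shape parameter with a least-squares scheme.
beta, loc, scale = gennorm.fit(cH1.ravel())
print(f"estimated shape parameter: {beta:.3f}, scale: {scale:.3f}")
```

The fitted shape and scale parameters per subband are exactly the quantities a bit-allocation and quantizer-design stage would consume.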

Relevance: 10.00%

Abstract:

Business process modeling is widely regarded as one of the most popular forms of conceptual modeling. However, little is known about the capabilities and deficiencies of process modeling grammars and how existing deficiencies impact actual process modeling practice. This paper is a first contribution towards a theory-driven, exploratory empirical investigation of the ontological deficiencies of process modeling with the industry standard Business Process Modeling Notation (BPMN). We perform an analysis of BPMN using a theory of ontological expressiveness. Through a series of semi-structured interviews with BPMN adopters we explore empirically the actual use of this grammar. Nine ontological deficiencies related to the practice of modeling with BPMN are identified, for example, the capture of business rules and the specification of process decompositions. We also uncover five contextual factors that impact on the use of process modeling grammars, such as tool support and modeling conventions. We discuss implications for research and practice, highlighting the need for consideration of representational issues and contextual factors in decisions relating to BPMN adoption in organizations.

Relevance: 10.00%

Abstract:

With the growth of the Web, e-commerce activities have also become popular. Product recommendation is an effective way of marketing a product to potential customers. Based on a user's previous searches, most recommendation methods employ two-dimensional models to find relevant items, which are then recommended to the user. Furthermore, too many irrelevant recommendations worsen the information-overload problem for the user. This happens because such models based on vectors and matrices are unable to find the latent relationships that exist between users and searches. Identifying user behaviour is a complex process, and usually involves comparing the searches a user has made. In most cases, traditional vector- and matrix-based methods are used to find the prominent features in a user's searches. In this research we employ tensors to find the relevant features in users' searches; these features are then used to make recommendations. Evaluation on real datasets shows the effectiveness of such recommendations over vector- and matrix-based methods.
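A hedged sketch of the general idea (not the paper's method): a truncated higher-order SVD of a user x search-term x item tensor is one standard way to expose latent relationships that flat user-item matrices miss. The data, ranks, and scoring rule below are synthetic placeholders.

```python
# Hedged sketch: truncated HOSVD of a 3-way interaction tensor with NumPy only.
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding of a tensor into a matrix."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def hosvd(tensor, ranks):
    """Truncated HOSVD: per-mode factor matrices plus the core tensor."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):
        core = np.moveaxis(np.tensordot(u.T, core, axes=(1, mode)), 0, mode)
    return factors, core

# Synthetic interaction counts: 50 users x 30 search terms x 40 items.
rng = np.random.default_rng(1)
interactions = rng.poisson(0.3, size=(50, 30, 40)).astype(float)

(u_users, u_terms, u_items), core = hosvd(interactions, ranks=(5, 5, 5))

# Low-rank reconstruction smooths the raw counts; rank items for user 0 by
# aggregating the reconstructed scores over all of that user's search terms.
approx = np.einsum("abc,ia,jb,kc->ijk", core, u_users, u_terms, u_items)
item_scores = approx[0].sum(axis=0)
print("top item indices for user 0:", np.argsort(item_scores)[::-1][:5])
```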

Relevance: 10.00%

Abstract:

High-resolution thermogravimetric analysis (TGA) has attracted much attention in the synthesis of organoclays and their applications. In this study, organoclays were synthesised through ion exchange of a single cationic surfactant for sodium ions, and characterised by methods including X-ray diffraction (XRD) and thermogravimetric analysis (TGA). The changes in the surface properties of montmorillonite and of the surfactant-intercalated organoclays were determined using XRD through changes in the basal spacing. TGA was applied to obtain further information on the configuration and structural changes of the organoclays upon thermal decomposition. Four distinct decomposition steps appear in the differential thermogravimetric (DTG) curves. These TG steps relate to the arrangement of the surfactant molecules intercalated in the montmorillonite, and the thermal analysis indicates the thermal stability of the surfactant-modified clays. This investigation provides new insights into the properties of organoclays and is important for the synthesis and processing of organoclays for environmental applications.

Relevance: 10.00%

Abstract:

Fatty acids are long-chain carboxylic acids that readily produce [M − H]⁻ ions upon negative ion electrospray ionization (ESI), and cationic complexes with alkali, alkaline earth, and transition metals in positive ion ESI. In contrast, only one anionic monomeric fatty acid–metal ion complex has been reported in the literature, namely [M − 2H + Fe(II)Cl]⁻. In this manuscript, we present two methods to form anionic unsaturated fatty acid–sodium ion complexes (i.e., [M − 2H + Na]⁻). We find that these ions may be generated efficiently by two distinct methods: (1) negative ion ESI of a methanolic solution containing the fatty acid and sodium fluoride, forming an [M − H + NaF]⁻ ion; subsequent collision-induced dissociation (CID) results in the desired [M − 2H + Na]⁻ ion via the neutral loss of HF. (2) Direct formation of the [M − 2H + Na]⁻ ion by negative ion ESI of a methanolic solution containing the fatty acid and sodium hydroxide or bicarbonate. In addition to deprotonation of the carboxylic acid moiety, formation of [M − 2H + Na]⁻ ions requires the removal of a proton from the fatty acid acyl chain. We propose that this deprotonation occurs at the bis-allylic position(s) of polyunsaturated fatty acids, resulting in the formation of a resonance-stabilized carbanion. This proposal is supported by ab initio calculations, which reveal that removal of a proton from the bis-allylic position, followed by neutral loss of HX (where X = F or OH), is the lowest energy dissociation pathway.
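Schematically, the two routes described above can be summarised as follows (byproducts of the second, direct route are omitted):

\[
(1)\;\; \mathrm{M} + \mathrm{NaF} \;\xrightarrow{\;\mathrm{ESI}(-)\;}\; [\mathrm{M{-}H{+}NaF}]^{-} \;\xrightarrow{\;\mathrm{CID},\;-\mathrm{HF}\;}\; [\mathrm{M{-}2H{+}Na}]^{-}
\]
\[
(2)\;\; \mathrm{M} \;\xrightarrow{\;\mathrm{ESI}(-),\;\mathrm{NaOH\ or\ NaHCO_3}\;}\; [\mathrm{M{-}2H{+}Na}]^{-}
\]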

Relevance: 10.00%

Abstract:

Capture of an electron by tetracyanoethylene oxide can initiate a number of decomposition pathways. One of these decompositions yields [(NC)3C]− as the ionic product. Ab initio calculations (at the B3LYP/6-31+G* level of theory) indicate that the formation of [(NC)3C]− is initiated by capture of an electron into the LUMO of tetracyanoethylene oxide to yield the anion radical [(NC)2C–O–C(CN)2]−·, which undergoes internal nucleophilic substitution to form the intermediate [(NC)3C–OCCN]−·. This intermediate dissociates to form [(NC)3C]− (m/z 90) as the ionic product. The radical (NC)3C· has an electron affinity of 4.0 eV (385 kJ mol−1). Ab initio calculations show that [(NC)3C]− is trigonal planar with the negative charge mainly on the nitrogens. A pictorial representation of this structure is the resonance hybrid formed from three degenerate contributing structures (NC)2C=C=N−. The other product of the reaction is nominally (NCCO)·, but there is no definitive experimental evidence to indicate whether this radical survives intact or decomposes to NC· and CO. The overall process [(NC)2C–O–C(CN)2]−· → [(NC)3C]− + (NCCO)· is calculated to be endothermic by 21 kJ mol−1 with an overall barrier of 268 kJ mol−1.
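The pathway and computed energetics described above can be written compactly as (all values taken from the abstract):

\[
(\mathrm{NC})_2\mathrm{C{-}O{-}C(CN)}_2 \;\xrightarrow{\;+e^-\;}\; [(\mathrm{NC})_2\mathrm{C{-}O{-}C(CN)}_2]^{-\bullet} \;\xrightarrow{\;\text{internal } \mathrm{S_N}\;}\; [(\mathrm{NC})_3\mathrm{C{-}OCCN}]^{-\bullet} \;\longrightarrow\; [(\mathrm{NC})_3\mathrm{C}]^{-} + (\mathrm{NCCO})^{\bullet}
\]

with the overall step from the radical anion to the products endothermic by 21 kJ mol−1 over a barrier of 268 kJ mol−1 (B3LYP/6-31+G*).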

Relevance: 10.00%

Abstract:

The non-8-enoate anion undergoes losses of the elements of C3H6, C4H8 and C6H12 on collisional activation. The mechanisms of these processes have been elucidated by a combination of product ion and labelling (²H and ¹³C) studies, together with a neutralisation-reionisation mass spectrometric study. These studies allow the following conclusions to be made. (i) The loss of C3H6 involves cyclisation of the enolate anion of non-8-enoic acid to yield the cyclopentyl carboxylate anion and propene. (ii) The loss of 'C4H8' is a charge-remote process (one which proceeds remote from the charged centre) which yields the pent-4-enoate anion, butadiene and dihydrogen. This process co-occurs and competes with complex H scrambling. (iii) The major loss of 'C6H12' occurs primarily by a charge-remote process yielding the acrylate anion, hexa-1,5-diene and dihydrogen, but in this case no H scrambling accompanies the process. (iv) It is argued that the major reason why the two charge-remote processes occur in preference to anion-induced losses of but-1-ene and hex-1-ene from the respective 4- and 2-anions is that, although these anions are formed, they have alternative and lower energy fragmentation pathways than those involving the losses of but-1-ene and hex-1-ene; viz., the transient 4-anion undergoes facile proton transfer to yield a more stable anion, whereas the 2-(enolate) anion undergoes preferential cyclisation followed by elimination of propene [see (i) above].
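The three competing neutral losses and their assigned products, as itemised above, can be summarised as:

\[
\begin{aligned}
-\,\mathrm{C_3H_6}: &\quad \text{enolate cyclisation} \longrightarrow \text{cyclopentyl carboxylate anion} + \text{propene}\\
-\,\text{'}\mathrm{C_4H_8}\text{'}: &\quad \text{charge-remote} \longrightarrow \text{pent-4-enoate anion} + \text{butadiene} + \mathrm{H_2}\\
-\,\text{'}\mathrm{C_6H_{12}}\text{'}: &\quad \text{charge-remote} \longrightarrow \text{acrylate anion} + \text{hexa-1,5-diene} + \mathrm{H_2}
\end{aligned}
\]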

Relevance: 10.00%

Abstract:

Long-range cross-ring reactions occur when the (M − H)⁻ ions of methoxy- and ethoxy-C6H4-N⁻-COR (R = H, CH3, C6H5 and CH3O) are subjected to collisional activation. These reactions are generally minor processes; a particular example is the cross-ring elimination p-C2H5O-C6H4-N⁻-COCH3 → [CH3⁻ (p-C2H5O-C6H4-NCO)] → p-(O⁻)-C6H4-NCO + C2H4 + CH4. Major processes of these (M − H)⁻ ions involve (i) losses of radicals to form stabilised radical anions, e.g. (a) loss of a ring H· or (b) loss of CH3· (or C2H5·) from the alkoxy group, and (ii) proximity effects when the two substituents are ortho, e.g. loss of CH3OH from o-CH3O-C6H4-N⁻-CHO yields deprotonated benzoxazole. Another fragmentation of the aryl methoxy anions involves loss of CH2O. It is proposed that losses of CH2O are initiated by anionic centres, but the actual mechanisms in the cases studied depend upon the substitution pattern of the methoxyanilide: o- and p-methoxyanilides may undergo ipso proton transfer/elimination reactions, whereas the m-analogues undergo proton transfer reactions to yield an o-CH3O-substituted aryl carbanion, followed by proton transfer from CH3O to the carbanion site with concomitant loss of CH2O.