991 results for Hecke Algebra
Abstract:
This article describes the design and implementation of a computer-aided tool called Relational Algebra Translator (RAT), used in database courses for the teaching of relational algebra. A problem arose when introducing the relational algebra topic in the course EIF 211 Design and Implementation of Databases, part of the Engineering in Information Systems program of the National University of Costa Rica: students attending this course lacked deep mathematical knowledge, which led to a learning problem, this being an important subject for understanding what database searches and queries do. RAT was created to enhance the teaching-learning process. The article introduces the architectural and design principles required for its implementation, such as the language symbol table, the grammatical rules, and the basic algorithms that RAT uses to translate from relational algebra to SQL. The tool has been used for one term and has proven effective in the teaching-learning process. This urged the investigators to publish it on the website www.slinfo.una.ac.cr so that the tool can be used in other university courses.
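The abstract does not give RAT's actual translation rules, but the core idea, mapping relational algebra operators onto SQL clauses, can be sketched as follows. All function names and rules here are hypothetical simplifications, not RAT's implementation:

```python
# Minimal, hypothetical sketch of a relational-algebra-to-SQL translator.
# Supports only selection (sigma), projection (pi), and natural join.

def select(condition, relation):
    """sigma_condition(relation): becomes a WHERE clause over a subquery."""
    return f"SELECT * FROM ({relation}) t WHERE {condition}"

def project(columns, relation):
    """pi_columns(relation): becomes a column list over a subquery."""
    return f"SELECT {', '.join(columns)} FROM ({relation}) t"

def natural_join(left, right):
    """left |x| right: becomes SQL's NATURAL JOIN."""
    return f"SELECT * FROM ({left}) a NATURAL JOIN ({right}) b"

# Example: pi_{name}(sigma_{age > 21}(Students))
sql = project(["name"], select("age > 21", "SELECT * FROM Students"))
print(sql)
```

Nesting each operator as a subquery keeps the translation compositional: every relational algebra expression yields a relation, so every translated fragment must itself be a valid SELECT.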
Abstract:
There is a long history of debate around mathematics standards, reform efforts, and accountability. This research identified ways that national expectations and context drive local implementation of mathematics reform efforts and identified the external and internal factors that impact teachers’ acceptance or resistance to policy implementation at the local level. This research also adds to the body of knowledge about acceptance and resistance to policy implementation efforts. This case study involved the analysis of documents to provide a chronological perspective, assess the current state of the District’s mathematics reform, and determine the District’s readiness to implement the Common Core Curriculum. The school system in question has continued to struggle with meeting the needs of all students in Algebra 1. Therefore, the results of this case study will be useful to the District’s leaders as they include the compilation and analysis of a decade’s worth of data specific to Algebra 1.
Abstract:
Reasoning systems have reached a high degree of maturity in the last decade. However, even the most successful systems are usually not general-purpose problem solvers but are typically specialised in problems of a certain domain. The MathWeb Software Bus (MathWeb-SB) is a system for combining reasoning specialists via a common software bus. We describe the integration of the lambda-clam system, a reasoning specialist for proofs by induction, into the MathWeb-SB. Due to this integration, lambda-clam now offers its theorem-proving expertise to other systems in the MathWeb-SB. On the other hand, lambda-clam can use the services of any reasoning specialist already integrated. We focus on the latter and describe first experiments on proving theorems by induction using the computational power of the MAPLE system within lambda-clam.
Abstract:
During the past decade, there has been a dramatic increase by postsecondary institutions in providing academic programs and course offerings in a multitude of formats and venues (Biemiller, 2009; Kucsera & Zimmaro, 2010; Lang, 2009; Mangan, 2008). Strategies pertaining to reapportionment of course-delivery seat time have been a major facet of these institutional initiatives; most notably, within many open-door 2-year colleges. Often, these enrollment-management decisions are driven by the desire to increase market-share, optimize the usage of finite facility capacity, and contain costs, especially during these economically turbulent times. So, while enrollments have surged to the point where nearly one in three 18-to-24 year-old U.S. undergraduates are community college students (Pew Research Center, 2009), graduation rates, on average, still remain distressingly low (Complete College America, 2011). Among the learning-theory constructs related to seat-time reapportionment efforts is the cognitive phenomenon commonly referred to as the spacing effect, the degree to which learning is enhanced by a series of shorter, separated sessions as opposed to fewer, more massed episodes. This ex post facto study explored whether seat time in a postsecondary developmental-level algebra course is significantly related to: course success; course-enrollment persistence; and, longitudinally, the time to successfully complete a general-education-level mathematics course. Hierarchical logistic regression and discrete-time survival analysis were used to perform a multi-level, multivariable analysis of a student cohort (N = 3,284) enrolled at a large, multi-campus, urban community college. The subjects were retrospectively tracked over a 2-year longitudinal period. The study found that students in long seat-time classes tended to withdraw earlier and more often than did their peers in short seat-time classes (p < .05). 
Additionally, a model comprised of nine statistically significant covariates (all with p-values less than .01) was constructed. However, no longitudinal seat-time group differences were detected nor was there sufficient statistical evidence to conclude that seat time was predictive of developmental-level course success. A principal aim of this study was to demonstrate—to educational leaders, researchers, and institutional-research/business-intelligence professionals—the advantages and computational practicability of survival analysis, an underused but more powerful way to investigate changes in students over time.
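The discrete-time survival analysis named above rests on restructuring the cohort into person-period records and fitting a logistic model to them. A minimal sketch of that restructuring step, with illustrative variable names rather than the study's actual variables:

```python
# Hypothetical sketch: expand one record per student into person-period
# rows, the data layout used by discrete-time survival analysis. Each
# student contributes one row per term until withdrawal or censoring.

def person_period(students, max_terms):
    """students: list of (id, term_of_withdrawal or None if censored)."""
    rows = []
    for sid, withdrew_at in students:
        last = withdrew_at if withdrew_at is not None else max_terms
        for term in range(1, last + 1):
            event = 1 if withdrew_at == term else 0
            rows.append({"id": sid, "term": term, "withdrew": event})
    return rows

# s1 withdraws in term 2; s2 is observed for all 3 terms (censored).
rows = person_period([("s1", 2), ("s2", None)], max_terms=3)
```

A standard logistic regression fit to the `withdrew` indicator of these rows then estimates the per-term hazard of withdrawal, which is what makes time-to-event comparisons between seat-time groups possible.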
Abstract:
In this article, we describe a novel methodology to extract semantic characteristics from protein structures using linear algebra in order to compose structural signature vectors which may be used efficiently to compare and classify protein structures into fold families. These signatures are built from the pattern of hydrophobic intrachain interactions using Singular Value Decomposition (SVD) and Latent Semantic Indexing (LSI) techniques. Considering proteins as documents and contacts as terms, we have built a retrieval system which is able to find conserved contacts in samples of myoglobin fold family and to retrieve these proteins among proteins of varied folds with precision of up to 80%. The classifier is a web tool available at our laboratory website. Users can search for similar chains from a specific PDB, view and compare their contact maps and browse their structures using a JMol plug-in.
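Treating proteins as documents and contacts as terms is the standard latent semantic indexing setup. A minimal sketch with a toy contact-count matrix (the matrix, rank, and similarity threshold are illustrative only, not the authors' data):

```python
import numpy as np

# Hypothetical toy term-document matrix: rows = hydrophobic contacts
# (terms), columns = proteins (documents), entries = occurrence counts.
A = np.array([[2, 2, 0, 0],
              [1, 1, 0, 1],
              [0, 0, 3, 3],
              [0, 1, 2, 2]], dtype=float)

# Truncated SVD projects each protein into a rank-k latent space (LSI).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
docs = (np.diag(s[:k]) @ Vt[:k]).T   # one latent signature vector per protein

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Proteins 0 and 1 share most contacts, so their latent signatures
# should be closer than those of proteins 0 and 2.
sim01 = cosine(docs[0], docs[1])
sim02 = cosine(docs[0], docs[2])
```

Retrieval then amounts to ranking all proteins by cosine similarity to a query protein's latent vector, which is how conserved contact patterns within a fold family surface as high-similarity neighbours.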
Abstract:
The ΛCDM model is the simplest, yet so far most effective, cosmological model for describing the evolution of the universe. It is based on Einstein's theory of General Relativity and explains the accelerated expansion of the universe by introducing the cosmological constant Λ, which represents the contribution of so-called dark energy, an entity about which very little is known with certainty. Alternative theoretical models have nevertheless been proposed to describe the effects of this mysterious quantity, for example by introducing additional degrees of freedom, as in Horndeski theory. The main goal of this thesis is to study these models with the tensor computer algebra system xAct. In particular, our aim is to implement a universal procedure that allows the equations of motion and the time evolution of any generic model to be derived from its action.
Abstract:
We present measurements of J/ψ yields in d + Au collisions at √s_NN = 200 GeV recorded by the PHENIX experiment and compare them with yields in p + p collisions at the same energy per nucleon-nucleon collision. The measurements cover a large kinematic range in J/ψ rapidity (−2.2 < y < 2.4) with high statistical precision and are compared with two theoretical models: one with nuclear shadowing combined with final-state breakup and one with coherent gluon saturation effects. In order to remove model-dependent systematic uncertainties we also compare the data to a simple geometric model. The forward-rapidity data are inconsistent with nuclear modifications that are linear or exponential in the density-weighted longitudinal thickness, such as those from the final-state breakup of the bound state.
Abstract:
Measurements of electrons from the decay of open-heavy-flavor mesons have shown that the yields are suppressed in Au + Au collisions compared to expectations from binary-scaled p + p collisions. These measurements indicate that charm and bottom quarks interact with the hot dense matter produced in heavy-ion collisions much more than expected. Here we extend these studies to two-particle correlations where one particle is an electron from the decay of a heavy-flavor meson and the other is a charged hadron from either the decay of the heavy meson or from jet fragmentation. These measurements provide more detailed information about the interactions between heavy quarks and the matter, such as whether the modification of the away-side-jet shape seen in hadron-hadron correlations is present when the trigger particle is from heavy-meson decay and whether the overall level of away-side-jet suppression is consistent. We statistically subtract correlations of electrons arising from background sources from the inclusive electron-hadron correlations and obtain two-particle azimuthal correlations at √s_NN = 200 GeV between electrons from heavy-flavor decay and charged hadrons in p + p collisions, and also first results in Au + Au collisions. We find the away-side-jet shape and yield to be modified in Au + Au collisions compared to p + p collisions.
Abstract:
The PHENIX experiment at the Relativistic Heavy Ion Collider has measured the invariant differential cross section for production of K0_S, ω, η′, and φ mesons in p + p collisions at √s = 200 GeV. Measurements of ω and φ production in different decay channels give consistent results. New results for the ω are in agreement with previously published data and extend the measured p_T coverage. The spectral shapes of all hadron transverse-momentum distributions measured by PHENIX are well described by a Tsallis distribution functional form with only two parameters, n and T, determining the high-p_T and characterizing the low-p_T regions of the spectra, respectively. The values of these parameters are very similar for all analyzed meson spectra, but a lower parameter T is extracted for protons. The integrated invariant cross sections calculated from the fitted distributions are found to be consistent with existing measurements and with statistical model predictions.
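The two-parameter Tsallis form mentioned in the abstract can be written down directly. A sketch of the shape, with illustrative parameter values rather than PHENIX's fitted ones:

```python
# Hypothetical sketch of the two-parameter Tsallis spectral form,
# dN/dp_T ~ (1 + p_T / (n*T))**(-n): exponential-like with slope T at
# low p_T, power-law with index n at high p_T. The values n=10 and
# T=0.1 GeV are illustrative only, not PHENIX fit results.

def tsallis(pt, n, T):
    return (1.0 + pt / (n * T)) ** (-n)

# At low p_T the form stays close to exp(-p_T / T):
low = tsallis(0.1, n=10.0, T=0.1)      # compare exp(-1) ~ 0.368

# At high p_T it falls steeply, roughly like a power law in p_T:
ratio = tsallis(8.0, n=10.0, T=0.1) / tsallis(4.0, n=10.0, T=0.1)
```

The appeal of the form is exactly this interpolation: one function covers both the thermal-looking low-p_T region and the hard-scattering power-law tail with only n and T as free parameters.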
Abstract:
The PHENIX experiment at the Relativistic Heavy Ion Collider has performed systematic measurements of φ meson production in the K+K− decay channel at midrapidity in p + p, d + Au, Cu + Cu, and Au + Au collisions at √s_NN = 200 GeV. Results are presented on the φ invariant yield and the nuclear modification factor R_AA for Au + Au and Cu + Cu, and R_dA for d + Au collisions, studied as a function of transverse momentum (1 < p_T < 7 GeV/c) and centrality. In central and midcentral Au + Au collisions, the R_AA of the φ exhibits a suppression relative to expectations from binary-scaled p + p results. The amount of suppression is smaller than that of the π0 and the η in the intermediate p_T range (2-5 GeV/c), whereas at higher p_T the φ, π0, and η show similar suppression. The baryon (proton and antiproton) excess observed in central Au + Au collisions at intermediate p_T is not observed for the φ meson despite the similar masses of the proton and the φ. This suggests that the excess is linked to the number of valence quarks in the hadron rather than its mass. The difference gradually disappears with decreasing centrality, and, for peripheral collisions, the R_AA values for both particle species are consistent with binary scaling. Cu + Cu collisions show the same yield and suppression as Au + Au collisions for the same number of participants, N_part. The R_dA of the φ shows no evidence for cold-nuclear-matter effects within uncertainties.
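The nuclear modification factor used above is, by definition, the binary-collision-scaled ratio of yields. A minimal numeric sketch, with made-up yield values purely for illustration:

```python
# Hypothetical illustration of the nuclear modification factor:
#     R_AA = (yield in A+A) / (<N_coll> * yield in p+p)
# R_AA = 1 means production scales with the number of binary
# nucleon-nucleon collisions; R_AA < 1 means suppression.

def r_aa(yield_aa, yield_pp, n_coll):
    return yield_aa / (n_coll * yield_pp)

# Made-up per-event yields at some fixed p_T, with <N_coll> = 300:
suppressed = r_aa(yield_aa=0.12, yield_pp=0.001, n_coll=300)
binary_scaled = r_aa(yield_aa=0.30, yield_pp=0.001, n_coll=300)
```

Comparing R_AA for different species at the same p_T, as the abstract does for the φ, π0, and protons, is what isolates particle-type effects from the overall suppression.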
Abstract:
Large parity-violating longitudinal single-spin asymmetries A_L(e+) = -0.86 +0.30/-0.14 and A_L(e-) = 0.88 +0.12/-0.71 are observed for inclusive high-transverse-momentum electrons and positrons in polarized p + p collisions at a center-of-mass energy of √s = 500 GeV with the PHENIX detector at RHIC. These e± come mainly from the decay of W± and Z0 bosons, and their asymmetries directly demonstrate parity violation in the couplings of the W± to the light quarks. The observed electron and positron yields were used to estimate W± boson production cross sections for the e± channels of σ(pp → W+X) × BR(W+ → e+ ν_e) = 144.1 ± 21.2 (stat) +3.4/-10.3 (syst) ± 21.6 (norm) pb, and σ(pp → W−X) × BR(W− → e− ν̄_e) = 31.7 ± 12.1 (stat) +10.1/-8.2 (syst) ± 4.8 (norm) pb.
Abstract:
Measurements of double-helicity asymmetries in inclusive hadron production in polarized p + p collisions are sensitive to helicity-dependent parton distribution functions, in particular to the gluon helicity distribution, Δg. This study focuses on the extraction of the double-helicity asymmetry in η production (p + p → η + X), the η cross section, and the η/π0 cross-section ratio. The cross-section and ratio measurements provide essential input for the extraction of fragmentation functions that are needed to access the helicity-dependent parton distribution functions.
Abstract:
We report the first measurement of transverse single-spin asymmetries in J/ψ production from transversely polarized p + p collisions at √s = 200 GeV, with data taken by the PHENIX experiment in 2006 and 2008. The measurement was performed over the rapidity ranges 1.2 < |y| < 2.2 and |y| < 0.35 for transverse momenta up to 6 GeV/c. J/ψ production at the Relativistic Heavy Ion Collider is dominated by processes involving initial-state gluons, and transverse single-spin asymmetries of the J/ψ can provide access to gluon dynamics within the nucleon. Such asymmetries may also shed light on the long-standing question in QCD of the J/ψ production mechanism. Asymmetries were obtained as a function of J/ψ transverse momentum and Feynman-x, with a value of −0.086 ± 0.026 (stat) ± 0.003 (syst) in the forward region. This result suggests possible nonzero trigluon correlation functions in transversely polarized protons and, if well defined in this reaction, a nonzero gluon Sivers distribution function.
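A transverse single-spin asymmetry of the kind quoted above is formed from yields for the two transverse spin orientations, corrected for the beam polarization. A minimal sketch with made-up counts (real analyses also correct for relative luminosity and detector acceptance, which this omits):

```python
# Hypothetical sketch of a transverse single-spin asymmetry:
#     A_N = (1 / P) * (N_up - N_down) / (N_up + N_down)
# where P is the beam polarization and N_up / N_down are the particle
# yields for the two transverse spin states. All numbers are made up.

def a_n(n_up, n_down, polarization):
    raw = (n_up - n_down) / (n_up + n_down)
    return raw / polarization

asym = a_n(n_up=4700, n_down=5300, polarization=0.6)
```

Dividing by the polarization P is what converts the diluted raw asymmetry into the physics asymmetry; with P ~ 0.6, a raw asymmetry of a few percent corresponds to an A_N of roughly 0.1 in magnitude.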
Abstract:
Correlations of charged hadrons of 1 < p_T < 10 GeV/c with high-p_T direct photons and π0 mesons in the range 5 < p_T < 15 GeV/c are used to study jet fragmentation in the γ + jet and dijet channels, respectively. The magnitude of the partonic transverse momentum, k_T, is obtained by comparing to a model incorporating a Gaussian k_T smearing. The sensitivity of the associated charged-hadron spectra to the underlying fragmentation function is tested, and the data are compared to calculations using recent global fit results. The shape of the direct-photon-associated hadron spectrum, as well as its charge asymmetry, is found to be consistent with a sample dominated by quark-gluon Compton scattering. No significant evidence of fragmentation-photon correlated production is observed within experimental uncertainties.
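The Gaussian k_T smearing invoked above can be illustrated with a toy Monte Carlo: an exactly back-to-back photon-jet pair acquires pair transverse momentum from a two-dimensional Gaussian kick. The width, seed, and event count below are illustrative only, not the model actually fit to the data:

```python
import math
import random

# Toy Monte Carlo of Gaussian k_T smearing: each event draws a 2-D
# Gaussian transverse kick (kx, ky) and we accumulate the resulting
# pair transverse momentum |k_T|. All parameters are made up.

random.seed(0)

def mean_pair_kt(n_events, sigma_kt):
    """Mean |k_T| of the pair after Gaussian smearing in x and y."""
    total = 0.0
    for _ in range(n_events):
        kx = random.gauss(0.0, sigma_kt)
        ky = random.gauss(0.0, sigma_kt)
        total += math.hypot(kx, ky)
    return total / n_events

# For a 2-D Gaussian, <|k_T|> = sigma * sqrt(pi/2) ~ 1.25 * sigma.
mean_kt = mean_pair_kt(n_events=20000, sigma_kt=1.0)
```

In the measurement the logic runs in reverse: the observed azimuthal decorrelation of the pair is compared to such smeared templates, and the Gaussian width that best reproduces it is reported as the magnitude of the partonic k_T.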