922 results for Computer arithmetic and logic units.
Abstract:
As one of the newest art forms available to young people, gaming has become an increasing influence on young people’s education, even when not used in a classroom environment. This talk aims to explore examples of how video games have changed how young people understand and learn about certain subjects, with particular focus on how the indie title Minecraft allows them to learn about the world of Computer Science, and how groups are looking to forward the cause of education through games.
Abstract:
As ubiquitous systems have moved out of the lab and into the world, the need to think more systematically about how they are realised has grown. This talk will present intradisciplinary work I have been engaged in with other computing colleagues on how we might develop more formal models and understanding of ubiquitous computing systems. The formal modelling of computing systems has proved valuable in areas as diverse as reliability, security and robustness. However, the emergence of ubiquitous computing raises new challenges for formal modelling due to the contextual nature of such systems and their dependence on unreliable sensing. In this work we undertook an exploration of modelling an example ubiquitous system, the Savannah game, using the approach of bigraphical rewriting systems. This required an unusual intradisciplinary dialogue between formal computing and human-computer interaction researchers to systematically model four perspectives on Savannah: computational, physical, human and technical. Each perspective in turn drew upon a range of different modelling traditions. For example, the human perspective built upon previous work on proxemics, which uses physical distance as a means to understand interaction. In this talk I hope to show how our model explains observed inconsistencies in Savannah and how it can be extended to resolve these. I will then reflect on the need for intradisciplinary work of this form and the importance of the bigraph diagrammatic form in supporting this kind of engagement. Speaker biography: Tom Rodden (rodden.info) is a Professor of Interactive Computing at the University of Nottingham. His research brings together a range of human and technical disciplines, technologies and techniques to tackle the human, social, ethical and technical challenges involved in ubiquitous computing and the increasing use of personal data. He leads the Mixed Reality Laboratory (www.mrl.nott.ac.uk), an interdisciplinary research facility that is home to a team of over 40 researchers. He founded and currently co-directs the Horizon Digital Economy Research Institute (www.horizon.ac.uk), a university-wide interdisciplinary research centre focusing on the ethical use of our growing digital footprint. He previously directed the EPSRC Equator IRC (www.equator.ac.uk), a national interdisciplinary research collaboration exploring the place of digital interaction in our everyday world. He is a Fellow of the British Computer Society and the ACM and was elected to the ACM SIGCHI Academy in 2009 (http://www.sigchi.org/about/awards/).
Abstract:
How do resource booms affect human capital accumulation? We exploit time and spatial variation generated by the commodity boom across local governments in Peru to measure the effect of natural resources on human capital formation. We explore the effect of both mining production and tax revenues on test scores, finding a substantial and statistically significant effect for the latter. Transfers to local governments from mining tax revenues are linked to an increase in math test scores of around 0.23 standard deviations. We find that the hiring of permanent teachers, as well as increases in parental employment and improvements in the health outcomes of adults and children, are plausible mechanisms for such a large effect on learning. These findings suggest that redistributive policies could facilitate the accumulation of human capital in resource-abundant developing countries as a way to avoid the natural resource curse.
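The identification strategy described above, comparing districts over time as mining transfers arrive, corresponds to a standard two-way fixed-effects panel design. The following is a minimal sketch of that general approach on synthetic data; every variable name and number (district, year, transfers, score, the coefficient 0.2) is invented for illustration and is not from the study:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic panel: districts x years. 'transfers' mimics mining-revenue
# transfers switching on mid-sample; 'score' a standardized test score.
rng = np.random.default_rng(0)
n_d, n_t = 50, 6
df = pd.DataFrame({
    "district": np.repeat(np.arange(n_d), n_t),
    "year": np.tile(np.arange(2004, 2004 + n_t), n_d),
})
df["transfers"] = rng.gamma(2.0, 1.0, len(df)) * (df["year"] >= 2007)
df["score"] = 0.2 * df["transfers"] + rng.normal(0, 1, len(df))

# Two-way fixed effects: district and year dummies absorb time-invariant
# district traits and common shocks; the coefficient on 'transfers' is
# the effect of interest. Standard errors clustered by district.
fit = smf.ols("score ~ transfers + C(district) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["district"]})
print(fit.params["transfers"], fit.bse["transfers"])
```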
Abstract:
In this PhD thesis we studied the solid-phase peptide synthesis of antimicrobial peptides derived from the lead peptides BP100 and BPC194. First, peptides derived from BP100 containing D-amino acids at different positions of the sequence were prepared. Moreover, peptidotriazoles derived from BP100 were also synthesized, containing the triazole ring at the side chain of different amino acids. We then proceeded to study the synthesis of multivalent peptides derived from BPC194. To achieve this objective, the synthesis of cyclic peptides containing a triazole ring at the amino acid side chain, with different elongations, was carried out. Finally, we prepared various carbopeptides containing 2 and 4 units of BP100 and/or its derivatives. The evaluation of their biological activity allowed the identification of sequences active against economically important phytopathogenic bacteria and fungi and not toxic to eukaryotic cells.
Abstract:
The thesis focuses on Computer Vision and, more specifically, on image segmentation, one of the basic stages of image analysis, which consists of dividing the image into a set of visually distinct and uniform regions according to their intensity, colour or texture. A strategy is proposed based on the complementary use of region and boundary information during the segmentation process, an integration that alleviates some of the basic problems of traditional segmentation. The boundary information first makes it possible to identify the number of regions present in the image and to place a seed inside each one, in order to statistically model the characteristics of the regions and thereby define the region information. This information, together with the boundary information, is used to define an energy function that expresses the properties required of the desired segmentation: uniformity inside the regions and contrast with neighbouring regions at the boundaries. A set of active regions then begins to grow, competing for the pixels of the image, in order to optimise the energy function or, in other words, to find the segmentation that best fits the requirements expressed in that function. Finally, this whole process has been embedded in a pyramidal structure, which allows the segmentation result to be refined progressively and its computational cost reduced. The strategy has been extended to the problem of texture segmentation, which entails some basic considerations such as modelling the regions from a set of texture features and extracting the boundary information when texture is present in the image. Finally, the extension to image segmentation taking both colour and texture properties into account has been carried out. In this respect, the joint use of non-parametric density-estimation techniques to describe colour, and of texture features based on the co-occurrence matrix, has been proposed to model the image regions adequately and completely. The proposal has been objectively evaluated and compared with different integration techniques using synthetic images. In addition, experiments with real images have been included, with very positive results.
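The region competition described above can be illustrated with a minimal sketch. The Python code below implements only the uniformity half of such an energy criterion: seeded regions claim unassigned pixels best-first by closeness to each region's running mean intensity. It is an illustrative reduction, not the thesis implementation, and omits the boundary-contrast term, texture features and the pyramidal refinement:

```python
import heapq
import numpy as np

def compete_regions(image, seeds):
    """Seeded region competition (minimal sketch): unassigned pixels are
    claimed best-first by the adjacent region whose running mean intensity
    they match most closely. seeds: {label: (row, col)}, labels nonzero."""
    labels = np.zeros(image.shape, dtype=int)   # 0 = unassigned
    stats = {}                                  # label -> (intensity sum, count)
    heap = []                                   # (cost, row, col, label)

    def push_neighbours(r, c, lab):
        mean = stats[lab][0] / stats[lab][1]
        for rr, cc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
                    and labels[rr, cc] == 0):
                heapq.heappush(heap, (abs(image[rr, cc] - mean), rr, cc, lab))

    for lab, (r, c) in seeds.items():
        labels[r, c] = lab
        stats[lab] = (float(image[r, c]), 1)
        push_neighbours(r, c, lab)

    while heap:
        _, r, c, lab = heapq.heappop(heap)
        if labels[r, c]:
            continue                            # pixel already won elsewhere
        labels[r, c] = lab                      # region `lab` wins this pixel
        s, n = stats[lab]
        stats[lab] = (s + float(image[r, c]), n + 1)
        push_neighbours(r, c, lab)
    return labels

# Toy usage: two flat patches, one seed in each.
img = np.zeros((6, 8)); img[:, 4:] = 1.0
print(compete_regions(img, {1: (3, 1), 2: (3, 6)}))
```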
Abstract:
Consider the statement "this project should cost X and has risk of Y". Such statements are used daily in industry as the basis for making decisions. The work reported here is part of a study aimed at providing a rational and pragmatic basis for such statements. Of particular interest are predictions made in the requirements and early phases of projects. A preliminary model has been constructed using Bayesian Belief Networks and, in support of this, a programme to collect and study data during the execution of various software development projects commenced in May 2002. The data collection programme is undertaken under the constraints of a commercial industrial regime of multiple concurrent small to medium scale software development projects. Guided by pragmatism, the work is predicated on the use of data that can be collected readily by project managers, including expert judgements, effort, elapsed times and metrics collected within each project.
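As a minimal sketch of the kind of model described, the following builds a toy Bayesian Belief Network with the pgmpy library; the node names, structure and probabilities are invented for illustration and do not reproduce the paper's model:

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Hypothetical network: requirements volatility and team experience drive
# the probability of an effort overrun. All names and numbers invented.
model = BayesianNetwork([("Volatility", "Overrun"), ("Experience", "Overrun")])

cpd_vol = TabularCPD("Volatility", 2, [[0.7], [0.3]])   # P(low), P(high)
cpd_exp = TabularCPD("Experience", 2, [[0.4], [0.6]])   # P(low), P(high)
cpd_over = TabularCPD(
    "Overrun", 2,
    # Columns: (Vol, Exp) = (low,low), (low,high), (high,low), (high,high)
    [[0.7, 0.9, 0.2, 0.5],    # P(no overrun | parents)
     [0.3, 0.1, 0.8, 0.5]],   # P(overrun | parents)
    evidence=["Volatility", "Experience"], evidence_card=[2, 2],
)

model.add_cpds(cpd_vol, cpd_exp, cpd_over)
assert model.check_model()

# Early-phase query: high volatility reported, inexperienced team.
infer = VariableElimination(model)
print(infer.query(["Overrun"], evidence={"Volatility": 1, "Experience": 0}))
```

In practice the expert judgements and project metrics mentioned above would supply the conditional probability tables that are hard-coded here.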
Abstract:
The International System of Units (SI) is founded on seven base units, the metre, kilogram, second, ampere, kelvin, mole and candela, corresponding to the seven base quantities of length, mass, time, electric current, thermodynamic temperature, amount of substance and luminous intensity. At its 94th meeting in October 2005, the International Committee for Weights and Measures (CIPM) adopted a recommendation on preparative steps towards redefining the kilogram, ampere, kelvin and mole so that these units are linked to exactly known values of fundamental constants. We propose here that these four base units should be given new definitions linking them to exactly defined values of the Planck constant h, elementary charge e, Boltzmann constant k and Avogadro constant N_A, respectively. This would mean that six of the seven base units of the SI would be defined in terms of true invariants of nature. In addition, not only would these four fundamental constants have exactly defined values but also the uncertainties of many of the other fundamental constants of physics would be either eliminated or appreciably reduced. In this paper we present the background and discuss the merits of these proposed changes, and we also present possible wordings for the four new definitions. We also suggest a novel way to define the entire SI explicitly using such definitions without making any distinction between base units and derived units. We list a number of key points that should be addressed when the new definitions are adopted by the General Conference on Weights and Measures (CGPM), possibly by the 24th CGPM in 2011, and we discuss the implications of these changes for other aspects of metrology.
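For context, a redefinition along these lines was eventually adopted by the 26th CGPM in 2018 (in force since 2019) rather than the 24th in 2011. The snippet below lists the exact values then fixed, which are not necessarily the numerical values under discussion in this paper, and shows how a fixed k and N_A make temperature-to-energy and mole-to-entity conversions exact:

```python
# Exact defining constants adopted by the 26th CGPM (2018), in force
# since 2019 -- the later realization of the proposal in this abstract.
h   = 6.62607015e-34    # Planck constant, J s      -> defines the kilogram
e   = 1.602176634e-19   # elementary charge, C      -> defines the ampere
k   = 1.380649e-23      # Boltzmann constant, J/K   -> defines the kelvin
N_A = 6.02214076e23     # Avogadro constant, 1/mol  -> defines the mole

# With k fixed, a temperature corresponds exactly to a thermal energy kT:
T = 293.15                                    # 20 degrees Celsius, in kelvin
print(f"kT at {T} K = {k * T:.6e} J")

# With N_A fixed, one mole is exactly N_A elementary entities:
print(f"Entities in 0.5 mol = {0.5 * N_A:.6e}")
```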
Abstract:
Gallaborane (GaBH6, 1), synthesized by the metathesis of LiBH4 with [H2GaCl]n at ca. 250 K, has been characterized by chemical analysis and by its IR and 1H and 11B NMR spectra. The IR spectrum of the vapor at low pressure implies the presence of only one species, viz. H2Ga(μ-H)2BH2, with a diborane-like structure conforming to C2v symmetry. The structure of this molecule has been determined by gas-phase electron diffraction (GED) measurements, afforced by the results of ab initio molecular orbital calculations. The principal distances (rα in Å) and angles (∠α in deg) are as follows: r(Ga···B), 2.197(3); r(Ga−Ht), 1.555(6); r(Ga−Hb), 1.800(6); r(B−Ht), 1.189(7); r(B−Hb), 1.286(7); ∠Hb−Ga−Hb, 71.6(4); and ∠Hb−B−Hb, 110.0(5) (t = terminal, b = bridging). Aggregation of the molecules occurs in the condensed phases. X-ray crystallographic studies of a single crystal at 110 K reveal a polymeric network with helical chains made up of alternating pseudotetrahedral GaH4 and BH4 units linked through single hydrogen bridges; the average Ga···B distance is now 2.473(7) Å. The compound decomposes in the condensed phases at temperatures exceeding ca. 240 K with the formation of elemental Ga, H2 and B2H6. The reactions with NH3, Me3N, and Me3P are also described.
Abstract:
Two commercial enzyme products, Depol 40 (D) and Liquicell 2500 (L), were characterised from a biochemical standpoint and their potential to improve rumen degradation of forages was evaluated in vitro. Enzyme activities were determined at pH 5.5 and 39 °C. Analysis of the enzyme activities indicated that L contained higher xylanase and endoglucanase, but lower exoglucanase, pectinase and alpha-amylase activities than D. The Reading Pressure Technique (RPT) was used to investigate the effect of enzyme addition on the in vitro gas production (GP) and organic matter degradation (OMD) of alfalfa (Medicago sativa L.) stems and leaves. A completely randomised design with a factorial arrangement of treatments was used. Both alfalfa fractions were either untreated or treated with each enzyme at four levels, 20 h before incubation with rumen fluid. Each level of enzyme provided similar amounts of filter paper (D1, L1), endoglucanase (D2, L2), alpha-L-arabinofuranosidase (D3, L3) and xylanase units (D4, L4) per gram of forage DM. Enzymes increased the initial OMD in both fractions, with improvements of up to 15% in leaves (D4) and 8% in stems (L2) after 12 h of incubation. All enzyme treatments increased the extent of degradation (96 h incubation) in the leaf fraction, but only L2 increased final OMD in the stems. Direct hydrolysis of the forage fractions during the pre-treatment period did not fully account for the magnitude of the increases in OMD, suggesting that the increase in the rate of degradation was achieved through a combined effect of direct enzyme hydrolysis and synergistic action between the exogenous (applied) and endogenous (rumen) enzymes.
Abstract:
Bis(o-hydroxyacetophenone)nickel(II) dihydrate, on reaction with 1,3-pentanediamine, yields a bis-chelate complex [NiL2]·2H2O (1) of the mono-condensed tridentate Schiff base ligand HL {2-[1-(3-aminopentylimino)ethyl]phenol}. The Schiff base has been freed from the complex by precipitating the Ni(II) as a dimethylglyoximato complex. HL reacts smoothly with Ni(SCN)2·4H2O, furnishing the complex [NiL(NCS)] (2), and with CuCl2·2H2O in the presence of NaN3 or NH4SCN, producing [CuL(N3)]2 (3) or [CuL(NCS)] (4). On the other hand, upon reaction with Cu(ClO4)2·6H2O and Cu(NO3)2·3H2O, the Schiff base undergoes hydrolysis to yield the ternary complexes [Cu(hap)(pn)(H2O)]ClO4 (5) and [Cu(hap)(pn)(H2O)]NO3 (6), respectively (Hhap = o-hydroxyacetophenone and pn = 1,3-pentanediamine). The ligand HL also undergoes hydrolysis on reaction with Ni(ClO4)2·6H2O or Ni(NO3)2·6H2O to yield [Ni(hap)2] (7). The structures of complexes 2, 3, 5, 6, and 7 have been confirmed by single-crystal X-ray analysis. In complex 2, Ni(II) possesses square-planar geometry, being coordinated by the tridentate mono-negative Schiff base L and the isothiocyanate group. The coordination environment around Cu(II) in complex 3 is very similar to that in complex 2, but here two units are joined together by end-on, axial-equatorial azide bridges to result in a dimer in which the geometry around Cu(II) is square pyramidal. In both 5 and 6, the Cu(II) atoms display a square-pyramidal environment, the equatorial sites being coordinated by the two amine groups of 1,3-pentanediamine and two oxygen atoms of o-hydroxyacetophenone; the axial site is coordinated by a water molecule. Complex 7 is a square-planar complex with the Ni atom bonded to four oxygen atoms from two hap moieties. The mononuclear units of 2 and the dinuclear units of 3 are linked by strong hydrogen bonds to form a one-dimensional network. The mononuclear units of 5 and 6 are joined together to form a dimer by very strong hydrogen bonds through the coordinated water molecule. These dimers are further involved in hydrogen bonding with the respective counteranions to form 2-D net-like open frameworks.
Abstract:
Background: Shifting gaze and attention ahead of the hand is a natural component in the performance of skilled manual actions. Very few studies have examined the precise co-ordination between the eye and hand in children with Developmental Coordination Disorder (DCD). Methods: This study directly assessed the maturity of eye-hand co-ordination in children with DCD. A double-step pointing task was used to investigate the coupling of the eye and hand in 7-year-old children with and without DCD. Sequential targets were presented on a computer screen, and eye and hand movements were recorded simultaneously. Results: There were no differences between typically developing (TD) and DCD groups when completing fast single-target tasks. There were very few differences in the completion of the first movement in the double-step tasks, but differences did occur during the second sequential movement. One factor appeared to be the propensity for the DCD children to delay their hand movement until some period after the eye had landed on the target. This resulted in a marked increase in eye-hand lead during the second movement, disrupting the close coupling and leading to a slower and less accurate hand movement among children with DCD. Conclusions: In contrast to skilled adults, both groups of children preferred to foveate the target prior to initiating a hand movement if time allowed. The TD children, however, were more able to reduce this foveation period and shift towards a feedforward mode of control for hand movements. The children with DCD persevered with a look-then-move strategy, which led to an increase in error. For the group of DCD children in this study, there was no evidence of a problem in the speed or accuracy of simple movements, but there was a difficulty in concatenating the sequential shifts of gaze and hand required for the completion of everyday tasks or typical assessment items.
Abstract:
This paper formally derives a blocked version, with blocks of size two, of a new path-based neural branch prediction algorithm (FPP), giving a lower-cost hardware solution while maintaining an input-output characteristic similar to that of the original algorithm. The blocked solution, here referred to as the B2P algorithm, is obtained using graph theory and retiming methods. Verification approaches were exercised to show that the prediction performances obtained from the FPP and B2P algorithms differ by no more than one misprediction per thousand instructions using a known framework for branch prediction evaluation. For a chosen FPGA device, circuits generated from the B2P algorithm showed average area savings of over 25% against circuits for the FPP algorithm, with similar timing performance, thus making the proposed blocked predictor superior from a practical viewpoint.
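As background for readers unfamiliar with path-based neural prediction, the sketch below implements a generic path-based perceptron predictor of the kind such algorithms build on. It is not the paper's FPP or B2P design, and the table size, path length and threshold are invented for illustration:

```python
# Minimal path-based perceptron branch predictor (illustrative sketch).
H = 8              # path/history length
TABLE = 256        # rows in the weight table (indexed by branch address)
THETA = 16         # perceptron training threshold

weights = [[0] * (H + 1) for _ in range(TABLE)]   # weights[row][position]
path = [0] * H                                    # addresses of last H branches
history = [0] * H                                 # their outcomes as +1/-1

def predict(pc):
    # Weight i is selected by the address of the i-th branch on the path,
    # which is what distinguishes path-based from purely history-based schemes.
    y = weights[pc % TABLE][0]                    # bias weight
    for i in range(H):
        y += weights[path[i] % TABLE][i + 1] * history[i]
    return y                                      # predict taken iff y >= 0

def update(pc, y, taken):
    # Standard perceptron rule: train on a mispredict or a weak output.
    t = 1 if taken else -1
    if (y >= 0) != taken or abs(y) <= THETA:
        weights[pc % TABLE][0] += t
        for i in range(H):
            weights[path[i] % TABLE][i + 1] += t * history[i]
    path.insert(0, pc); path.pop()                # shift path and outcome
    history.insert(0, t); history.pop()           # registers by one branch

# Usage: y = predict(pc); predicted_taken = y >= 0; update(pc, y, actual_taken).
```

The paper's retiming-based transformation would restructure the per-position accumulation above into blocks of two to shorten the hardware critical path; that transformation itself is not shown here.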