873 results for IBM 1130 (Computer) - Programming
Abstract:
This paper uses dynamic impulse response analysis to investigate the interrelationships among stock price volatility, trading volume, and the leverage effect. Dynamic impulse response analysis is a technique for analyzing the multi-step-ahead characteristics of a nonparametric estimate of the one-step conditional density of a strictly stationary process. The technique is the generalization to a nonlinear process of Sims-style impulse response analysis for linear models. In this paper, we refine the technique and apply it to a long panel of daily observations on the price and trading volume of four stocks actively traded on the NYSE: Boeing, Coca-Cola, IBM, and MMM.
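To make the technique concrete, here is a minimal sketch (in Python, with all names and parameters ours) of the underlying idea: estimate a one-step conditional relation nonparametrically from a simulated nonlinear series, then trace the multi-step effect of a perturbation as the difference between two iterated forecast profiles. The paper propagates the full one-step conditional density; iterating a Nadaraya-Watson conditional mean, as below, is a deliberate simplification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic strictly stationary nonlinear AR(1): y_t = 0.9*tanh(y_{t-1}) + e_t
n = 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.9 * np.tanh(y[t - 1]) + 0.3 * rng.standard_normal()

x, z = y[:-1], y[1:]  # lagged pairs defining the one-step relation

def cond_mean(x0, h=0.2):
    """Nadaraya-Watson estimate of E[y_t | y_{t-1} = x0] (bandwidth h assumed)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * z) / np.sum(w)

def profile(y0, steps=10):
    """Multi-step-ahead forecast profile obtained by iterating the one-step map."""
    out = [y0]
    for _ in range(steps):
        out.append(cond_mean(out[-1]))
    return np.array(out)

# Sims-style impulse response, generalized to the nonlinear setting:
# the difference between the perturbed and baseline profiles.
delta = 1.0
irf = profile(0.0 + delta) - profile(0.0)
print(np.round(irf, 3))
```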
Abstract:
Transcranial magnetic stimulation (TMS) is a widely used, noninvasive method for stimulating nervous tissue, yet its mechanisms of effect are poorly understood. Here we report new methods for studying the influence of TMS on single neurons in the brain of alert non-human primates. We designed a TMS coil that focuses its effect near the tip of a recording electrode and recording electronics that enable direct acquisition of neuronal signals at the site of peak stimulus strength minimally perturbed by stimulation artifact in awake monkeys (Macaca mulatta). We recorded action potentials within ∼1 ms after 0.4-ms TMS pulses and observed changes in activity that differed significantly for active stimulation as compared with sham stimulation. This methodology is compatible with standard equipment in primate laboratories, allowing easy implementation. Application of these tools will facilitate the refinement of next generation TMS devices, experiments and treatment protocols.
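As a rough illustration of the kind of analysis such recordings support, the sketch below builds peri-stimulus spike histograms for active versus sham pulses from synthetic spike times. All numbers and names are illustrative; the paper's actual acquisition and artifact-suppression hardware is not modelled here.

```python
import numpy as np

rng = np.random.default_rng(1)

def psth(spike_times, pulse_times, window=(0.0, 0.01), bins=20):
    """Peri-stimulus time histogram: mean spike counts aligned to each pulse."""
    edges = np.linspace(window[0], window[1], bins + 1)
    counts = np.zeros(bins)
    for p in pulse_times:
        rel = spike_times - p
        sel = rel[(rel >= window[0]) & (rel < window[1])]
        counts += np.histogram(sel, bins=edges)[0]
    return counts / len(pulse_times)   # spikes per bin per pulse

# Synthetic data: background spiking plus an excess of spikes ~1 ms after
# each "active" pulse; "sham" pulses see background activity only.
pulses = np.arange(1.0, 61.0, 1.0)                 # one pulse per second
background = np.sort(rng.uniform(0, 62, 600))      # ~10 Hz background train
evoked = pulses + 0.001 + 0.0003 * rng.standard_normal(len(pulses))
active_spikes = np.sort(np.concatenate([background, evoked]))

print("active:", np.round(psth(active_spikes, pulses), 2))
print("sham  :", np.round(psth(background, pulses), 2))
```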
Abstract:
This communication is part of a research project that seeks to understand how mathematics teachers can integrate the use of technological materials for the benefit of student learning. The project focuses essentially on the electronic materials that accompany school textbooks: CD-ROMs, eBooks, portals, films and sets of other activities that call for the use of the computer. It seeks to understand the role these materials play in the teaching and learning process, in particular how teachers appropriate these materials and the use they make of them in the classroom. This communication presents a brief theoretical framing of the topic under study, indicating the main options adopted by the authors.
Abstract:
This work focuses on the teaching and learning of the normal distribution in an introductory statistics course at the university level, and is grounded in a theoretical framework that posits the institutional and personal meaning of mathematical objects. In particular, it describes the design of a teaching experience for the normal distribution supported by the use of the computer, and analyzes the progress, difficulties and errors that students display during that experience. The study pays special attention to everything the introduction of the computer implies for the teaching of statistics. We aim to provide valid information on the teaching and learning of statistics in university courses that can be completed and extended in future research.
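For context, a minimal sketch of the sort of computer-supported activity such a course might use: simulate normally distributed data and compare empirical frequencies with their theoretical values. The paper's own software and data are not described in the abstract, so everything here is illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma = 100.0, 15.0                  # an IQ-like scale (illustrative)
sample = rng.normal(mu, sigma, 10_000)

# Empirical vs theoretical probability of falling within one sigma of the mean
within = np.mean(np.abs(sample - mu) <= sigma)
print(f"empirical P(|X - mu| <= sigma) = {within:.3f}  (theory: ~0.683)")

# Empirical vs theoretical density near the mean, using a narrow window
window = 1.0
dens = np.mean(np.abs(sample - mu) <= window / 2) / window
pdf_at_mu = 1.0 / (sigma * np.sqrt(2 * np.pi))
print(f"empirical density at mu = {dens:.4f}  (theory: {pdf_at_mu:.4f})")
```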
Abstract:
Through several experiences, simple and easy to carry out in the classroom, students will be led to recognize how two quantities vary, directly and inversely, so that they manage to characterize them; then, with the data obtained from the practice and with the help of computer programs (Excel, Geogebra and TI-NspireCas), the trend of the data will be found, bringing the students closer to the concept of mathematical modelling.
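A sketch of the same fitting step in Python (the classroom work described uses Excel, Geogebra and TI-Nspire CAS; the measurements below are invented for illustration): fit y = kx for direct variation and y = k/x for inverse variation by least squares, then compare which trend describes each data set.

```python
import numpy as np

# Hypothetical classroom measurements (all numbers illustrative)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y_direct  = np.array([2.1, 3.9, 6.2, 8.0, 9.8])   # roughly y = 2x
y_inverse = np.array([6.1, 2.9, 2.1, 1.4, 1.2])   # roughly y = 6/x

# Direct variation y = kx: least-squares slope through the origin
k_direct = np.sum(x * y_direct) / np.sum(x ** 2)

# Inverse variation y = k/x: fit k against u = 1/x the same way
u = 1.0 / x
k_inverse = np.sum(u * y_inverse) / np.sum(u ** 2)

print(f"direct:  y = {k_direct:.2f} * x")
print(f"inverse: y = {k_inverse:.2f} / x")
```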
Abstract:
Three paradigms for distributed-memory parallel computation that free the application programmer from the details of message passing are compared for an archetypal structured scientific computation -- a nonlinear, structured-grid partial differential equation boundary value problem -- using the same algorithm on the same hardware. All of the paradigms -- parallel languages represented by the Portland Group's HPF, (semi-)automated serial-to-parallel source-to-source translation represented by CAPTools from the University of Greenwich, and parallel libraries represented by Argonne's PETSc -- are found to be easy to use for this problem class, and all are reasonably effective in exploiting concurrency after a short learning curve. The level of involvement required of the application programmer under any paradigm includes specification of the data partitioning, corresponding to a geometrically simple decomposition of the domain of the PDE. Programming in SPMD style for the PETSc library requires writing only the routines that discretize the PDE and its Jacobian, managing subdomain-to-processor mappings (affine global-to-local index mappings), and interfacing to library solver routines. Programming for HPF requires a complete sequential implementation of the same algorithm as a starting point, introduction of concurrency through subdomain blocking (a task similar to the index mapping), and modest experimentation with rewriting loops to elucidate to the compiler the latent concurrency. Programming with CAPTools involves feeding the same sequential implementation to the CAPTools interactive parallelization system, and guiding the source-to-source code transformation by responding to various queries about quantities knowable only at runtime. Results representative of "the state of the practice" for a scaled sequence of structured grid problems are given on three of the most important contemporary high-performance platforms: the IBM SP, the SGI Origin 2000, and the Cray T3E.
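To illustrate the index bookkeeping that the PETSc-style SPMD approach asks of the programmer, here is a minimal 1-D sketch of subdomain blocking with an affine global-to-local index mapping. It is a toy in Python with names of our choosing, not PETSc's actual API.

```python
# Affine global-to-local index mapping for a 1-D block decomposition of a
# structured grid over p processes (a sketch of the bookkeeping the abstract
# mentions; real codes such as PETSc hide this behind library calls).

def block_range(n_global, p, rank):
    """Contiguous block of global indices owned by `rank` (0..p-1)."""
    base, extra = divmod(n_global, p)
    start = rank * base + min(rank, extra)
    size = base + (1 if rank < extra else 0)
    return start, size

def global_to_local(g, start, size, ghost=1):
    """Map a global index to a local index including ghost cells, or None."""
    loc = g - start + ghost
    if 0 <= loc < size + 2 * ghost:
        return loc
    return None

n, p = 10, 3
for rank in range(p):
    start, size = block_range(n, p, rank)
    owned = list(range(start, start + size))
    print(f"rank {rank}: owns globals {owned}, "
          f"global {start} -> local {global_to_local(start, start, size)}")
```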
Abstract:
There have been few genuine success stories about industrial use of formal methods. Perhaps the best known and most celebrated is the use of Z by IBM (in collaboration with Oxford University's Programming Research Group) during the development of CICS/ESA (version 3.1). This work was rewarded with the prestigious Queen's Award for Technological Achievement in 1992 and is especially notable for two reasons: 1) because it is a commercial, rather than safety- or security-critical, system and 2) because the claims made about the effectiveness of Z are quantitative as well as qualitative. The most widely publicized claims are: less than half the normal number of customer-reported errors and a 9% savings in the total development costs of the release. This paper provides an independent assessment of the effectiveness of using Z on CICS based on the set of public domain documents. Using this evidence, we believe that the case study was important and valuable, but that the quantitative claims have not been substantiated. The intellectual arguments and rationale for formal methods are attractive, but their widespread commercial use is ultimately dependent upon more convincing quantitative demonstrations of effectiveness. Despite the pioneering efforts of IBM and PRG, there is still a need for rigorous, measurement-based case studies to assess when and how the methods are most effective. We describe how future similar case studies could be improved so that the results are more rigorous and conclusive.
Abstract:
The main purpose of this paper is to provide the core description of the modelling exercise within the Shelf Edge Advection Mortality And Recruitment (SEAMAR) programme. An individual-based model (IBM) was developed for the prediction of year-to-year survival of the early life-history stages of mackerel (Scomber scombrus) in the eastern North Atlantic. The IBM is one of two components of the model system. The first component is a circulation model to provide physical input data for the IBM. The circulation model is a geographical variant of the HAMburg Shelf Ocean Model (HAMSOM). The second component is the IBM, which is an i-space configuration model in which large numbers of individuals are followed as discrete entities to simulate the transport, growth and mortality of mackerel eggs, larvae and post-larvae. Larval and post-larval growth is modelled as a function of length, temperature and food distribution; mortality is modelled as a function of length and absolute growth rate. Each particle is considered as a super-individual representing 10⁶ eggs at the outset of the simulation, with this number then declining according to the mortality function. Simulations were carried out for the years 1998-2000. Results showed concentrations of particles at Porcupine Bank and the adjacent Irish shelf, along the Celtic Sea shelf-edge, and in the southern Bay of Biscay. High survival was observed only at Porcupine and the adjacent shelf areas, and, more patchily, around the coastal margin of Biscay. The low survival along the shelf-edge of the Celtic Sea was due to the consistently low estimates of food availability in that area.
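A minimal sketch of the super-individual scheme described above. The actual SEAMAR growth and mortality functions are not given in the abstract, so the functional forms and coefficients below are placeholders.

```python
import numpy as np

# Each particle starts as 1e6 eggs; its count declines with a length- and
# growth-dependent mortality while its length grows with temperature and
# food. All forms and coefficients are illustrative, not SEAMAR's.

def growth_rate(length_mm, temp_c, food):
    """Daily length increment (mm/day), rising with temperature and food."""
    return 0.05 * food * (1.0 + 0.04 * (temp_c - 10.0))

def mortality_rate(length_mm, growth_mm_day):
    """Daily mortality, decreasing with length and with faster growth."""
    return max(0.02, 0.5 / length_mm - 0.2 * growth_mm_day)

n_particles = 5
length = np.full(n_particles, 3.0)        # mm, post-hatch
count = np.full(n_particles, 1e6)         # eggs per super-individual
temp, food = 12.0, 1.0                    # constant environment for the sketch

for day in range(60):                     # ~2 months of drift
    g = np.array([growth_rate(L, temp, food) for L in length])
    m = np.array([mortality_rate(L, gi) for L, gi in zip(length, g)])
    length += g
    count *= np.exp(-m)                   # exponential decline per mortality

print("final lengths (mm):", np.round(length, 1))
print("survivors per particle:", count.astype(int))
```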
Abstract:
An individual-based model (IBM) for the simulation of year-to-year survival during the early life-history stages of the north-east Atlantic stock of mackerel (Scomber scombrus) was developed within the EU funded Shelf-Edge Advection, Mortality and Recruitment (SEAMAR) programme. The IBM included transport, growth and survival and was used to track the passive movement of mackerel eggs, larvae and post-larvae and determine their distribution and abundance after approximately 2 months of drift. One of the main outputs from the IBM, namely the distributions and numbers of surviving post-larvae, is compared with field data on recruit (age-0/age-1 juvenile) distribution and abundance for the years 1998, 1999 and 2000. The juvenile distributions show more inter-annual and spatial variability than the modelled distributions of survivors; this may be due to the restriction of using the same initial egg distribution for all three years of simulation. The IBM simulations indicate two main recruitment areas for the north-east Atlantic stock of mackerel, these being Porcupine Bank and the south-eastern Bay of Biscay. These areas correspond to areas of high juvenile catches, although the juveniles generally have a more widespread distribution than the model simulations. The best agreement between modelled data and field data for distribution (juveniles and model survivors) is for the year 1998. The juvenile catches in different representative nursery areas are totalled to give a field abundance index (FAI). This index is compared with a model survivor index (MSI), which is calculated from the total of survivors for the whole spawning season. The MSI compares favourably with the FAI for 1998 and 1999 but not for 2000; in that year, juvenile catches dropped sharply compared with the previous years but there was no equivalent drop in modelled survivors.
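The two indices are simple totals, so a small sketch makes the comparison concrete. All catch and survivor numbers below are invented; only the qualitative pattern (a sharp drop in catches in 2000 with no matching drop in modelled survivors) follows the abstract.

```python
import numpy as np

years = [1998, 1999, 2000]
juvenile_catches = {            # catches per nursery area, per year (made up)
    1998: [120, 80, 40],
    1999: [100, 60, 30],
    2000: [20, 15, 10],         # sharp drop, as the abstract reports
}
model_survivors = {1998: 2.1e9, 1999: 1.7e9, 2000: 1.8e9}  # made up

fai = np.array([sum(juvenile_catches[y]) for y in years])  # field index
msi = np.array([model_survivors[y] for y in years])        # model index

# Normalize each index to its 1998 value so the two are comparable
print("year  FAI(rel)  MSI(rel)")
for y, f, m in zip(years, fai / fai[0], msi / msi[0]):
    print(f"{y}  {f:7.2f}  {m:8.2f}")
```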
Abstract:
Microscopic plastic debris, termed “microplastics”, are of increasing environmental concern. Recent studies have demonstrated that a range of zooplankton, including copepods, can ingest microplastics. Copepods are a globally abundant class of zooplankton that form a key trophic link between primary producers and higher trophic marine organisms. Here we demonstrate that ingestion of microplastics can significantly alter the feeding capacity of the pelagic copepod Calanus helgolandicus. Exposed to 20 μm polystyrene beads (75 microplastics mL⁻¹) and cultured algae (≈250 μg C L⁻¹) for 24 h, C. helgolandicus ingested 11% fewer algal cells (P = 0.33) and 40% less carbon biomass (P < 0.01). There was a net downward shift in the mean size of algal prey consumed (P < 0.001), with a 3.6-fold increase in ingestion rate for the smallest size class of algal prey (11.6–12.6 μm), suggestive of postcapture or postingestion rejection. Prolonged exposure to polystyrene microplastics significantly decreased reproductive output, but there were no significant differences in egg production rates, respiration or survival. We constructed a conceptual energetic (carbon) budget showing that microplastic-exposed copepods suffer energetic depletion over time. We conclude that microplastics impede feeding in copepods, which over time could lead to sustained reductions in ingested carbon biomass.
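A back-of-envelope version of such a carbon budget is sketched below. Only the 40% reduction in ingested carbon comes from the abstract; the baseline ingestion and metabolic cost are invented for illustration.

```python
# Toy carbon budget of the kind the abstract describes. Only the 40%
# reduction in ingested carbon is taken from the abstract; the baseline
# ingestion and metabolic cost below are hypothetical.

baseline_ingest = 10.0          # ug C per copepod per day (hypothetical)
metabolic_cost = 6.0            # ug C per day for respiration etc. (hypothetical)

control_surplus = baseline_ingest - metabolic_cost
exposed_surplus = 0.60 * baseline_ingest - metabolic_cost  # 40% less ingested

print(f"control surplus: {control_surplus:+.1f} ug C/day")
print(f"exposed surplus: {exposed_surplus:+.1f} ug C/day")
# A persistently smaller (or negative) surplus is the energetic depletion
# over time that the budget is meant to illustrate.
```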
Abstract:
2015 will be remembered as the great year of change in the media, especially for the press. Throughout the year a bitter struggle was waged between publishers and Google News, joined by Yahoo News, Facebook, LinkedIn, Twitter, etc., to determine who should pay for content, who will create it, who will distribute it, and who will read it and how. New pacts for new readers. New agreements to refloat the press economically. And all of this under the umbrella of the Intellectual Property Law in Spain and similar laws that will be enacted or adapted across Europe. A study and analysis is offered of the situation prior to the establishment of that law and of the consequences of its application.
Abstract:
Information is the primary element in an institution, whatever its character. It flows from every department of the institution, and it must be gathered, ordered, synthesized, processed and, last and most important, distributed. It cannot be withheld or hidden; it must be channelled so that all members of the organization know it, understand it, take it on and make it their own. This is how organizations work well: knowing what they work on, their objectives, their limits, their progress and their achievements. Communication offices are the engine of documentation. They manage content, create data and document archives, and disseminate information, not only among the different parts of the organization but also among the various strata of society, projecting outward the image of the institution they work for.
Abstract:
Aim: To study the relation between visual impairment and the ability to care for oneself or a dependant in older people with age-related macular degeneration (AMD). Method: Cross-sectional study of older people with visual impairment due to AMD in a specialised retinal service clinic. 199 subjects underwent visual function assessment (fully corrected distance and near acuity and contrast sensitivity in both eyes), followed by completion of a package of questionnaires dealing with general health status (SF36), visual functioning (Daily Living Tasks Dependent on Vision, DLTV) and ability to care for self or provide care to others. The outcome measure was self-reported ability to care for self and others. Three levels of self-reported ability to care were identified: inability to care for self (level 1), ability to care for self but not others (level 2), and ability to care for self and others (level 3). Results: People who reported good general health status and visual functioning (that is, had high scores on the SF36 and DLTV) were more likely to state that they were able to care for self and others. Similarly, people with good vision in the better seeing eye were more likely to report ability to care for self and others. People with a distance visual acuity (DVA) worse than 0.4 logMAR (Snellen 6/15) had less than a 50% probability of assigning themselves to care level 3, and those with DVA worse than 1.0 logMAR (Snellen 6/60) had a greater than 50% probability of assigning themselves to care level 1. Regression analyses with level of care as the dependent variable and demographic factors, DLTV subscales, and SF36 dimensions as the explanatory variables confirmed that DLTV subscale 1 was the most important variable in the transition from care level 3 to care level 2. The regression analyses also confirmed that DLTV subscale 2 was the most important in the transition from care level 3 to care level 1. Conclusions: Ability to care for self and dependants has a strong relation with self-reported visual functioning and quality of life, and is adversely influenced by visual impairment. The acuity at which the balance of probability shifts in the direction of diminished ability to care for self or others is lower than the level set by social care agencies for provision of support. These findings have implications for those involved with visual rehabilitation and for studies of the cost-effectiveness of interventions in AMD.
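The two reported 50% crossing points suggest a simple logistic picture of how the probability of each care level moves with acuity. The sketch below encodes only those two crossing points from the abstract; the slope is an assumption and the model is ours, not the authors' regression.

```python
import numpy as np

# Logistic sketch of the abstract's two reported 50% crossings:
# P(can care for self and others) falls through 0.5 near 0.4 logMAR, and
# P(cannot care for self) rises through 0.5 near 1.0 logMAR. Only the
# crossing points come from the abstract; the slope is assumed.

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

slope = 6.0                                   # assumed steepness
for dva in [0.0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2]:
    p_level3 = logistic(-slope * (dva - 0.4)) # ability to care for self+others
    p_level1 = logistic(slope * (dva - 1.0))  # inability to care for self
    print(f"DVA {dva:.1f} logMAR: P(level 3)={p_level3:.2f}, "
          f"P(level 1)={p_level1:.2f}")
```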