774 results for Could computing
Abstract:
Deep in the blue sea lives Swam, a fish who is granted wishes, and every wish comes true. He wishes for a castle, a car and a horse; he wishes to fly and to ski. But one day he wishes to be just like the other fish.
Abstract:
A computer science textbook approved by OCR (Oxford Cambridge and RSA Examinations) for the A-level GCE (General Certificate of Education) specification (upper secondary education). It is divided into three sections covering theory, exam exercises (with advice on how to tackle them and how to prepare for the test), and examination technique. The topics covered are: fundamentals of computing, programming techniques and logical methods, and advanced computation theory.
Abstract:
Abstract based on that of the publication.
Abstract:
Rendering realistic animations is known to be computationally expensive when physically based global illumination methods are used to improve illumination detail. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced into the computation of the animated sequence to select important frames. These are fully computed and used as a basis for interpolating the whole sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique can be an interesting alternative to deterministic methods for computing non-interactive radiosity animations of moderately complex scenes.
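As a rough sketch of the interpolation idea (the data layout and function below are hypothetical illustrations, not the paper's implementation), the radiosity of every patch at an intermediate frame can be blended linearly from the two fully computed key frames that bracket it:

    import numpy as np

    def interpolate_radiosity(key_times, key_frames, t):
        # Per-patch linear blend between the two key frames bracketing
        # time t. key_frames[i] holds the fully computed radiosity of
        # every patch at key_times[i] (assumed sorted).
        i = np.clip(np.searchsorted(key_times, t), 1, len(key_times) - 1)
        t0, t1 = key_times[i - 1], key_times[i]
        w = (t - t0) / (t1 - t0)                  # blend weight in [0, 1]
        return (1.0 - w) * key_frames[i - 1] + w * key_frames[i]

    # Example: three key frames of a 4-patch scene, queried at t = 0.25.
    times = np.array([0.0, 0.5, 1.0])
    frames = np.array([[1.0, 0.2, 0.4, 0.0],
                       [0.8, 0.3, 0.5, 0.1],
                       [0.6, 0.4, 0.6, 0.2]])
    print(interpolate_radiosity(times, frames, 0.25))

Because the interpolated quantity is the view-independent per-patch radiosity rather than an image, any animated camera can render the blended frames afterwards, as the abstract notes.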
Abstract:
We present algorithms for computing approximate distance functions and shortest paths from a generalized source (a point, a segment, a polygonal chain or a polygonal region) on a weighted non-convex polyhedral surface on which obstacles (represented by polygonal chains or polygons) are allowed. We also describe an algorithm for discretizing distance functions using graphics hardware capabilities. Finally, we present algorithms for computing discrete k-order Voronoi diagrams.
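A brute-force CPU stand-in for such a discretization (the flat grid, names and layout below are assumptions for illustration, not the paper's surface algorithm) computes, for every grid cell, the nearest weighted source; the per-cell argmin yields a first-order discrete Voronoi diagram, and sorting the distances yields the k-order version:

    import numpy as np

    def discrete_distance_field(grid_shape, sources, weights=None):
        # For every cell of a regular grid, the weighted distance to and
        # the index of the nearest source point.
        ys, xs = np.mgrid[0:grid_shape[0], 0:grid_shape[1]]
        cells = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
        src = np.asarray(sources, dtype=float)              # (k, 2) points
        w = np.ones(len(src)) if weights is None else np.asarray(weights)
        d = np.linalg.norm(cells[:, None, :] - src[None, :, :], axis=2) * w
        # np.argsort(d, axis=1)[:, :k] would give a discrete k-order
        # Voronoi diagram; argmin is the first-order special case.
        return (d.min(axis=1).reshape(grid_shape),
                d.argmin(axis=1).reshape(grid_shape))

    dist, voronoi = discrete_distance_field((64, 64), [(10, 10), (50, 20)])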
Abstract:
Different procedures for obtaining atom-condensed Fukui functions are described. It is shown how the resulting values may differ depending on the exact approach to atom-condensed Fukui functions. The condensed Fukui function can be computed using either the fragment-of-molecular-response approach or the response-of-molecular-fragment approach. The two approaches are non-equivalent; only the latter corresponds, in general, to a population-difference expression. The Mulliken scheme yields the same values under both approaches but has some computational drawbacks. The different resulting expressions are tested on a wide set of molecules. In practice one must make seemingly arbitrary choices about how to compute condensed Fukui functions, which suggests questioning the role of these indicators in conceptual density-functional theory.
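For reference, the population-difference expression to which the response-of-molecular-fragment approach reduces can be written, in standard notation (not reproduced from the paper), as

    \[
      f_A^{+} = N_A(N_0 + 1) - N_A(N_0), \qquad
      f_A^{-} = N_A(N_0) - N_A(N_0 - 1),
    \]

where \(N_A(N)\) is the electron population condensed on atom \(A\) in the \(N\)-electron system and \(N_0\) is the number of electrons in the neutral molecule. The "arbitrary choices" the abstract mentions enter through how \(N_A\) itself is partitioned among atoms.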
Abstract:
An analytic method for evaluating the nuclear contributions to the electrical properties of polyatomic molecules is presented. Such contributions account for the changes induced by an electric field in the equilibrium geometry (nuclear relaxation contribution) and the vibrational motion (vibrational contribution) of a molecular system. Expressions for computing the nuclear contributions have been derived from a power-series expansion of the potential energy. These contributions to the electrical properties are given in terms of energy derivatives with respect to normal coordinates, electric field intensity, or both. Only one calculation of these derivatives at the field-free equilibrium geometry is required. To illustrate the efficiency of the analytical evaluation of electrical properties (the so-called AEEP method), results for water and pyridine at the SCF/TZ2P and MP2/TZ2P levels of theory are reported. The results are compared with previous theoretical calculations and with experimental values.
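For orientation, the field expansion on which such derivative-based definitions rest is, in the usual convention (an assumption here, not quoted from the paper),

    \[
      E(\mathbf{F}) = E_0
        - \sum_i \mu_i F_i
        - \tfrac{1}{2} \sum_{ij} \alpha_{ij} F_i F_j
        - \tfrac{1}{6} \sum_{ijk} \beta_{ijk} F_i F_j F_k - \cdots
    \]

so the dipole moment \(\mu\), polarizability \(\alpha\) and hyperpolarizability \(\beta\) are field derivatives of the energy. The nuclear relaxation and vibrational contributions arise from letting the geometry and vibrational motion respond to the field when taking these derivatives, rather than freezing them at the field-free equilibrium.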
Abstract:
The human ability to perceive depth is a puzzle. We perceive three-dimensional spatial information quickly and efficiently by using the binocular stereopsis of our eyes and, more importantly, the knowledge of common objects that we acquire through living. Modelling the behaviour of our brain is, for now, a fiction; that is why the huge problem of 3D perception and, further, interpretation is split into a sequence of easier problems. A great deal of research in robot vision aims to obtain 3D information about the surrounding scene. Most of it models human stereopsis by using two cameras as if they were two eyes; this method is known as stereo vision. It has been widely studied in the past, is being studied at present, and will surely receive much more work in the future, which allows us to affirm that it is one of the most interesting topics in computer vision.

The stereo vision principle is based on obtaining the three-dimensional position of an object point from the positions of its projections in the two camera image planes. However, before 3D information can be inferred, the mathematical models of both cameras have to be known. This step is known as camera calibration and is described at length in the thesis. Perhaps the most important problem in stereo vision is determining the pairs of homologous points in the two images, known as the correspondence problem; it is also one of the hardest to solve and is currently investigated by many researchers. Epipolar geometry allows us to reduce the correspondence problem, and an approach to it is described in the thesis. Nevertheless, it does not solve the problem entirely, as many considerations remain: for example, some points have no correspondence because of surface occlusion or simply because they project outside the other camera's field of view.

The thesis focuses on structured light, one of the techniques most frequently used to reduce the problems of stereo vision. Structured light is based on the relationship between a projected light pattern and an image sensor: the deformation between the pattern projected onto the scene and the one captured by the camera makes it possible to obtain three-dimensional information about the illuminated scene. The technique has been widely used in applications such as 3D object reconstruction, robot navigation and quality control. Although projecting regular patterns solves the problem of points without a match, it does not solve the problem of multiple matches, which forces the use of computationally hard algorithms to search for the correct ones. In recent years another structured light technique has grown in importance, based on codifying the light projected onto the scene so as to obtain a unique match: each token of light is imaged by the camera, and its label must be read (the pattern decoded) in order to solve the correspondence problem. The advantages and disadvantages of stereo vision versus structured light are discussed, together with a survey of coded structured light.

The work carried out in the frame of this thesis has produced a new coded structured light pattern that solves the correspondence problem uniquely and robustly. Unique, because each token of light is coded by a different word, which removes the problem of multiple matching. Robust, because the pattern is coded using the position of each token of light with respect to both coordinate axes. Algorithms and experimental results are included in the thesis; the reader will find examples of 3D measurement of static objects and of the more complicated measurement of moving objects. The technique can be used in both cases, since the pattern is coded in a single projection shot, and it therefore suits several applications of robot vision.

Our interest is focused on the mathematical study of the camera and pattern-projector models, on how these models can be obtained by calibration, and on how they can be used to obtain three-dimensional information from two corresponding points; a sketch of this triangulation step is given below. Furthermore, we have studied structured light and coded structured light, and we have presented a new coded structured light pattern. However, this thesis starts from the assumption that the corresponding points can be well segmented from the captured image. Computer vision is a huge problem, and much work is being done at all levels of human vision modelling, starting from (a) image acquisition; (b) image enhancement, filtering and processing; and (c) image segmentation, which involves thresholding, thinning, contour detection, texture and colour analysis, and so on. The interest of this thesis starts at the next step, usually known as depth perception or 3D measurement.
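As a minimal sketch of that triangulation step (assuming calibrated cameras with known 3x4 projection matrices; this is the standard linear DLT method, not necessarily the thesis's own formulation), a 3D point can be recovered from one pair of corresponding image points as follows:

    import numpy as np

    def triangulate(P1, P2, x1, x2):
        # Linear (DLT) triangulation: recover the 3D point X from its
        # projections x1, x2 in two cameras with projection matrices
        # P1, P2. Each image point contributes two linear constraints.
        A = np.array([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)     # null vector of A, least squares
        X = Vt[-1]
        return X[:3] / X[3]             # dehomogenise

    # Example: two unit-focal cameras, the second shifted along x.
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
    print(triangulate(P1, P2, (0.0, 0.0), (-0.2, 0.0)))   # ~ [0, 0, 5]

With coded structured light, one of the two "cameras" is the calibrated pattern projector, and the decoded token label supplies the correspondence that this function takes as input.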
Abstract:
The dynamics of silence and remembrance in Australian writer Lily Brett’s autobiographical fiction Things Could Be Worse reflects the crisis of memory and understanding experienced by both first- and second-generation Holocaust survivors within the diasporic space of contemporary Australia. It leads to issues of handling traumatic and transgenerational memory, the latter also known as postmemory (M. Hirsch), in the long aftermath of atrocities, and problematises the role of forgetting in shielding displaced identities against total dissolution of the self. This paper explores the mechanisms of remembrance and forgetting in L. Brett’s narrative by focusing mainly on two female characters, mother and daughter, whose coming to terms with (the necessary) silence, on the one hand, and articulated memories, on the other, reflects different modes of comprehending and eventually coping with individual trauma. By differentiating between several types of silence encountered in Brett’s prose (that of the voiceless victims, of survivors and of their offspring, respectively), I argue that silence can equally voice and hush traumatic experience, and that it is never empty, but invested with individual and collective meaning. Essentially, I contend that besides the (self-)damaging effects of silence, there are also beneficial consequences of it, in that it plays a crucial role in emplacing the displaced, rebuilding their shattered self, and contributing to their reintegration, survival and even partial healing.
Abstract:
The efficient and timely performance of a company's activities demands constant renewal of its infrastructure, ongoing staff training, research into new technologies, and an ever-larger budget allocation for its ICT area. Several management models have tried to meet these needs, among them hosting, outsourcing, leasing, professional services and specialised consultancy. The cloud computing management model, in its various options, has lately been positioning itself as the most viable and fastest solution to implement. This project therefore focuses on the study of this model as an alternative to the traditional model for managing ICT services, taking as its reference the current state of the technological infrastructure of Corporación ADC&HAS Management Ecuador S.A. The aim is not to justify the cloud as a definitive solution, but to propose this model as a useful alternative given the Corporation's technological reality and, based on its properties, to conclude that it was the model that best fitted the institutional strategy in organisational, technological and financial terms, at least for the next five years. The first two chapters review some of the conceptual foundations of ICT and mention certain factors that shaped its evolution. The third chapter describes the Corporation; the fourth chapter applies the concepts of the earlier chapters, reinforced with the experiences published in Computerworld magazine (2010 to the present), which made it possible to evaluate the benefits of the two management models and the reasons for implementing or retaining them.
Abstract:
Nowadays the use of ICT has become indispensable for any company, whatever its size or industry, to the point that simply using it is no longer a competitive advantage; the key is to determine which technology is best for the company. There are online services on the market that match or exceed the standards of paid technologies that must otherwise be installed on the company's premises, consuming resources and in some cases underusing them; cloud computing services offer the flexibility and capacity a company requires to meet the needs of growth and continuous change that today's market demands. Some companies do not invest in these technologies because they do not know they can access them at low cost, in some cases without paying a single cent, and so they lose competitiveness and productivity through sheer lack of awareness. The use of SaaS and IaaS services affects cash flow positively compared with investment in in-house infrastructure, above all in companies that are just starting up or are undergoing technological renewal. Following this research, the use of cloud computing services is recommended, but they are clearly not useful for every company: their usefulness depends on the criticality of the applications or infrastructure, the company's current stage and size, the choice between public and private cloud (at times, virtualising is also rationalising), and a comprehensive view of security. Finally, tabulating the data obtained in the second survey showed that the launch of EcuFlow's cloud service increased demand for this tool in the market.
Abstract:
This Policy Contribution assesses the broad obstacles hampering ICT-led growth in Europe and identifies the main areas in which policy could unlock the greatest value. We review estimates of the value that could be generated through take-up of various technologies and carry out a broad matching with policy areas. According to the literature survey and the collected estimates, the areas in which the right policies could unlock the greatest ICT-led growth are product and labour market regulations and the European Single Market. These areas should be reformed to make European markets more flexible and competitive. This would promote wider adoption of modern data-driven organisational and management practices, thereby helping to close the productivity gap between the United States and the European Union. Gains could also be made in the areas of privacy, data security, intellectual property and liability pertaining to the digital economy, especially cloud computing, and in next-generation network infrastructure investment. Standardisation and spectrum allocation issues are found to be important, though to a lesser degree. Strong complementarities between the analysed technologies suggest, however, that policymakers need to deal with all of the identified obstacles in order to fully realise the potential of ICT to spur long-term growth beyond the partial gains that we report.
Abstract:
Spain, needing a bailout for its banks, was granted a vague promise by Eurozone leaders of up to €100 billion. The details remain obscure, yet they matter enormously. This column argues that the so-called ‘subordination effect’ of fresh official lending could put Spain on the slippery road to ruin. It argues that if sovereign bonds must be bought, this should be done in the secondary market, where the official lender would be on an equal footing with private investors and would thus avoid the subordination trap.