875 results for big data storage


Relevance:

80.00%

Publisher:

Abstract:

This thesis concerns the broad trend towards the digital transformation of business processes. This evolution, which involves modern information technologies including Cloud Computing, Big Data Analytics and Mobile tools, is not without pitfalls, which must be identified and addressed appropriately case by case. The thesis focuses on a specific company case, that of the well-known Bologna-based firm FAAC spa, and on its purchasing function. Within procurement, the company needs to restructure and digitalise the request-for-quotation (RdO) process towards its suppliers, so that the purchasing function can concentrate on implementing corporate strategy rather than on day-to-day operations. The thesis therefore carries out a project to implement a dedicated e-procurement platform for managing RdOs. First, several project management examples from the literature are analysed, and a model for managing this specific project is defined. The work then comprises: a phase defining the company's continuity objectives, an As-Is analysis of the processes, the definition of the specific project objectives and of the KPIs for evaluating performance, the design of the software platform, and finally some considerations on the risks of, and alternatives to, the implementation.

Relevance:

80.00%

Publisher:

Abstract:

In this short article, I analyse a research workshop on the theme of "smart cities" which I attended. In this analysis, I show how the concepts of "smart city" and "big data" are constructed differently by the two groups of people present at the event, whom I classify as "optimizers" and "regulators". These different ways of seeing the devices in question lead to a series of controversies. First, I frame the way some of these controversies appear within the theoretical framework of the Social Construction of Technology (SCOT). I then argue that the controversies that arose during the event were not resolved, and are unlikely to be in the near future, unless an analytical model such as Actor-Network Theory is adopted, one that gives a voice to a group ignored in those discussions: the devices employed in constructing the concept of the "smart city".

Relevance:

80.00%

Publisher:

Abstract:

The sociologist who heads one of the major big data analysis centres in Brazil says that politicians will only regain legitimacy when they learn that a "like" is serious business. Detecting trends and taking the pulse of social aspirations has gained volume and real-time speed in the ocean of information that is big data, the name given to the gigantic quantity of data produced daily on the internet. It is from this inexhaustible mine that the Rio-born sociologist Marco Aurelio Ruediger, of Fundação Getulio Vargas, feeds the Diretoria de Análise de Políticas Públicas, a centre that studies how Brazilians view the state apparatus and the branches of the Republic.

Relevance:

80.00%

Publisher:

Abstract:

"This report summarizes the current status of the Long-Term Pavement Performance (LTPP) program and its major activities -- data collection, data storage, data analysis, and product development. It describes the work that will be needed beyond 2009 to realize the full potential of the world's most comprehensive pavement performance database and the benefits that will be accrued by capitalizing on the investment that has been made"--p. [2] of cover.

Relevance:

80.00%

Publisher:

Abstract:

Grid computing is an advanced technique for collaboratively solving complicated scientific problems using geographically and organisationally dispersed computational, data storage and other resources. Applying grid computing could provide significant benefits to all aspects of power system work that involve computing. Based on our previous research, this paper presents a novel grid computing approach for probabilistic small signal stability (PSSS) analysis in electric power systems with uncertainties. A prototype computing grid has been successfully implemented in our research lab to carry out PSSS analysis on two benchmark systems. Compared with traditional computing techniques, grid computing has given better performance for PSSS analysis in terms of computing capacity, speed, accuracy and stability. In addition, a computing grid framework for power system analysis is proposed based on this study.
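The abstract does not spell out the PSSS computation itself. As background, here is a minimal Monte Carlo sketch of probabilistic small-signal stability: sample the uncertain parameters, form the linearized state matrix, and check that every eigenvalue lies in the open left half-plane. The 2x2 `build_state_matrix` and the Gaussian parameter distribution are invented placeholders (a real study derives the state matrix from the power system's dynamic model); in a grid deployment, each node would evaluate its share of the samples.

```python
import numpy as np

def build_state_matrix(p):
    # Hypothetical linearized system matrix A(p) for an uncertain
    # parameter vector p; a real study would derive A from the
    # power system's differential-algebraic model.
    return np.array([[-1.0 + 0.1 * p[0],  2.0],
                     [-2.0,              -0.5 + 0.1 * p[1]]])

def monte_carlo_psss(n_samples=10_000, seed=0):
    rng = np.random.default_rng(seed)
    stable = 0
    for _ in range(n_samples):
        p = rng.normal(0.0, 1.0, size=2)            # sampled uncertain parameters
        eigvals = np.linalg.eigvals(build_state_matrix(p))
        # Small-signal stable iff every eigenvalue has negative real part.
        if np.all(eigvals.real < 0):
            stable += 1
    return stable / n_samples                       # estimated stability probability

print(monte_carlo_psss())
```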

Relevance:

80.00%

Publisher:

Abstract:

Information systems have developed to the stage that there is plenty of data available in most organisations, but there are still major problems in turning that data into information for management decision making. This thesis argues that the link between decision support information and transaction processing data should be through a common object model which reflects the real world of the organisation and encompasses the artefacts of the information system. The CORD (Collections, Objects, Roles and Domains) model is developed, which is richer in appropriate modelling abstractions than current object models. A flexible Object Prototyping tool based on a Semantic Data Storage Manager has been developed, which enables a variety of models to be stored and experimented with. A statistical summary table model, COST (Collections of Objects Statistical Table), has been developed within CORD and is shown to be adequate to meet the modelling needs of Decision Support and Executive Information Systems. The COST model is supported by a statistical table creator and editor, COSTed, which is also built on top of the Object Prototyper and uses the CORD model to manage its metadata.
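The COST model itself is not specified in the abstract. As a rough illustration of the general idea of a statistical summary table built over a collection of objects, here is a minimal sketch; the `Order` class, the grouping keys and the sum aggregate are invented for illustration and are not the CORD/COST notation.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Order:                      # hypothetical transaction-level object
    region: str
    product: str
    amount: float

def summary_table(objects, row_key, col_key, measure):
    """Cross-tabulate a collection of objects: one aggregate cell per
    (row, column) pair, in the spirit of a statistical summary table
    defined over an object model."""
    cells = defaultdict(list)
    for obj in objects:
        cells[(row_key(obj), col_key(obj))].append(measure(obj))
    return {k: sum(v) for k, v in cells.items()}

orders = [Order("North", "A", 10.0), Order("North", "B", 5.0),
          Order("South", "A", 7.5)]
print(summary_table(orders, lambda o: o.region,
                    lambda o: o.product, lambda o: o.amount))
```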

Relevance:

80.00%

Publisher:

Abstract:

The aim of the research project was to gain a complete and accurate accounting of the needs and deficiencies of materials selection and design data, with particular attention given to the feasibility of a computerised materials selection system that would include application analysis, property data and screening techniques. The project also investigates and integrates the three major aspects: materials resources, materials selection and materials recycling. Consideration of the materials resource base suggests that, though our discovery potential has increased, geologic availability is the ultimate determinant, and several metals may well become scarce at the same time, thus compounding the problem of substitution. With around 2 to 20 million units of engineering materials data, the use of a computer is the only logical answer for scientific selection of materials. The system developed at Aston is used for data storage, mathematical computation and output, and enables programs to be run in batch and interactive (on-line) mode. With modification, the program can also handle such variables as the quantity of mineral resources, the energy cost of materials, and the depletion and utilisation rates of strategic materials. The work also carries out an in-depth study of copper recycling in the U.K. and concludes that somewhere in the region of 2 million tonnes of copper is missing from the recycling cycle. It also sets out guidelines on product design and conservation policies from the recyclability point of view.
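As a rough sketch of the "screening techniques" such a system applies, the following filters a property database against minimum design requirements. The property names, values and thresholds are invented for illustration; the Aston system's actual schema is not described in the abstract.

```python
# Hypothetical property records: material name -> {property: value}
materials = {
    "AISI 304":  {"yield_MPa": 215, "density_kgm3": 8000, "cost_per_kg": 3.0},
    "Al 6061":   {"yield_MPa": 276, "density_kgm3": 2700, "cost_per_kg": 2.5},
    "Ti-6Al-4V": {"yield_MPa": 880, "density_kgm3": 4430, "cost_per_kg": 20.0},
}

def screen(db, min_yield, max_density, max_cost):
    """Keep only the materials meeting every design threshold."""
    return [name for name, p in db.items()
            if p["yield_MPa"] >= min_yield
            and p["density_kgm3"] <= max_density
            and p["cost_per_kg"] <= max_cost]

print(screen(materials, min_yield=250, max_density=5000, max_cost=25.0))
```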

Relevance:

80.00%

Publisher:

Abstract:

A survey of the existing state of the art of turbine blade manufacture highlights two operations that have not been automated: loading a turbine blade into an encapsulation die, and removing a machined blade from the encapsulation block. The automation of blade decapsulation has not been pursued. In order to develop a system to automate the loading of an encapsulation die, a prototype mechanical handling robot has been designed together with a computer controlled encapsulation die. The robot has been designed as a mechanical handling robot of cylindrical geometry, suitable for use in a circular work cell, and is the prototype for a production model to be called 'The Cybermate'. The prototype robot is mechanically complete, but due to unforeseen circumstances the robot control system is not available (the development of the control system did not form a part of this project); hence it has not been possible to fully test and assess the robot mechanical design, and robot loading of the encapsulation die has been simulated. The research work on the encapsulation die has focused on the development of computer controlled, hydraulically actuated location pins. Such pins compensate for the inherent positional inaccuracy of the loading robot and reproduce the dexterity of the human operator. Each pin comprises a miniature hydraulic cylinder controlled by a standard bidirectional flow control valve. Precision positional control is obtained by pulsing the valves under software control, with positional feedback from an 8-bit transducer. A test-rig comprising one hydraulic location pin together with an opposing spring loaded pin has demonstrated that such a pin arrangement can be controlled with a repeatability of +/- 0.00045'. The same test-rig has demonstrated that the pin arrangement can gauge, and compensate for, the dimensional error of the component held between the pins by offsetting the pin datum positions to allow for the component error; a gauging repeatability of +/- 0.00015' was demonstrated. This work has led to the design and manufacture of an encapsulation die comprising ten such pins and the associated computer software. All aspects of the control software except blade gauging and positional data storage have been demonstrated. Work is now required to achieve the accuracy of control demonstrated by the single pin test-rig with each of the ten pins in the encapsulation die, which would allow trials of the complete loading cycle to take place.
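The abstract describes positional control by pulsing a flow control valve against 8-bit transducer feedback. The sketch below simulates that loop under invented assumptions (travel range, displacement per pulse, pulse budget), since the thesis's actual timing and hardware interface are not given here.

```python
def quantize_8bit(position, travel=1.0):
    """Simulated 8-bit transducer: position in [0, travel] -> code 0..255."""
    code = round(position / travel * 255)
    return max(0, min(255, code))

def pulse_to_target(target_code, step=0.002, travel=1.0, max_pulses=2000):
    """Pulse the valve one step at a time until the 8-bit feedback
    matches the commanded code, mimicking software-pulsed control."""
    position = 0.0
    for pulses in range(max_pulses):
        feedback = quantize_8bit(position, travel)
        if feedback == target_code:
            return position, pulses
        # pulse the valve in the direction that reduces the error
        position += step if feedback < target_code else -step
    return position, max_pulses

pos, n = pulse_to_target(target_code=128)
print(f"settled at {pos:.4f} after {n} pulses")
```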

Relevance:

80.00%

Publisher:

Abstract:

We consider a variation of the prototype combinatorial optimization problem known as graph colouring. Our optimization goal is to colour the vertices of a graph with a fixed number of colours, so as to maximize the number of different colours present in the set of nearest neighbours of each given vertex. This problem, which we pictorially call palette-colouring, has recently been addressed as a basic example of a problem arising in the context of distributed data storage. Even though it has not been proved to be NP-complete, random search algorithms find the problem hard to solve. Heuristics based on a naive belief propagation algorithm are observed to work quite well in certain conditions. In this paper, we build on that result, working out the correct belief propagation algorithm, which needs to take into account the many-body nature of the constraints present in this problem. This method improves on the naive belief propagation approach at the cost of increased computational effort. We also investigate the emergence of a satisfiable-to-unsatisfiable 'phase transition' as a function of the vertex mean degree, for different ensembles of sparse random graphs in the large size ('thermodynamic') limit.
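To make the objective concrete, here is a minimal sketch: a scoring function counting, over all vertices, the distinct colours among each vertex's neighbours, plus a naive greedy local search. The toy graph and the search strategy are illustrative only; the paper's method is belief propagation, which is not reproduced here.

```python
import random

def palette_score(adj, colouring):
    """Sum over vertices of the number of distinct colours seen
    among that vertex's neighbours: the quantity to maximize."""
    return sum(len({colouring[u] for u in adj[v]}) for v in adj)

def greedy_palette(adj, n_colours, sweeps=50, seed=0):
    rng = random.Random(seed)
    colouring = {v: rng.randrange(n_colours) for v in adj}
    for _ in range(sweeps):
        for v in adj:
            # recolour v with whichever colour maximizes the global score
            colouring[v] = max(range(n_colours),
                               key=lambda c: palette_score(adj, {**colouring, v: c}))
    return colouring

# toy graph: a 4-cycle
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
col = greedy_palette(adj, n_colours=2)
print(col, palette_score(adj, col))
```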

Relevance:

80.00%

Publisher:

Abstract:

Mode-locked lasers emitting a train of femtosecond pulses called dissipative solitons are an enabling technology for metrology, high-resolution spectroscopy, fibre optic communications, nano-optics and many other fields of science and applications. Recently, the vector nature of dissipative solitons has been exploited to demonstrate mode-locked lasing with both locked and rapidly evolving states of polarisation. Here, for an erbium-doped fibre laser mode locked with carbon nanotubes, we present the first experimental and theoretical evidence of a new class of slowly evolving vector solitons characterized by a double-scroll chaotic polarisation attractor substantially different from the Lorenz, Rössler and Ikeda strange attractors. The underlying physics comprises a long time scale coherent coupling of two polarisation modes. The observed phenomena, apart from their fundamental interest, provide a basis for advances in secure communications, trapping and manipulation of atoms and nanoparticles, control of magnetisation in data storage devices and many other areas. © 2014 CIOMP. All rights reserved.
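For readers unfamiliar with polarisation attractors: the state of polarisation is conventionally tracked through the Stokes parameters of the two field components. The definitions below are standard optics conventions (not taken from this paper; sign conventions vary), with $u$ and $v$ the envelopes of the two polarisation modes:

```latex
\begin{align}
S_0 &= |u|^2 + |v|^2, & S_1 &= |u|^2 - |v|^2,\\
S_2 &= 2\,\operatorname{Re}(u v^{*}), & S_3 &= 2\,\operatorname{Im}(u v^{*}).
\end{align}
```

The normalised vector $(S_1, S_2, S_3)/S_0$ traces the polarisation trajectory on the Poincaré sphere; a double-scroll attractor appears there as two interlinked scrolls of this trajectory.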

Relevance:

80.00%

Publisher:

Abstract:

GraphChi is the first reported disk-based graph engine that can handle billion-scale graphs on a single PC efficiently. GraphChi is able to execute several advanced data mining, graph mining and machine learning algorithms on very large graphs. With its novel technique of parallel sliding windows (PSW) for loading subgraphs from disk to memory to update vertices and edges, it can achieve data processing performance close to, and even better than, that of mainstream distributed graph engines. The GraphChi authors note, however, that memory is not effectively utilized with large datasets, which leads to suboptimal computation performance. In this paper, motivated by the concepts of 'pin' from TurboGraph and 'ghost' from GraphLab, we propose a new memory utilization mode for GraphChi, called Part-in-memory mode, to improve the performance of GraphChi algorithms. The main idea is to pin a fixed part of the data in memory during the whole computing process. Part-in-memory mode was implemented with only about 40 additional lines of code on top of the original GraphChi engine. Extensive experiments were performed with large real datasets (including a Twitter graph with 1.4 billion edges). The preliminary results show that the Part-in-memory memory management approach reduces GraphChi's running time by up to 60% for the PageRank algorithm. Interestingly, a larger portion of data pinned in memory does not always lead to better performance when the whole dataset cannot fit in memory; there exists an optimal portion of data to keep in memory to achieve the best computational performance.
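As a rough illustration of the pinning idea (not GraphChi's actual C++ implementation or its PSW machinery), the sketch below runs PageRank while keeping a fixed "pinned" set of vertex values in memory and writing the rest back to a store on each sweep; a dict stands in for the on-disk shards.

```python
def pagerank_part_in_memory(edges, n, pinned, iters=20, d=0.85):
    """PageRank where ranks of `pinned` vertices stay in memory and the
    rest are fetched from a (simulated) disk store each iteration."""
    on_disk = {v: 1.0 / n for v in range(n) if v not in pinned}  # stands in for disk shards
    in_mem = {v: 1.0 / n for v in pinned}

    out_deg = [0] * n
    for src, _ in edges:
        out_deg[src] += 1

    def rank(v):
        return in_mem[v] if v in in_mem else on_disk[v]

    for _ in range(iters):
        new = [(1 - d) / n] * n
        for src, dst in edges:
            new[dst] += d * rank(src) / out_deg[src]
        for v in range(n):
            if v in in_mem:
                in_mem[v] = new[v]       # pinned: updated in place, never evicted
            else:
                on_disk[v] = new[v]      # unpinned: written back to the store
    return [rank(v) for v in range(n)]

edges = [(0, 1), (1, 2), (2, 0), (2, 1)]
print(pagerank_part_in_memory(edges, n=3, pinned={0}))
```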

Relevance:

80.00%

Publisher:

Abstract:

We report a new vector model of an erbium-doped fibre laser mode locked with carbon nanotubes. This model goes beyond the limitations of the previously used models based on either coupled nonlinear Schrödinger or Ginzburg-Landau equations. Unlike the previous models, it accounts for the vector nature of the interaction between an optical field and an erbium-doped active medium, the slow relaxation dynamics of erbium ions, linear birefringence in the fibre, the linear and circular birefringence of the laser cavity caused by the in-cavity polarization controller, and the light-induced anisotropy caused by an elliptically polarized pump field. The interplay of these factors modifies the coherent coupling of the two polarization modes on a long time scale, and so results in a new family of vector solitons (VSs) with fast and slowly evolving states of polarization. The observed VSs can be of interest in secure communications, trapping and manipulation of atoms and nanoparticles, control of magnetization in data storage devices and many other areas.
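For orientation, the baseline this abstract says it extends is the textbook incoherently coupled nonlinear Schrödinger system for the two polarisation envelopes $u$ and $v$ in a birefringent fibre (standard form, not the paper's new model; birefringent walk-off, coherent four-wave-mixing coupling, and the gain dynamics the new model adds are omitted):

```latex
\begin{align}
i\,\frac{\partial u}{\partial z} - \frac{\beta_2}{2}\,\frac{\partial^2 u}{\partial t^2}
  + \gamma\bigl(|u|^2 + A\,|v|^2\bigr)\,u &= 0,\\
i\,\frac{\partial v}{\partial z} - \frac{\beta_2}{2}\,\frac{\partial^2 v}{\partial t^2}
  + \gamma\bigl(|v|^2 + A\,|u|^2\bigr)\,v &= 0,
\end{align}
```

where $\beta_2$ is the group-velocity dispersion, $\gamma$ the Kerr nonlinearity, and the cross-phase modulation coefficient $A = 2/3$ for a linearly birefringent silica fibre.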

Relevance:

80.00%

Publisher:

Abstract:

Multiple transformative forces target marketing, many of which derive from new technologies that allow us to sample thinking in real time (i.e., brain imaging), or to look at large aggregations of decisions (i.e., big data). There has been an inclination to refer to the intersection of these technologies with the general topic of marketing as “neuromarketing”. There has not, however, been a serious effort to frame neuromarketing, which is the goal of this paper. Neuromarketing can be compared to neuroeconomics: neuroeconomics is generally focused on how individuals make “choices” and represent distributions of choices, whereas neuromarketing focuses on how a distribution of choices can be shifted or “influenced”, which can occur at multiple “scales” of behavior (e.g., individual, group, or market/society). Given that influence can affect choice through many cognitive modalities, and not just the valuation of choice options, a science of influence also implies a need to develop a model of cognitive function integrating attention, memory, and reward/aversion function. The paper concludes with a brief description of three domains of neuromarketing application for studying influence, and their caveats.

Relevance:

80.00%

Publisher:

Abstract:

This paper investigates the Matthew Effect in the Sina Weibo microblogging service. We take the microblogs in the ranking list of the Hot Microblog app in Sina Weibo as the target of our study. We analyze how the repost numbers of listed microblogs differ before and after they enter the ranking list. We also compare the spread features of microblogs in the ranking list with those of hot microblogs not in the list, and with ordinary microblogs from users who have previously had a microblog in the list. Our study confirms the existence of the Matthew Effect in this social network. © 2013 IEEE.
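A minimal sketch of the before/after comparison described here, using invented repost timestamps; the paper's actual dataset, windows and statistics are not specified in the abstract.

```python
def before_after_growth(timestamps, entry_time, window):
    """Compare repost counts in the window before vs. after a
    microblog enters the ranking list."""
    before = sum(1 for t in timestamps if entry_time - window <= t < entry_time)
    after = sum(1 for t in timestamps if entry_time <= t < entry_time + window)
    return before, after, (after / before if before else float("inf"))

# invented repost timestamps (hours); list entry at t = 10
reposts = [1, 3, 6, 8, 9, 10.5, 11, 11.2, 12, 12.5, 13, 14, 15]
print(before_after_growth(reposts, entry_time=10, window=5))
```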

Relevance:

80.00%

Publisher:

Abstract:

The miniaturization, sophistication, proliferation, and accessibility of technologies are enabling the capture of more and previously inaccessible phenomena in Parkinson's disease (PD). However, more information has not translated into a greater understanding of disease complexity to satisfy diagnostic and therapeutic needs. Challenges include noncompatible technology platforms, the need for wide-scale and long-term deployment of sensor technology (among vulnerable elderly patients in particular), and the gap between the "big data" acquired with sensitive measurement technologies and their limited clinical application. Major opportunities could be realized if new technologies are developed as part of open-source and/or open-hardware platforms that enable multichannel data capture sensitive to the broad range of motor and nonmotor problems that characterize PD and are adaptable into self-adjusting, individualized treatment delivery systems. The International Parkinson and Movement Disorders Society Task Force on Technology is entrusted to convene engineers, clinicians, researchers, and patients to promote the development of integrated measurement and closed-loop therapeutic systems with high patient adherence that also serve to (1) encourage the adoption of clinico-pathophysiologic phenotyping and early detection of critical disease milestones, (2) enhance the tailoring of symptomatic therapy, (3) improve subgroup targeting of patients for future testing of disease-modifying treatments, and (4) identify objective biomarkers to improve the longitudinal tracking of impairments in clinical care and research. This article summarizes the work carried out by the task force toward identifying challenges and opportunities in the development of technologies with potential for improving the clinical management and the quality of life of individuals with PD. © 2016 International Parkinson and Movement Disorder Society.