721 results for cloud computing, software as a service, SaaS, enterprise systems, IS success
Abstract:
We consider the problem of optimizing the workforce of a service system. Adapting the staffing levels in such systems is non-trivial: workload varies widely, and the large number of system parameters rules out a brute-force search. Further, because these parameters change on a weekly basis, the optimization should not take longer than a few hours. Our aim is to find, from a discrete high-dimensional parameter set, the optimal staffing levels that minimize the long-run average of a single-stage cost function while adhering to constraints on queue stability and service-level agreement (SLA) compliance. The single-stage cost function balances the conflicting objectives of better worker utilization and attainment of the target SLAs. We formulate this problem as a constrained Markov cost process parameterized by the (discrete) staffing levels. We propose novel simultaneous perturbation stochastic approximation (SPSA)-based algorithms for solving this problem. The algorithms include both first-order and second-order methods, incorporating SPSA-based gradient/Hessian estimates for primal descent while performing dual ascent for the Lagrange multipliers. Both algorithms are online and update the staffing levels incrementally. Further, they involve a generalized smooth projection operator, which is essential for projecting the continuous-valued worker parameter tuned by our algorithms onto the discrete set. The smoothness is necessary to ensure that the underlying transition dynamics of the constrained Markov cost process are themselves smooth (as a function of the continuous-valued parameter): a critical requirement for proving the convergence of both algorithms. We validate our algorithms via performance simulations based on data from five real-life service systems. For comparison, we also implement a scatter-search-based algorithm using the state-of-the-art optimization toolkit OptQuest. From the experiments, we observe that both our algorithms converge empirically and consistently outperform OptQuest in most of the settings considered. This finding, coupled with their computational advantage, makes our algorithms well suited for adaptive labor staffing in real-life service systems.
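As an aside, the core SPSA update is simple enough to sketch. The snippet below is a minimal illustration of one first-order primal-descent/dual-ascent loop in the spirit of this abstract, not the paper's exact algorithm: the cost c(), the constraint g() <= 0, the step-size schedules, and the final rounding (standing in for the paper's smooth projection operator) are all illustrative assumptions.

```python
# Minimal sketch of an SPSA primal-descent / dual-ascent loop. Illustrative
# only: c(), g(), the step sizes, and the rounding step are stand-ins, not
# the paper's algorithm or its smooth projection operator.
import numpy as np

rng = np.random.default_rng(0)

def c(theta):                       # stand-in single-stage cost
    return np.sum((theta - 7.0) ** 2)

def g(theta):                       # stand-in SLA constraint, want g(theta) <= 0
    return 20.0 - np.sum(theta)

def lagrangian(theta, lam):
    return c(theta) + lam * g(theta)

theta = np.full(5, 10.0)            # continuous-valued staffing parameter
lam = 0.0                           # Lagrange multiplier
for k in range(1, 2001):
    a, b, eta = 0.1 / k**0.602, 0.01 / k**0.101, 0.05 / k
    delta = rng.choice([-1.0, 1.0], size=theta.shape)   # Bernoulli +/- 1
    # Two-sided SPSA gradient estimate: only two cost evaluations per step,
    # regardless of the dimension of theta.
    diff = lagrangian(theta + b * delta, lam) - lagrangian(theta - b * delta, lam)
    grad_est = diff / (2.0 * b * delta)
    theta = np.clip(theta - a * grad_est, 0.0, 50.0)    # primal descent
    lam = max(0.0, lam + eta * g(theta))                # dual ascent

staffing = np.rint(theta).astype(int)   # crude rounding; the paper instead
print(staffing, c(theta), g(theta))     # uses a smooth projection to the grid
```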
Abstract:
Over the past six years Lowestoft College has embraced the revolution in mobile learning by welcoming Web 2.0, social media, cloud computing and Bring Your Own Device (BYOD). This open attitude to new technologies has led to a marked improvement in student achievement rates, has increased staff and student satisfaction and has resulted in a variety of cost savings for senior management during the current economic downturn.
Abstract:
This project develops a system capable of securing a home or business, detecting any unwanted access with sensors. It also includes smoke and other gas detectors. As a deterrent, it features presence simulation to prevent intrusions, and accordingly also allows control of lights and other household appliances. The entire system is controlled from an Android application.
Abstract:
The relentlessly increasing demand for network bandwidth, driven primarily by Internet-based services such as mobile computing, cloud storage, and video-on-demand, calls for more efficient utilization of the available communication spectrum, such as that afforded by resurgent DSP-powered coherent optical communications. Encoding information in the phase of the optical carrier, using multilevel phase modulation formats, and employing coherent detection at the receiver allow for enhanced spectral efficiency and thus enable increased network capacity. The distributed feedback (DFB) semiconductor laser has served as the near-exclusive light source powering the fiber-optic, long-haul network for over 30 years. The transition to coherent communication systems is pushing the DFB laser to the limits of its abilities. This is due to its limited temporal coherence, which directly constrains the number of different phases that can be imparted to a single optical pulse and thus the data capacity. Temporal coherence, most commonly quantified by the spectral linewidth Δν, is limited by phase noise, a result of quantum-mandated spontaneous emission of photons due to random recombination of carriers in the active region of the laser.
In this work we develop a fundamentally new type of semiconductor laser with the requisite coherence properties. We demonstrate electrically driven lasers characterized by a quantum-noise-limited spectral linewidth as low as 18 kHz. This narrow linewidth is the result of a new laser design philosophy that separates the functions of photon generation and storage, enabled by a hybrid Si/III-V integration platform. Photons generated in the active region of the III-V material are readily stored in the low-loss Si that hosts the bulk of the laser field, thereby enabling high-Q photon storage. The storage of a large number of coherent quanta acts as an optical flywheel, whose inertia reduces the effect of spontaneous-emission-mandated phase perturbations on the laser field, while the enhanced photon lifetime effectively reduces the emission rate of incoherent quanta into the lasing mode. Narrow linewidths are obtained over a wavelength bandwidth spanning the entire optical communication C-band (1530-1575 nm) at only a fraction of the input power required by conventional DFB lasers. The results presented in this thesis hold great promise for the large-scale integration of lithographically tuned, high-coherence laser arrays for use in coherent communications, enabling Tb/s-scale data capacities.
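As background context (not quoted from the thesis), the modified Schawlow-Townes relation makes the role of high-Q photon storage explicit: the quantum-limited linewidth falls as the square of the cavity Q,

```latex
\Delta\nu_{\text{laser}} \;=\; \frac{\pi h \nu\,(\Delta\nu_c)^2}{P_{\text{out}}}\,\left(1+\alpha^2\right),
\qquad \Delta\nu_c = \frac{\nu}{Q},
```

where Δν_c is the cold-cavity linewidth, P_out the output power, and α the Henry linewidth-enhancement factor.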
Abstract:
Various public and private organizations collect and publish a mass of data on the socio-economic reality of different nations. The Brazilian government today has a declared interest in disseminating a wide range of information to the most diverse user profiles. However, a series of limitations stands in the way of broader, more democratic dissemination, among them the heterogeneity of the data sources, their dispersion, and unfriendly presentation formats. Owing to the inherent complexity of the geographic information involved, which produces incompatibilities at several levels, data interchange between geographic information systems is not a trivial problem. For Web applications, one solution is Web Services, which allow new applications to interact with existing ones and make systems developed on different platforms compatible. In this context, the goal of this work is to show the possibilities of building portals for the dissemination of spatial data using free software, Web Services technology, and Open Geospatial Consortium (OGC) standards. To evaluate and test the selected technologies and demonstrate their effectiveness, an example portal of socio-economic data was developed, combining information from a local server and from remote servers. The contributions of this work are the provision of dynamic maps, the generation of maps by composing maps served by local and remote servers, and the use of the OGC WMC standard. Analysis of the portal prototype shows, however, that locating and invoking Web Services are not easy tasks for a typical Internet user. In this direction, future work on geographic information portals could adopt Representational State Transfer (REST) technology.
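For context, the kind of request such a portal issues to compose maps from remote servers is a standard OGC WMS GetMap call. The sketch below is illustrative only: the endpoint URL and layer name are hypothetical placeholders, not services from the work itself.

```python
# Minimal sketch of an OGC WMS 1.1.1 GetMap request, the kind of call a map
# portal issues to fetch a rendered layer from a remote server. The server
# URL and layer name are hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import urlopen

WMS_SERVER = "http://example.org/wms"  # hypothetical WMS endpoint

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "municipal_income",      # hypothetical socio-economic layer
    "SRS": "EPSG:4326",
    "BBOX": "-74.0,-34.0,-34.0,5.5",   # lon/lat box, roughly Brazil
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
}

with urlopen(f"{WMS_SERVER}?{urlencode(params)}") as resp:
    png_bytes = resp.read()  # the rendered map image, ready to display
```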
Abstract:
Information technology (IT) has transformed the world in recent decades; its contributions to the processing and dissemination of information have radically changed the way people live and have profoundly affected the management and structure of organizations. In this new scenario, the business literature has been reporting wasted investments in information technology. The difficulty of recognizing value in IT investments and the challenges faced in managing software development teams frame the research problem, namely: does the IT area lack management control systems that could reduce the risks of wasted resources, low productivity, or failure in deploying the system under development? The general objective of this research is to investigate the difficulties faced by software development teams in order to assess, according to the interviewees' perceptions, whether resources are being wasted and whether management control is lacking. The specific objectives include investigating which indicators would be most suitable for a management control system aimed at this area of IT and mapping success factors. Methodologically, this is exploratory research adopting a case-study procedure based on qualitative analysis. The results confirm the problems of wasted resources and low productivity in the IT teams of the company analyzed and point to a lack of management control mechanisms or processes.
Abstract:
In the multi-site manufacturing domain, systems-of-systems (SoS) are rarely referred to as such. However, a number of collaborative manufacturing paradigms closely relate to systems-of-systems principles, including distributed manufacturing, dispersed network manufacturing, virtual enterprises, and cloud manufacturing/manufacturing-as-a-service. This paper provides an overview of these terms and paradigms, exploring their characteristics, overlaps, and differences. These manufacturing paradigms are then considered in relation to five key system-of-systems characteristics: autonomy, belonging, connectivity, diversity, and emergence. Data collected from two surveys of academic and industry experts are presented and discussed, and key challenges and barriers to multi-site manufacturing SoS are identified.
Abstract:
This paper presents a novel robot named "TUT03-A" with expert systems, speech interaction, vision systems, and more, based on a remote-brained approach. The robot is designed with the brain and body separated; a cerebellum resides in the body. The brain, with its expert systems, is in charge of decision making, while the cerebellum controls the motion of the body. The brain-body interface admits many kinds of structure, enabling one brain to control one or more cerebellums. The brain controls all modules in the system and coordinates their work. The framework of the robot allows us to carry out different kinds of robotics research in an environment that can be shared and inherited over generations. We then discuss a path-planning method for the robot based on the ant colony algorithm. The mathematical model is established and the algorithm is implemented in the StarLogo simulation environment. The simulation results show strong robustness and acceptable pathfinding efficiency.
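As an illustration of the approach, the sketch below implements a bare-bones ant colony search for a path on a 4-connected grid. It is a stand-in, not the paper's model: the original was built in StarLogo, and the grid size, parameter values, and simple pheromone rule here are assumptions.

```python
# Bare-bones ant colony path planning on a 4-connected grid (illustrative
# stand-in for the StarLogo model described in the abstract).
import random

N = 10                        # grid is N x N; start (0,0), goal (N-1,N-1)
ALPHA, EVAP, Q = 1.0, 0.1, 100.0
tau = {}                      # pheromone per directed edge, default 1.0

def neighbors(c):
    x, y = c
    return [(x+dx, y+dy) for dx, dy in ((1,0),(-1,0),(0,1),(0,-1))
            if 0 <= x+dx < N and 0 <= y+dy < N]

def walk():
    """One ant's random walk, biased by pheromone, avoiding revisits."""
    path, seen = [(0, 0)], {(0, 0)}
    while path[-1] != (N-1, N-1):
        opts = [n for n in neighbors(path[-1]) if n not in seen]
        if not opts:
            return None                       # dead end; discard this ant
        w = [tau.get((path[-1], n), 1.0) ** ALPHA for n in opts]
        nxt = random.choices(opts, weights=w)[0]
        path.append(nxt); seen.add(nxt)
    return path

best = None
for _ in range(200):                           # 200 ants
    p = walk()
    if p is None:
        continue
    if best is None or len(p) < len(best):
        best = p
    # Evaporate, then deposit pheromone inversely proportional to path length.
    for e in list(tau):
        tau[e] *= (1.0 - EVAP)
    for a, b in zip(p, p[1:]):
        tau[(a, b)] = tau.get((a, b), 1.0) + Q / len(p)

print("best path length:", len(best) - 1 if best else "not found")
```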
Abstract:
This thesis presents a comparative study of two classes of algorithms: hypergraph partitioning and space-filling curves. In large-scale scientific computing, a key to improving parallel efficiency lies in partitioning the data, assigning it to the appropriate processors, and dynamically adjusting the data held by each processor; data partitioning and data adjustment are central to load balancing across processor nodes. Two classes of approaches are currently used to address these problems, known as topological methods and geometric methods, with hypergraph partitioning and space-filling curves as their respective representatives, both widely applied in data partitioning and adjustment. Among the classical algorithms for hypergraph partitioning, heuristic algorithms such as the FM algorithm are the most widely used; this thesis analyzes the FM algorithm in detail. As problem sizes grow, the time consumed by these traditional algorithms increases sharply, so researchers have proposed a multilevel partitioning framework for hypergraphs, and this thesis analyzes the details of each phase of that framework. Space-filling curves traverse a discrete multi-dimensional space linearly, converting a multi-dimensional problem into a one-dimensional one. Exploiting this property, data can be sorted during partitioning and then split according to that order. This thesis analyzes the generation and application of space-filling curves, as well as the order-encoding algorithms of several commonly used curves. The two approaches each have advantages and disadvantages in data partitioning: hypergraph partitioning can compute optimization objectives such as inter-node communication volume more accurately, yielding partitions that reduce inter-node communication more effectively, but the partitioning process itself is time-consuming, whereas space-filling curves can partition data very quickly but cannot compute the optimization objectives accurately. We analyze and compare both approaches in data-partitioning applications, along with the impact of different partitioning models on the overall computation, and validate these observations experimentally. Finally, the design of a visualization interface for numerical software is described in the context of a real project.
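As a concrete example of the space-filling-curve approach, the sketch below partitions 2-D points by Z-order (Morton) encoding, one commonly used curve ordering; Hilbert-curve ordering works analogously with a more involved encoding. The 16-bit coordinate width and the toy point set are illustrative assumptions.

```python
# Space-filling-curve partitioning via Z-order (Morton) encoding: sort points
# by their Morton key, then cut the 1-D sequence into contiguous chunks.
# Illustrative: 16-bit coordinates and a toy 8x8 point set.

def part1by1(n: int) -> int:
    """Spread the lower 16 bits of n so a zero bit sits between each pair."""
    n &= 0xFFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton2d(x: int, y: int) -> int:
    """Interleave the bits of (x, y) into a single 1-D Morton key."""
    return (part1by1(y) << 1) | part1by1(x)

def partition(points, nparts):
    """Sort points by Morton key, then cut into nparts contiguous chunks."""
    ordered = sorted(points, key=lambda p: morton2d(*p))
    size = -(-len(ordered) // nparts)           # ceiling division
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

pts = [(x, y) for x in range(8) for y in range(8)]
for rank, chunk in enumerate(partition(pts, 4)):
    print(rank, chunk[:4], "...")   # spatially nearby points share a chunk
```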
Abstract:
To meet the demand for processing massive data, industry has proposed several solutions. Cloud computing is currently one of the most popular: it uses very large clusters of inexpensive PCs to store and process data. As cloud computing matures, more and more applications are migrating to the cloud, and database systems are no exception. However, the ACID properties required by database systems can cause some operations, such as join queries, to perform poorly when data is distributed. To improve database performance under distributed storage, a query-oriented data distribution strategy, Selection Oriented Distribution (SOD), is proposed: the data distribution algorithm is determined by the database's query workload. The algorithm is suitable for cloud computing and can significantly improve query performance.
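The abstract does not spell out the SOD algorithm itself, so the sketch below is only a generic illustration of query-driven placement, not the paper's method: rows of two tables touched by a frequent join are co-located by hashing both tables on the join key, so the join runs node-locally instead of shuffling data between nodes.

```python
# Generic illustration of query-driven data placement (NOT the paper's SOD
# algorithm, which the abstract does not detail): hash both tables on the
# join key so matching rows land on the same node and joins stay local.
NODES = 4

def node_for(key) -> int:
    """Same hash on the join key for both tables -> matching rows co-locate."""
    return hash(key) % NODES

orders   = [(1, "o-100"), (2, "o-101"), (1, "o-102")]   # (customer_id, order_id)
payments = [(1, 19.99), (2, 5.00), (1, 7.50)]           # (customer_id, amount)

placement = {n: {"orders": [], "payments": []} for n in range(NODES)}
for row in orders:
    placement[node_for(row[0])]["orders"].append(row)
for row in payments:
    placement[node_for(row[0])]["payments"].append(row)

# A join on customer_id now needs no cross-node traffic:
for n, tables in placement.items():
    local_join = [(o, p) for o in tables["orders"]
                  for p in tables["payments"] if o[0] == p[0]]
    print(f"node {n}: {len(local_join)} joined pairs, all local")
```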
Abstract:
Today, because of our country's high petroleum consumption, steady social development, and the increasing difficulty of exploring new resources, deeper exploitation of existing oilfields is needed. More delicate reservoir imaging and description, such as thin-layer identification, interlayer exploitation monitoring, subtle structure imaging, and reservoir anisotropy recognition, can provide more detailed evidence for new development adjustment schemes and enhanced oil recovery (EOR). It is now recognized that the 3D VSP technique is more effective in these respects than conventional methods, but VSP techniques, and 3D VSP in particular, have developed slowly for several reasons; research on VSP techniques will therefore be very useful to EOR services. 3D VSP comprises acquisition, data processing, and interpretation, and this work addresses acquisition and processing. The key point of acquisition is survey design, which is critical to data quality and in turn influences reservoir recognition. The author studied the layout patterns of shot points and geophones in detail, along with survey-design attributes such as reflectivity, incidence angle, observation area, reflection-point distribution, fold, minimum well-source distance, and azimuth angle. In the geometry design of 3D VSP surveys in deviated wells, the main problems to be solved are determining the center of the shot distribution, the effect of missing shots on coverage area and fold, and locating the shots and receivers for multiple wells. These problems are discussed through simulation and analysis, and useful conclusions are drawn that provide valuable references for actual survey design. In data processing, the research emphasizes key techniques such as wavefield separation and VSP-CDP imaging. The variable-apparent-slowness wavefield-separation method developed here suits spatially varying subsurface velocity fields, separates wavefields well, and overcomes the reflection-bending shortcomings of conventional imaging methods. The attenuation behavior of seismic waves underground is very important for amplitude compensation and oil/gas identification. The seismic-wave attenuation mechanism is studied through 3D VSP simulation and Q-inversion techniques; testing on seismic data, the estimation of VSP attenuation and the variation of attenuation attributes with depth are investigated. Software for survey design and data processing is also developed, filling a gap in the VSP field in our country. The techniques developed have been applied successfully in the SZXX-A Oilfield, the QKYY-B Oilfield, and areas A and B. The good results show that this research is valuable and meaningful for the development and application of VSP techniques in our country's offshore oil industry and other areas.
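For background (not quoted from the thesis), Q-inversion from VSP data is commonly based on the spectral-ratio relation between amplitude spectra A(f, z) recorded at two receiver depths:

```latex
\ln\frac{A(f, z_2)}{A(f, z_1)} \;=\; -\,\frac{\pi f\,\Delta t}{Q} \;+\; \text{const},
```

where Δt is the travel time between the two receivers; Q follows from the slope of the log spectral ratio versus frequency.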
Abstract:
As a management tool, simulation software deserves greater analysis from both an academic and an industrial viewpoint. A comparative study of three packages was carried out from a 'first-time use' approach, which allowed ease of use and package features to be assessed against a simple theoretical benchmark manufacturing process. To support the use of these packages, an objective survey on simulation use and package features was carried out within the manufacturing industry. This identified how simulation software is used, its applicability, and the perception of user requirements, thereby proposing an ideal package.
Abstract:
Urquhart, C., Spink, S., Thomas, R., Yeoman, A., Durbin, J., Turner, J., Fenton, R. & Armstrong, C. (2004). JUSTEIS: JISC Usage Surveys: Trends in Electronic Information Services Final report 2003/2004 Cycle Five. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: JISC