864 results for Monitoring, SLA, JBoss, Middleware, J2EE, Java, Service Level Agreements
Abstract:
Postgraduate programme in General Bases of Surgery - FMB
Abstract:
This work discusses questions related to the verification and analysis of the logistic service level provided by the State in the execution of civil engineering forensic examinations. Factors such as the equipment and means of transport used, the qualification of the professionals involved, the standardization of the procedures adopted, and the issuance of forensic reports were considered. The objective is to derive guidelines for the activity under study by identifying possible improvement opportunities in the management of this area of Criminalistics, considering the logistic performance components related to the key factors: inventory, transportation, facilities and information. The research strategy used was the case study, employing statistical reports and semi-structured interviews with the managers of the agency responsible for forensic activity in the state of Pará. As for the results obtained, the content analysis of the interviews showed that the working hypotheses correlated with some of the logistic guidelines drawn up, such as increasing the efficiency of the logistic service level in the activity under study through the adoption of standardized operating procedures.
Abstract:
Increasingly competitive markets have driven companies to search, in many different ways, for means to win and keep customers. The service level is basically the performance of companies in fulfilling the orders placed, or how companies demonstrate to their clients the efforts made on their behalf. This work aims to solve the difficulties faced by a multinational company operating in Brazil in the distribution of its Ice Cream products, in order to improve the service level offered to its customers. Reviewing the logistics network and the concepts related to the product distribution system is one of several ways to achieve this goal, as is the use of IT tools to assist in the planning and programming of the physical distribution of products. In this study we used the direct distribution concept called Transit Point (TP). The TP simultaneously provides a rapid-response strategy, flexibility, low transportation costs and no inventory. A router - software capable of simulating the actual conditions experienced in daily distribution - was used to assist in the calculations. Results showed reductions of up to 47.5% in transportation costs, and better conditions were provided for the distribution of products, positively impacting service levels and the maintenance of product quality, with a reduction of 1.6% in the total costs involved.
Abstract:
Postgraduate programme in Mechanical Engineering - FEG
Abstract:
The study presents the application of an ergonomic method based on graphical meshes, created to analyse the attributes of public open spaces that influence their usability. It was based on the DePAN method, created to classify the service level offered by the niches present in open social spaces. The original method adopted a Geographical Information System (GIS) to produce maps representing the space attributes. This study proposes an adaptation of the method in which GIS is replaced by the AutoCAD software for map generation. The goal is to test the feasibility and efficiency of producing maps in AutoCAD to represent the service level of the niches. A social area on a college campus was used for data collection in graphical meshes and for map production. By applying the graphical mesh, the attributes of that space were evaluated to help promote people's permanence on site. Although the AutoCAD process seems harder because it is mechanical, it guarantees a satisfactory result for analysis, resembling the images generated by GIS. Therefore, it is concluded that it is possible to graphically represent the service level of the niches.
Abstract:
Postgraduate programme in Production Engineering - FEG
Abstract:
Human biomonitoring (HBM) is an ideal tool for evaluating toxicant exposure in health risk assessment. Chemical substances or their metabolites related to environmental pollutants can be detected as biomarkers of exposure in a wide variety of biological fluids. Individual exposure to aromatic hydrocarbon compounds (benzene, toluene, and o-xylene, "BTX") was analysed with a liquid chromatography method coupled to electrospray ionisation tandem mass spectrometry (μHPLC-ESI-MS/MS) for the simultaneous quantitative detection of the BTX exposure biomarkers SPMA, SBMA and o-MBMA in human urine. Urinary S-phenylmercapturic acid (SPMA) is a biomarker proposed by the American Conference of Governmental Industrial Hygienists (ACGIH) for assessing occupational exposure to benzene (Biological Exposure Index of 25 µg/g creatinine). Urinary S-benzylmercapturic acid (SBMA) and o-methyl S-benzylmercapturic acid (o-MBMA) are specific toluene and o-xylene metabolites of the glutathione detoxification pathway, proposed as reliable biomarkers of exposure. To this aim, a pre-treatment of the urine with solid phase extraction (SPE) and an evaporation step were necessary to concentrate the mercapturic acids before instrumental analysis. Liquid chromatography separation was carried out with a reversed-phase capillary column (Synergi 4u Max-RP) using a binary gradient composed of an aqueous solution of formic acid 0.07% v/v and methanol. The mercapturic acids were determined by negative-ion mass spectrometry, and the data were corrected using isotope-labelled analogues as internal standards. The analytical method follows U.S. Food and Drug Administration guidance and was applied to assess exposure to BTX in a group of 396 traffic wardens. The association between biomarker results and individual factors, such as age, sex and tobacco smoke, was also investigated. The present work also included improvements to the methods used by modifying various chromatographic parameters and experimental procedures. A partial validation was conducted to evaluate LOD, precision, accuracy and recovery, as well as matrix effects. Higher sensitivity will be possible in future biological monitoring programmes, allowing the evaluation of very low levels of BTX human exposure.
Keywords: human biomonitoring, aromatic hydrocarbons, biomarker of exposure, HPLC-MS/MS.
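A minimal sketch, not from the thesis, of how the ACGIH index cited above is applied in practice: a measured urinary SPMA concentration is normalised to creatinine and compared with the 25 µg/g BEI. The sample values are hypothetical.

```java
// Minimal sketch (assumption): creatinine-normalising a urinary SPMA
// concentration and comparing it with the ACGIH Biological Exposure Index
// of 25 ug/g creatinine mentioned in the abstract. Sample values are invented.
public class SpmaBeiCheck {

    static final double BEI_UG_PER_G_CREATININE = 25.0; // ACGIH BEI for benzene exposure

    /** Converts an SPMA concentration (ug/L urine) to ug/g creatinine. */
    static double creatinineCorrected(double spmaUgPerL, double creatinineGPerL) {
        return spmaUgPerL / creatinineGPerL;
    }

    public static void main(String[] args) {
        double spmaUgPerL = 18.0;     // hypothetical measured concentration
        double creatinineGPerL = 1.2; // hypothetical urinary creatinine

        double corrected = creatinineCorrected(spmaUgPerL, creatinineGPerL);
        System.out.printf("SPMA: %.1f ug/g creatinine (BEI = %.1f) -> %s%n",
                corrected, BEI_UG_PER_G_CREATININE,
                corrected > BEI_UG_PER_G_CREATININE ? "above BEI" : "below BEI");
    }
}
```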
Abstract:
This thesis analyses what cloud computing is, illustrating service level agreement contracts and the solutions available on the market.
Abstract:
TuCSoN (Tuple Centres Spread over the Network) is a coordination model for distributed processes and autonomous agents. The TuCSoN model is implemented as a Java-based distributed middleware, released as Open Source under the LGPL licence via Googlecode. The fact that it is Open Source and Java-based made its porting to Android possible, turning Google's well-known operating system into a possible participating agent in a TuCSoN system. The thesis describes the path that led from the study of the TuCSoN infrastructure and of the Android system to the realisation of the Android application, making it possible for any Android device to participate in a TuCSoN system. In particular, the final objective of the Android application, and of this thesis, is to turn the smartphone into a working TuCSoN node. The thesis does not aim to analyse and explore the features and possibilities of the two main technologies involved (Android and TuCSoN) on their own, but rather to explore the critical issues that a porting of this kind entails, such as the intrinsic differences between the JVM and the DalvikVM and how to work around them, the features of Android and how to use them to build an application acting as the server of a distributed infrastructure, and the differences in GUI management between Android and plain Java, and to analyse the solutions found to solve (or, where solving was not possible, to avoid) such problems in order to achieve the stated objective.
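A minimal sketch, assuming the usual Android pattern rather than reproducing the actual TuCSoN port: a long-running server component such as a node is typically hosted in an Android Service that owns a listening socket on a worker thread, since blocking I/O is not allowed on the main thread. The class name is hypothetical; the port number follows TuCSoN's conventional default but is an assumption here.

```java
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;
import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;

// Hypothetical host for a TuCSoN-style node on Android (sketch, not the thesis code).
public class NodeService extends Service {
    private static final int PORT = 20504; // assumed TuCSoN-style default port
    private ServerSocket server;

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        new Thread(() -> {
            try {
                server = new ServerSocket(PORT);
                while (!server.isClosed()) {
                    Socket client = server.accept(); // blocking accept, off the UI thread
                    // hand the connection over to the coordination runtime here
                    client.close();
                }
            } catch (IOException e) {
                // socket closed in onDestroy(), or an error a real node would report
            }
        }).start();
        return START_STICKY; // ask Android to restart the node if the process is killed
    }

    @Override
    public void onDestroy() {
        try { if (server != null) server.close(); } catch (IOException ignored) { }
    }

    @Override
    public IBinder onBind(Intent intent) { return null; } // started service, not bound
}
```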
Abstract:
Self-organising pervasive ecosystems of devices are set to become a major vehicle for delivering infrastructure and end-user services. The inherent complexity of such systems poses new challenges to those who want to master it by applying the principles of engineering. The recent growth in the number and distribution of devices with decent computational and communication abilities, which suddenly accelerated with the massive diffusion of smartphones and tablets, is delivering a world with a much higher density of devices in space. Also, communication technologies seem to be focussing on short-range device-to-device (P2P) interactions, with technologies such as Bluetooth and Near-Field Communication gaining greater adoption. Locality and situatedness become key to providing the best possible experience to users, and the classic model of a centralised, enormously powerful server gathering and processing data becomes less and less efficient as device density grows. Accomplishing complex global tasks without a centralised controller responsible for aggregating data, however, is challenging. In particular, there is a local-to-global issue that makes the application of engineering principles challenging at the very least: designing device-local programs that, through interaction, guarantee a certain global service level. In this thesis, we first analyse the state of the art in coordination systems, then motivate the work by describing the main issues of pre-existing tools and practices and identifying the improvements that would benefit the design of such complex software ecosystems. The contribution can be divided into three main branches. First, we introduce a novel simulation toolchain for pervasive ecosystems, designed to allow good expressiveness while retaining high performance. Second, we leverage existing coordination models and patterns in order to create new spatial structures. Third, we introduce a novel language, based on the existing "Field Calculus" and integrated with the aforementioned toolchain, designed to be usable for practical aggregate programming.
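A minimal sketch, assuming nothing from the thesis itself, of the local-to-global idea named above: the classic "gradient" building block of aggregate programming, where each device repeatedly recomputes its estimated distance to a source from its neighbours' values, so a global distance field emerges from purely local rules. All names here are hypothetical.

```java
import java.util.Map;

// Sketch of a device-local rule whose repeated execution yields a global field.
public class Gradient {
    /**
     * One local round: a source is at distance 0; any other device takes the
     * minimum over its neighbours of (neighbour's distance + edge length).
     */
    static double round(boolean isSource,
                        Map<String, Double> neighbourDistances,
                        Map<String, Double> edgeLengths) {
        if (isSource) return 0.0;
        double best = Double.POSITIVE_INFINITY;
        for (Map.Entry<String, Double> n : neighbourDistances.entrySet()) {
            best = Math.min(best, n.getValue() + edgeLengths.get(n.getKey()));
        }
        return best; // re-executed every round; values spread and converge
    }
}
```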
Abstract:
Background: During the Soviet era, malaria was close to eradication in Tajikistan. Since the early 1990s, the disease has been on the rise and has become endemic in large areas of southern and western Tajikistan. The standard national treatment for Plasmodium vivax is based on primaquine. This entails the risk of severe haemolysis for patients with glucose-6-phosphate dehydrogenase (G6PD) deficiency. Seasonal and geographical distribution patterns as well as G6PD deficiency frequency were analysed with a view to improving understanding of the current malaria situation in Tajikistan.
Methods: Spatial and seasonal distribution was analysed by applying a risk model that included key environmental factors such as temperature and the availability of mosquito breeding sites. The frequency of G6PD deficiency was studied at the health service level, including a cross-sectional sample of 382 adult men.
Results: Analysis revealed high rates of malaria transmission in most districts of the southern province of Khatlon, as well as in some zones in the northern province of Sughd. Three categories of risk areas were identified: (i) zones at relatively high malaria risk with high current incidence rates, where malaria control and prevention measures should be taken at all stages of the transmission cycle; (ii) zones at relatively high malaria risk with low current incidence rates, where malaria prevention measures are recommended; and (iii) zones at intermediate or low malaria risk with low current incidence rates, where no particular measures appear necessary. The average prevalence of G6PD deficiency was 2.1%, with apparent differences between ethnic groups and geographical regions.
Conclusion: The study clearly indicates that malaria is a serious health issue in specific regions of Tajikistan. Transmission is mainly determined by temperature. Consequently, locations at lower altitude are more malaria-prone. G6PD deficiency frequency is too moderate to require fundamental changes in the standard national treatment of P. vivax cases.
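The three-way zoning rule in the Results section is explicit enough to state as a classification function. The following is a minimal sketch, not from the paper; the boolean inputs stand in for the paper's modelled risk (derived from temperature and breeding sites) and observed incidence, and any thresholds behind them are assumptions.

```java
// Sketch (assumption): the (i)/(ii)/(iii) zoning rule from the Results section.
public class MalariaZoning {
    enum Category { CONTROL_AND_PREVENTION, PREVENTION_ONLY, NO_MEASURES }

    static Category classify(boolean highModelledRisk, boolean highIncidence) {
        if (highModelledRisk && highIncidence) return Category.CONTROL_AND_PREVENTION; // (i)
        if (highModelledRisk)                  return Category.PREVENTION_ONLY;        // (ii)
        return Category.NO_MEASURES;                                                   // (iii)
    }

    public static void main(String[] args) {
        System.out.println(classify(true, false)); // PREVENTION_ONLY
    }
}
```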
Abstract:
In this paper we introduce a cooperative environment between Interactive Digital TV (IDTV) and home networking, with the aim of allowing interaction between interactive TV applications and the controllers of in-home appliances in a natural way. More specifically, our proposal consists of merging MHP (Multimedia Home Platform), one of the main standard frameworks for IDTV, with OSGi (Open Service Gateway Initiative), the most widely used open platform for setting up Residential Gateways. To overcome the radically different nature of these specifications (the function-oriented MHP middleware and the service-oriented OSGi framework), we define a new kind of application, coined XbundLET. Although this software bridge is suitable for enabling interaction between MHP and OSGi applications in both directions, we focus here on our implementation experience in one direction only: from MHP to the OSGi world.
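A minimal sketch of the kind of bridge the paper describes, as an assumption rather than the actual XbundLET code: one class that is both an MHP Xlet and an OSGi bundle activator, so the two lifecycles meet in a single object. The interfaces shown (javax.tv.xlet.Xlet, org.osgi.framework.BundleActivator) are the standard ones; the wiring is illustrative.

```java
import javax.tv.xlet.Xlet;
import javax.tv.xlet.XletContext;
import javax.tv.xlet.XletStateChangeException;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

/** One object straddling both worlds: an Xlet for MHP and an activator for OSGi. */
public class XbundletSketch implements Xlet, BundleActivator {
    private BundleContext osgiContext; // set on the OSGi side, read on the MHP side

    // --- OSGi lifecycle (service-oriented side) ---
    public void start(BundleContext context) { this.osgiContext = context; }
    public void stop(BundleContext context) { this.osgiContext = null; }

    // --- MHP lifecycle (function-oriented side) ---
    public void initXlet(XletContext ctx) throws XletStateChangeException { }
    public void startXlet() throws XletStateChangeException {
        // The MHP application can now look up appliance-controller services
        // registered in the OSGi framework via osgiContext.getServiceReference(...).
    }
    public void pauseXlet() { }
    public void destroyXlet(boolean unconditional) { }
}
```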
Abstract:
This paper presents a survey on the usage, opportunities and pitfalls of semantic technologies in the Internet of Things. The survey was conducted in the context of a semantic enterprise integration platform. In total, we surveyed sixty-one individuals from industry and academia on their views and current usage of IoT technologies in general, and of semantic technologies in particular. Our semantic enterprise integration platform aims for interoperability at the service level as well as at the protocol level; therefore, questions regarding the use of application layer protocols, network layer protocols and management protocols were also included in the survey. The survey suggests that there is still a lot of heterogeneity in IoT technologies, but there are first indications of the use of standardized protocols. Semantic technologies are recognized as potentially useful, mainly in the management of things and services. Nonetheless, the participants still see many obstacles hindering the widespread use of semantic technologies: firstly, a lack of training, as traditional embedded programmers are not well aware of semantic technologies; secondly, a lack of standardization in ontologies, which would enable interoperability; and thirdly, a lack of good tooling support.
Abstract:
In a large health care system, accurate feedback about performance is necessary for valid decision-making at many levels, from senior management to service level managers. The implementation of dashboards is one way to remedy the problem of data overload by providing up-to-date, accurate, and concise information. As this health care system seeks to have an organized, systematic review mechanism in place, dashboards are being created in a variety of the hospital service departments to monitor performance indicators. The Infection Control Administration of this health care system does not currently utilize a dashboard but seeks to implement one. The purpose of this project is to research and design a clinical dashboard for the Infection Control Administration, the intent being that the implementation and usefulness of the clinical dashboard translate into improvements in the measurement of health care quality.
Abstract:
Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of people's ability to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system and identify the reasons for the differences between theoretical expectations and operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis: theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempt to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention in an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
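A minimal sketch, assuming a generic reorder-point policy rather than the thesis's actual model, of the kind of simulation the study relies on: random demand drives a fixed-quantity replenishment rule, and the two control indices named above, Service Level (fraction of demand met from stock) and Average Stock Value, are measured over the run. All parameters are hypothetical.

```java
import java.util.Random;

/** Simulates a simple reorder-point stock policy and reports the two indices. */
public class StockControlSim {
    public static void main(String[] args) {
        Random rng = new Random(42);
        int stock = 120;                 // opening stock (units), hypothetical
        final int reorderPoint = 60;     // order when stock falls to this level
        final int orderQuantity = 100;   // fixed replenishment quantity
        final double unitValue = 5.0;    // currency per unit
        final int days = 10_000, leadTime = 5;

        long demandTotal = 0, demandMet = 0, stockSum = 0;
        int onOrder = 0, leadTimeLeft = 0;

        for (int day = 0; day < days; day++) {
            if (onOrder > 0 && --leadTimeLeft == 0) { stock += onOrder; onOrder = 0; }
            int demand = rng.nextInt(21);         // hypothetical daily demand, 0..20
            int issued = Math.min(demand, stock); // unmet demand is lost
            stock -= issued;
            demandTotal += demand;
            demandMet += issued;
            if (stock <= reorderPoint && onOrder == 0) {
                onOrder = orderQuantity;          // place a replenishment order
                leadTimeLeft = leadTime;
            }
            stockSum += stock;                    // accumulate for the average
        }
        System.out.printf("Service Level: %.1f%%%n", 100.0 * demandMet / demandTotal);
        System.out.printf("Average Stock Value: %.2f%n", unitValue * stockSum / days);
    }
}
```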