970 results for Software Process


Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To analyse the quality of the services offered by health plan operators, as perceived by their users. METHODS: Cross-sectional study of 360 users of seven health plan operators in the city of Curitiba, PR (Brazil), and its metropolitan region, in 2008. A questionnaire was applied about users' preferences regarding six attributes of each operator (location of the service points; effectiveness of the care provided by physicians, clinics and hospitals; speed and friendliness of service; ease of authorisation of claim forms; price; coverage of the accredited network). The responses were analysed with the Analytic Hierarchy Process (AHP), a multi-criteria decision analysis and planning method. RESULTS: The attribute most valued by users was "price". The operators were grouped into two sets of preferences with respect to the attributes: of the seven health plans, two showed lower preference (between 23% and 19%) and five showed higher preference (around 10%). CONCLUSIONS: With this type of survey, health plan operators could reformulate their structures, processes, prices and accredited networks in order to improve their market position.
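
The AHP step may be easier to follow with a small numerical sketch. The pairwise comparison matrix below is hypothetical (the actual comparison data is not reported in this abstract); it only illustrates how attribute priority weights, such as the dominant weight of "price", are obtained from the principal eigenvector, as in Saaty's method.

```python
# Hedged AHP sketch: derive priority weights from a pairwise comparison
# matrix via the principal eigenvector. The matrix is invented for
# illustration and uses Saaty's 1-9 scale for three of the six attributes
# (price, network coverage, speed of service).
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],    # price vs. coverage, price vs. speed
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()        # normalised priority vector

# Consistency index CI = (lambda_max - n) / (n - 1); a small CI means the
# pairwise judgements are close to consistent.
n = A.shape[0]
ci = (eigvals.real.max() - n) / (n - 1)

print("attribute weights:", weights.round(3))  # price dominates
print("consistency index:", round(ci, 3))
```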

Relevance:

20.00%

Publisher:

Abstract:

P-NET is a multi-master fieldbus standard based on a virtual token passing scheme. In P-NET each master is allowed to transmit only one message per token visit. In the worst case, the communication response time can be derived by considering that, in each token cycle, all stations use the token to transmit a message. In this paper, we define a more sophisticated P-NET model, which considers the actual token utilisation. We then analyse the possibility of implementing a local priority-based scheduling policy to improve the real-time behaviour of P-NET.
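
The classical worst-case bound that the paper refines can be stated in a few lines: assuming each of the n masters uses every token visit to transmit exactly one message, a message queued at a master waits at most one full token cycle. A minimal sketch, with illustrative parameter values that are not taken from the paper:

```python
# Worst-case response-time bound for virtual token passing, assuming all
# n masters transmit one message per token visit (the pessimistic model
# that the paper's utilisation-aware analysis improves upon).

def worst_case_response_time(n_masters: int,
                             message_time: float,
                             token_pass_time: float) -> float:
    """Upper bound, in seconds: one full token cycle in which every
    master holds the token and transmits a message."""
    return n_masters * (message_time + token_pass_time)

# Example: 8 masters, 2 ms per message, 0.5 ms to pass the token.
print(worst_case_response_time(8, 2e-3, 0.5e-3))  # 0.02 s
```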

Relevance:

20.00%

Publisher:

Abstract:

Final Master's Project for obtaining the degree of Master in Mechanical Engineering, speciality of Maintenance and Production.

Relevance:

20.00%

Publisher:

Abstract:

Owing to the significant increase in vehicles and pedestrians in large cities, it became necessary to resort to the existing mechanisms for coordinating traffic. It is in this context that traffic lights were deployed, with the aim of ordering traffic on the road network. Traffic management has seen innovation in the equipment, in the software used, in centralised management, in road monitoring and in traffic-light synchronisation, making it possible to create signal programmes adjusted to the different traffic demands observed over the twenty-four hours of the day at distinct points of the city. Conceptually, studies were carried out to identify the relationship between speed, flow and headway over a given time interval, as well as the relationship between speed and accident rates. Until 1995, Portugal was one of the countries with the highest number of road accidents. Following this evolution, speed-control radars were installed at the end of 2006, with the aim of enforcing the speed limits imposed by the highway code and reducing road accidents in the city of Lisbon. Some years after that investment, we find that there is a need to implement new technologies for detecting infractions, whether speeding or red-light violations (VSV); to optimise the information made available to drivers and pedestrians; to coordinate the interaction between priority vehicles and the other vehicles on the road; to streamline the internal management of traffic offences; and to expedite procedures and computerise data collection, so as to make the processes faster.

Relevance:

20.00%

Publisher:

Abstract:

From the nineties of the last century onwards, calculation tools began to appear on the market with the aim of speeding up construction engineering design. Until the end of the seventies, the existing computers were enormous, and only organisations with great economic power could acquire them. In the eighties the PC (Personal Computer) arrived on the market; these small machines began to be acquired by companies in general, and in Portugal, by the end of that decade, it was already possible to find individuals who owned their own PC. In the nineties, the arrival of new graduates from higher-education institutions fostered the emergence of software companies dedicated to developing software according to the needs of the market itself, resulting in custom commercial software and commercial off-the-shelf (COTS) software. Commercial software, being used by a large number of people (easily reaching the thousands in the case of COTS), is in a position to evolve according to the systematic demands of the market itself, reaching high levels of compliance with quality requirements, namely functionality, reliability, usability, maintainability, efficiency, portability and quality in use. The use of commercial software in construction engineering design is nowadays an absolutely widespread practice. Selecting software can become a complex process, especially in those areas where the supply is large. The use of well-defined evaluation criteria can streamline the process and provide greater assurance at the moment of the final decision. This document presents a proposed methodology for evaluating and comparing software packages.
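
One plausible shape for such an evaluation methodology is a weighted sum over the quality characteristics named above. The sketch below is an assumption made for illustration (criteria weights and candidate scores are invented), not the document's actual proposal:

```python
# Hypothetical weighted-criteria comparison of two software candidates.
# The criteria follow the ISO 9126-style characteristics listed in the
# abstract; the weights must sum to 1.0.

CRITERIA_WEIGHTS = {
    "functionality":    0.25,
    "reliability":      0.20,
    "usability":        0.15,
    "maintainability":  0.10,
    "efficiency":       0.10,
    "portability":      0.10,
    "quality_in_use":   0.10,
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (0-10) into one ranking value."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

candidate_a = dict(zip(CRITERIA_WEIGHTS, [8, 7, 9, 6, 7, 5, 8]))
candidate_b = dict(zip(CRITERIA_WEIGHTS, [7, 9, 6, 8, 6, 7, 7]))
print("A:", weighted_score(candidate_a), "B:", weighted_score(candidate_b))
```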

Relevance:

20.00%

Publisher:

Abstract:

Final Master's Project for obtaining the degree of Master in Chemical and Biological Engineering, branch of Chemical Processes.

Relevance:

20.00%

Publisher:

Abstract:

The principal topic of this work is the application of data mining techniques, in particular machine learning, to the discovery of knowledge in a protein database. The first chapter presents the general background: section 1.1 overviews the methodology of a data mining project and its main algorithms; section 1.2 gives an introduction to proteins and their supporting file formats; and section 1.3 defines the main problem we intend to address with this work: determining, in a discrete (i.e. not continuous) way, whether an amino acid is exposed or buried in a protein, for five exposure levels: 2%, 10%, 20%, 25% and 30%. The second chapter, following closely the CRISP-DM methodology, presents the whole process of building the database that supported this work: it describes the loading of data from the Protein Data Bank, DSSP and SCOP, performs an initial data exploration, and introduces a simple prediction model (baseline) of the relative solvent accessibility of an amino acid. It also introduces the Data Mining Table Creator, a program developed to produce the data mining tables required for this problem. In the third chapter the results obtained are analysed with statistical significance tests. First the classifiers used (Neural Networks, C5.0, CART and CHAID) are compared, and it is concluded that C5.0 is the most suitable for the problem at stake. The influence of parameters such as the amino acid information level, the amino acid window size and the SCOP class type on the accuracy of the predictive models is also compared. The fourth chapter starts with a brief review of the literature on amino acid relative solvent accessibility; it then overviews the main results achieved, and finally discusses possible future work. The fifth and last chapter consists of appendices: Appendix A contains the schema of the database that supported this thesis; Appendix B contains a set of tables with additional information; and Appendix C describes the software provided on the DVD accompanying this thesis, which allows the present work to be reproduced.
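
A hedged sketch of the kind of experiment described: classify a residue as buried or exposed at one accessibility threshold, using a sliding window of surrounding residues as features. The data below is a random placeholder, and a scikit-learn decision tree stands in for C5.0; the thesis itself builds its tables from PDB, DSSP and SCOP data.

```python
# Sliding-window buried/exposed classification, sketched with placeholder
# data. Real features would encode the residues in the window (the thesis
# compares several information levels); labels come from relative solvent
# accessibility at a chosen threshold (e.g. 20%).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
WINDOW = 5                                 # residues on each side of the target
n_samples = 1000
n_features = (2 * WINDOW + 1) * 20         # one encoding slot per window residue
X = rng.random((n_samples, n_features))    # placeholder feature table
y = rng.integers(0, 2, n_samples)          # 1 = exposed, 0 = buried

clf = DecisionTreeClassifier(max_depth=8)  # stand-in for C5.0
print("cv accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```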

Relevance:

20.00%

Publisher:

Abstract:

The recent trend in chip architectures towards higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as a fundamental building block for developing parallel applications. Nevertheless, although STM promises to ease concurrent and parallel software development, it relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by embedded real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we assess the use of STM in the development of embedded real-time software, arguing that the amount of contention can be reduced if read-only transactions access recent consistent data snapshots, progressing in a wait-free manner. We show how the required number of versions of a shared object can be calculated for a given set of tasks. We also outline an algorithm to manage conflicts between update transactions that prevents starvation.
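
The version-count idea can be pictured with a simple sufficient bound (an assumption made here for illustration, not necessarily the paper's exact formula): a read-only transaction stays wait-free as long as the snapshot it reads from is not recycled before it finishes, so it suffices to keep as many versions as the writers can produce during the longest reader, plus one.

```python
# Illustrative upper bound on the number of live versions of one shared
# object: count the updates that periodic writers can issue while the
# longest read-only transaction is running, plus the version the reader
# started from. Writer periods and reader length are invented values.
import math

def versions_needed(longest_reader: float, writer_periods: list) -> int:
    updates = sum(math.ceil(longest_reader / t) for t in writer_periods)
    return updates + 1

# Example: a 50 ms reader, two writers updating every 20 ms and 40 ms.
print(versions_needed(0.050, [0.020, 0.040]))  # 3 + 2 + 1 = 6 versions
```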

Relevance:

20.00%

Publisher:

Abstract:

The foreseen evolution of chip architectures towards higher numbers of heterogeneous cores, with non-uniform memory and non-coherent caches, brings renewed attention to the use of Software Transactional Memory (STM) as an alternative to lock-based synchronisation. However, STM relies on the possibility of aborting conflicting transactions to maintain data consistency, which impacts the responsiveness and timing guarantees required by real-time systems. In these systems, contention delays must be (efficiently) limited so that the response times of tasks executing transactions are upper-bounded and task sets can be feasibly scheduled. In this paper we defend the role of the transaction contention manager in reducing the number of transaction retries and in helping the real-time scheduler to assure schedulability. For this purpose, the contention management policy should be aware of on-line scheduling information.
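
A minimal sketch of a scheduling-aware contention manager in the spirit argued for here: on a conflict, the transaction belonging to the task with the later absolute deadline is aborted, so the more urgent task keeps making progress. The interface is assumed for illustration and is not taken from any particular STM implementation.

```python
# EDF-flavoured conflict resolution using on-line scheduling information
# (absolute deadlines). The caller aborts and retries the returned loser.
from dataclasses import dataclass

@dataclass
class Transaction:
    task_id: int
    deadline: float      # absolute deadline of the task running it
    retries: int = 0

def resolve_conflict(attacker: Transaction, victim: Transaction) -> Transaction:
    """Abort the transaction whose task has the later deadline."""
    loser = attacker if attacker.deadline > victim.deadline else victim
    loser.retries += 1
    return loser

t1 = Transaction(task_id=1, deadline=0.030)
t2 = Transaction(task_id=2, deadline=0.015)
print("aborted:", resolve_conflict(t1, t2).task_id)  # the later-deadline task
```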

Relevance:

20.00%

Publisher:

Abstract:

The aim of the TeleRisk Project on labour relations and professional risks in the context of teleworking in Portugal – supported by IDICT, the Institute for Development and Inspection of Working Conditions (Ministry of Labour) – is to study the practices and forms of teleworking in the manufacturing sectors in Portugal. The project also chose the software industry as a reference sector, although it does not intend to exclude from the study any other sector of activity or the so-called "hybrid" forms of work, provided the latter show some of the characteristics of telework. The project thus takes into account the so-called "traditional" sectors of activity, namely textiles and machinery and metal engineering (machinery and equipment), which are not usually associated with this type of work. In these "traditional" sectors, however, telework may take variations not found in technology-based sectors. One of the methods used to evaluate the dynamics associated with telework consisted of questionnaire surveys aimed at employers in the sectors analysed. This paper presents some of the results of those surveys. Being a preliminary analysis, it does not claim to have exhausted all the issues in the survey, but rather to show the main tendencies in the teleworking practices of Portuguese industry.

Relevance:

20.00%

Publisher:

Abstract:

Foresight and scenario-building methods can be an interesting reference for the social sciences, especially as innovative methods for labour process analysis. A scenario – the central concept of prospective analysis – can be considered a rich and detailed portrait of a plausible future world. It can be a useful tool for policy-makers to grasp problems clearly and comprehensively, and to better pinpoint challenges as well as opportunities within an overall framework. Features of the foresight methods are already being used in some labour policy-making experiences. Case studies developed in Portugal will be presented, and some conclusions will be drawn in order to organise a set of principles for foresight analysis applied to the European WORKS project, on the restructuring of work organisation in the knowledge society and on work design methods for new management structures of virtual organisations.

Relevance:

20.00%

Publisher:

Abstract:

Consider a multihop network comprising Ethernet switches. The traffic is described as a set of flows, each characterised by its source node, its destination node, its route and its parameters in the generalized multiframe model. Output queues on the Ethernet switches are scheduled by static-priority scheduling, and the tasks executing on the processor of an Ethernet switch are scheduled by stride scheduling. We present schedulability analysis for this setting.
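
The stride-scheduling side of the model can be illustrated with a short sketch (an illustration under invented ticket counts, not the paper's analysis itself): each task holds tickets, and the scheduler always runs the task with the smallest pass value, advancing it by a stride inversely proportional to its tickets.

```python
# Stride scheduling: tasks receive slots in proportion to their tickets.
import heapq

STRIDE1 = 1_000_000              # large constant; stride = STRIDE1 / tickets

def stride_schedule(tickets: dict, slots: int) -> list:
    """Return the task chosen in each scheduling slot."""
    heap = [(STRIDE1 // t, STRIDE1 // t, name) for name, t in tickets.items()]
    heapq.heapify(heap)          # ordered by (pass, stride, name)
    order = []
    for _ in range(slots):
        pass_val, stride, name = heapq.heappop(heap)
        order.append(name)
        heapq.heappush(heap, (pass_val + stride, stride, name))
    return order

# Tasks A:B:C with 3:2:1 tickets receive roughly 3:2:1 of the 12 slots.
print(stride_schedule({"A": 3, "B": 2, "C": 1}, 12))
```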

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa for obtaining the degree of Master in Informatics Engineering.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to evaluate the influence of the crushing process used to obtain recycled concrete aggregates on the performance of concrete made with those aggregates. Two crushing methods were considered: primary crushing, using a jaw crusher, and primary plus secondary crushing (PSC), using a jaw crusher followed by a hammer mill. Besides natural aggregates (NA), these two processes were also used to crush three types of concrete made in the laboratory (L20, L45 and L65) and three others from the precast industry (P20, P45 and P65). The coarse natural aggregates were totally replaced by coarse recycled concrete aggregates. The recycled aggregate concrete mixes were compared with reference mixes made only with NA, and the following properties related to mechanical and durability performance were tested: compressive strength; splitting tensile strength; modulus of elasticity; carbonation resistance; chloride penetration resistance; water absorption by capillarity; water absorption by immersion; and shrinkage. The results show that the PSC process leads to better performance, especially in the durability properties. © 2014 RILEM

Relevance:

20.00%

Publisher:

Abstract:

Given the current complexity of communication protocols, implementing all of their layers in the kernel of the operating system is too cumbersome, and it prevents the use of capabilities only available to user space processes. However, building protocols as user space processes must not impair the responsiveness of the communication. In this paper we therefore present a layer of a communication protocol which, due to its complexity, was implemented as a user space process; the lower layers of the protocol are, for responsiveness reasons, implemented in the kernel. The protocol was developed to support large-scale power-line communication (PLC) with timing requirements.
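
A rough sketch of the kernel/user-space split described here, assuming the kernel-resident lower layers hand frames to the user-space layer over a datagram socket; the endpoint and the frame processing are placeholders, not the actual PLC protocol interface.

```python
# User-space protocol layer: poll a socket fed by the kernel-resident
# lower layers, keeping per-iteration latency bounded by a short timeout.
import select
import socket

def process_frame(frame: bytes) -> bytes:
    # Placeholder for the complex, protocol-specific upper-layer logic.
    return frame                     # echo, for illustration

def serve_upper_layer(sock: socket.socket) -> None:
    """Event loop of the user-space protocol layer."""
    sock.setblocking(False)
    while True:
        ready, _, _ = select.select([sock], [], [], 0.001)   # 1 ms budget
        if ready:
            frame, peer = sock.recvfrom(2048)  # frame from the kernel layer
            reply = process_frame(frame)
            if reply:
                sock.sendto(reply, peer)

# Usage (placeholder endpoint):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.bind(("127.0.0.1", 9000))
#   serve_upper_layer(sock)
```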