824 results for decentralised data fusion framework


Relevance: 100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 100.00%

Publisher:

Abstract:

This study examines what the contemporary Brazilian reader reads, with the aim of explaining the reasons behind that reader's choices; its central objective is therefore to profile the Brazilian reader. The research corpus was assembled from the bestseller lists published in two Brazilian newspapers. The first source was Leia, a monthly periodical that circulated nationally from April 1978 to September 1991. The second was the Jornal do Brasil, a Rio de Janeiro daily that published lists of Brazil's best-selling books in its reading supplement from 1966 until December 2004, when data collection ended. Because the second newspaper suspended publication of its bestseller lists between February 1976 and April 1984, the data from the two newspapers were merged so as to cover the period from 1966 to 2004. The theoretical basis for examining the profile of the Brazilian reader was the semiotics of the Paris school. Reading was addressed through the examination of the manifestations of enunciation in discourse, the projections of the enunciator and the enunciatee, and the treatment of the passions. For each text in the corpus, we observed how these enunciative categories are projected in the texts most read by Brazilian readers and, subsequently, how this reader manifests himself as enunciator in the bestseller lists. To that end, we proposed a contrast between the ethos of the enunciator-reader of the lists and the pathos of the enunciatee of the reading discourses. Since the corpus revealed a growing preference for self-help texts, this specific question was examined... (abstract truncated; the full text is available via the electronic access link)

Relevance: 100.00%

Publisher:

Abstract:

In the industrial environment, the challenge is to make better use of productive resources: people and machines. The main goal of this work is to improve the analysis of efficiency losses at the bottleneck equipment of the stator-bar production line in an electric generator factory. The action research applied the Theory of Constraints to identify the system constraint and developed a data collection framework, organized by loss typology, for indicator measurement. The research showed the importance of standardized data collection for obtaining reliable data, and of strategic efficiency indicators for optimizing equipment. In addition, the OEE and TEEP indicators proved effective for analyzing actual efficiency while the machine is running and for identifying capacity-increase opportunities that address hidden costs in the organization, in line with continuous improvement.
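
As a hedged illustration of the indicators mentioned above: OEE is conventionally computed as Availability × Performance × Quality over planned production time, while TEEP extends it by utilization over calendar time. The figures below are invented for the example, not taken from the study.

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness: product of the three classic factors."""
    return availability * performance * quality


def teep(oee_value: float, utilization: float) -> float:
    """Total Effective Equipment Performance: OEE scaled by utilization,
    i.e. the share of calendar time actually planned for production."""
    return oee_value * utilization


# Hypothetical figures for a stator-bar bottleneck machine (not from the study).
o = oee(availability=0.90, performance=0.85, quality=0.98)
t = teep(o, utilization=0.60)
print(f"OEE = {o:.1%}, TEEP = {t:.1%}")  # OEE = 75.0%, TEEP = 45.0%
```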

Relevance: 100.00%

Publisher:

Abstract:

Computer and telecommunication networks are changing the world dramatically and will continue to do so in the foreseeable future. The Internet, primarily based on packet switches, provides very flexible data services such as e-mail and access to the World Wide Web. The Internet is a variable-delay, variable-bandwidth network that provides no guarantee on quality of service (QoS) in its initial phase. New services are being added to the pure data delivery framework of yesterday. Such high demands on capacity could lead to a “bandwidth crunch” at the core wide-area network, resulting in degradation of service quality. Fortunately, technological innovations have emerged which can provide relief to the end user to overcome the Internet’s well-known delay and bandwidth limitations. At the physical layer, a major overhaul of existing networks has been envisaged from electronic media (e.g., twisted pair and cable) to optical fibers - in wide-area, metropolitan-area, and even local-area settings. In order to exploit the immense bandwidth potential of optical fiber, interesting multiplexing techniques have been developed over the years.
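
As a rough sketch of one such multiplexing technique, wavelength-division multiplexing (WDM) scales fiber capacity by carrying independent channels on separate wavelengths; the channel count and per-channel rate below are illustrative assumptions, not figures from the text.

```python
# Back-of-the-envelope WDM capacity: the aggregate rate grows linearly
# with the number of wavelength channels. Values are illustrative only.
channels = 80                 # hypothetical dense-WDM channel count
rate_per_channel_gbps = 10    # hypothetical per-wavelength rate
aggregate_gbps = channels * rate_per_channel_gbps
print(f"{channels} channels x {rate_per_channel_gbps} Gb/s = {aggregate_gbps} Gb/s")
```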

Relevance: 100.00%

Publisher:

Abstract:

As logistics processes grow in complexity and variety, the importance of the information technologies used increases. The information accompanying, or preceding, the flow of goods is required in order to identify goods and to deploy corporate resources optimally. The classical goods-receiving process is a good example: advance notification of the quantity and type of incoming goods allows the personnel for unloading and receipt, as well as the required resources (load carriers, industrial trucks, etc.), to be planned and provided in advance. The flow of information is therefore to be understood both as a quality feature and as an economic efficiency factor. Identification technologies form the interface between the physical flow of goods and the computer-based flow of information. Identification technologies commonly used in industry generally consist of a data carrier and a reading device. The data carrier is fixed to the physical object. The reading device reads the object information stored on the data carrier and converts it into a binary code that downstream IT systems can process. The identification technology currently most widely used in industry and retail is the barcode. In recent years, RFID technology has moved into the focus of industry and retail in the area of material flow and logistics. “Radio Frequency IDentification” refers to communication by radio waves between a data carrier (transponder) and a reader. Unlike the barcode, RFID allows the user to read information from the transponder without line of sight, and individual items do not need to be oriented. In addition, certain transponder types can store far larger amounts of data than a barcode. Transponders with high storage capacity are generally suited to updating the data stored on them as needed, so a decentralized data organization becomes feasible. A further advantage of RFID is the ability to read several data carriers in a fraction of a second, known as bulk reading. This property is of particular interest in goods receiving and dispatch: with RFID, load units such as pallets of goods can be conveyed through an antenna gate, and the transponder-tagged items identified and transferred to the IT system. Besides the functionality of such a technology, industry is primarily concerned with economic viability. Transponders are currently more expensive than barcodes, and investments in the hardware and software required to operate RFID must also be factored in. The use of RFID must therefore yield savings through the reorganization of business processes. A current weak point of RFID, depending on the application, is the limited reliability and repeatability of bulk reads. Industry and retail need identification technologies whose read rates are close to 100%. The danger is that an unreliable RFID system can produce incomplete or erroneous data records, and correcting the data can cost more than the savings achieved by reorganizing processes with RFID.
The read rate of transponders in bulk reading is influenced by several factors, which are presented in detail below. The Institut für Fördertechnik und Logistik (IFT) in Stuttgart is investigating possible factors influencing detection rates in bulk reading. The findings are intended to eliminate potential weaknesses in the detection of multiple transponders before RFID is implemented in a company's logistics processes.
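
To see why near-100% per-tag read rates matter for bulk reading, consider a simple sketch under an independence assumption of ours (not the text's): if each transponder in a pallet is detected with probability p, the chance that all n tags are captured is p^n, which collapses quickly for large pallets.

```python
# Probability that an entire pallet of n transponders is read, assuming
# independent per-tag read probability p. Illustrative assumption only.
def complete_read_probability(p: float, n: int) -> float:
    return p ** n

for p in (0.99, 0.999):
    for n in (10, 50, 100):
        print(f"p={p}, n={n}: P(all read) = {complete_read_probability(p, n):.3f}")
# Even p = 0.99 per tag yields only ~0.366 for a 100-tag pallet.
```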

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE To evaluate technical complications and failures of zirconia-based fixed prostheses supported by implants. MATERIALS AND METHODS Consecutive patients received zirconia-based single crowns (SCs) and fixed dental prostheses (FDPs) on implants in a private clinical setting between 2005 and 2010. One dentist performed all surgical and prosthetic procedures, and one master technician performed and coordinated all laboratory procedures. One-piece computer-aided design/computer-assisted manufacture technology was used to fabricate abutments and frameworks, which were directly connected at the implant level, where possible. All patients were involved in a recall maintenance program and were finally reviewed in 2012. Data on framework fractures, chipping of veneering ceramics, and other technical complications were recorded. The primary endpoint was failure of the prostheses, i.e., the need for a complete remake. A life table analysis was calculated. RESULTS A total of 289 implants supported 193 zirconia-based prostheses (120 SCs and 73 FDPs) in 127 patients (51 men, 76 women; average age: 62.5 ± 13.4 years) who were reviewed in 2012. Twenty-five (13%) prostheses were cemented on 44 zirconia abutments and 168 (87%) prostheses were screw-retained directly at the implant level. Fracture of 3 frameworks (1 SC, 2 FDPs) was recorded, and significant chipping resulted in the remake of 3 prostheses (1 SC, 2 FDPs). The 7-year cumulative survival rate was 96.4% ± 1.99%. Minor complications comprised 5 loose screws (these were retightened), small chips associated with 3 prostheses (these were polished), and dislodgement of 3 prostheses (these were recemented). Overall, 176 prostheses remained free of technical problems. CONCLUSIONS Zirconia-based prostheses screwed directly to implants are clinically successful in the short and medium term.
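
The life table analysis mentioned above can be sketched as follows: cumulative survival is the running product of per-interval survival rates, with withdrawals counted as half at risk (the actuarial convention). The interval data below are invented for illustration and are not taken from the study.

```python
# Actuarial life table: cumulative survival as a running product of
# per-interval survival probabilities. Interval data are hypothetical.
intervals = [
    # (prostheses entering, failures, withdrawals) per follow-up year
    (193, 2, 10),
    (181, 1, 15),
    (165, 2, 20),
    (143, 1, 25),
]

cumulative = 1.0
for entering, failures, withdrawn in intervals:
    at_risk = entering - withdrawn / 2.0   # actuarial half-withdrawal correction
    cumulative *= 1.0 - failures / at_risk
    print(f"cumulative survival = {cumulative:.3f}")
```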

Relevance: 100.00%

Publisher:

Abstract:

Digital atlases of animal development provide a quantitative description of morphogenesis, opening the path toward the modeling of these processes. Prototypic atlases offer a data integration framework in which to gather information from cohorts of individuals with phenotypic variability. Relevant information for further theoretical reconstruction includes measurements in time and space of cell behaviors and gene expression. The latter, as well as data integration into a prototypic model, relies on image processing strategies. Developing the tools to integrate and analyze multidimensional biological data is highly relevant for assessing chemical toxicity or performing preclinical drug testing. This article surveys some of the most prominent efforts to assemble these prototypes, categorizes them according to salient criteria, and discusses the key questions in the field and the future challenges toward the reconstruction of multiscale dynamics in model organisms.

Relevance: 100.00%

Publisher:

Abstract:

In this paper, a method based mainly on data fusion and artificial neural networks is proposed to classify concentrations of one of the most important pollutants, particulate matter less than 10 micrometers in diameter (PM10). The main objective is to classify the pollutant concentration into two pollution levels (Non-Contingency and Contingency). Pollutant concentrations and meteorological variables are combined to build a Representative Vector (RV) of pollution, which is used to train an artificial neural network to classify pollution events determined by meteorological variables. In the experiments, real time series gathered from the Automatic Environmental Monitoring Network (AEMN) in Salamanca, Guanajuato, Mexico were used. The method can help establish a better air quality monitoring methodology, which is essential for assessing the effectiveness of imposed pollution controls and strategies and for facilitating pollutant reduction.
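
A minimal sketch of the kind of pipeline described above, under our own assumptions: the feature set, contingency threshold, synthetic data, and use of scikit-learn are illustrative stand-ins, not the paper's exact representative vector or network.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical "representative vector": PM10 plus meteorological variables
# (e.g., wind speed, temperature, humidity). Synthetic stand-in data.
X = rng.normal(size=(500, 4))
pm10 = 80 + 30 * X[:, 0]                 # synthetic PM10 series (ug/m^3)
y = (pm10 > 120).astype(int)             # 1 = Contingency, 0 = Non-Contingency

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```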

Relevance: 100.00%

Publisher:

Abstract:

In this work we provide a theoretical basis (syntax and semantics) and a practical implementation of a framework for encoding reasoning over a fuzzy representation of the world (as human beings understand it). The interest in this work comes from two sources: removing the complexity that arises when implementing such reasoning in a general-purpose programming language (one without special constructions for representing fuzzy information), and providing a tool intelligent enough to answer expressive queries over conventional data in a constructive way. The framework, RFuzzy, allows rules and queries to be encoded in a syntax very close to the natural language humans use to express their thoughts, but it is more than that. 
It can encode very useful concepts: fuzzifications (functions that turn crisp concepts into fuzzy ones), default values (used to provide less adequate but still valid results when the information needed for better ones is missing), similarity between attributes (used to search for individuals with a characteristic similar to the one sought), and synonyms or antonyms; it also allows the set of connectives and modifiers (including negation) usable in rules and queries to be extended. Another facility included is the personalization of fuzzy concept definitions, which is very useful for dealing with the subjective character of fuzziness, where qualifying someone as “tall” depends on the height of the person making the judgment. Besides this, RFuzzy implements multi-adjoint semantics. Their interest is that, in addition to deriving the degree of satisfaction of a consequent from a rule, its credibility, and the degree of satisfaction of its antecedents, we can determine from a set of data how much credibility must be assigned to a rule to model the behaviour of that data; the credibility of a rule for a particular situation can thus be obtained automatically. Although the theoretical contribution is interesting in itself, especially the inclusion of the negation modifier, its practical uses are equally important. Among the different uses given to the framework, we highlight emotion recognition, RoboCup robot control, granularity control in parallel/distributed computing, and flexible searches in databases.
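
A hedged sketch of the fuzzification and negation-modifier ideas described above. RFuzzy itself is a logic-programming framework; this Python rendering and the thresholds are ours, chosen only to make the concepts concrete.

```python
def tall(height_cm: float, low: float = 160.0, high: float = 190.0) -> float:
    """Fuzzification: map a crisp height to a truth degree in [0, 1].
    The thresholds are illustrative and could be personalized per observer,
    mirroring the subjective character of 'tall' discussed in the text."""
    if height_cm <= low:
        return 0.0
    if height_cm >= high:
        return 1.0
    return (height_cm - low) / (high - low)


def neg(degree: float) -> float:
    """A standard negation modifier: not_tall(x) = 1 - tall(x)."""
    return 1.0 - degree


print(tall(175))        # 0.5
print(neg(tall(175)))   # 0.5
```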

Relevance: 100.00%

Publisher:

Abstract:

PURPOSE The decision-making process plays a key role in organizations. Every decision-making process produces a final choice that may or may not prompt action. Decision makers recurrently face a dichotomous question: follow a traditional sequential decision-making process, where the output of one decision is used as the input to the next stage, or follow a joint decision-making approach, where several decisions are taken simultaneously. The chosen approach will impact different players in the organization, and the choice remains difficult even with the current literature and practitioners' knowledge. The pursuit of better ways of making decisions has been a common goal for academics and practitioners. Management scientists use different techniques and approaches to improve different types of decisions, the purpose being to use the available resources (data and techniques) as well as possible to achieve the objectives of the organization. Developing and applying models and concepts may help solve the managerial problems faced every day in different companies. As a result of this research, different decision models are presented as contributions to the body of knowledge of management science. The first models focus on the manufacturing industry and the second group on the health care industry. Although these models are case specific, they serve to exemplify that different approaches to the problems can provide interesting results. Unfortunately, there is no universal recipe that can be applied to all problems; furthermore, the same model can deliver good results with certain data and bad results with other data. A framework for analysing the data before selecting the model to be used is therefore presented and tested on the models developed to exemplify these ideas. METHODOLOGY As the first step of the research, a systematic literature review on joint decision making is presented, along with the opinions and suggestions of different scholars. In the next stage of the thesis, the decision-making processes of more than 50 companies from different sectors were analysed in the production planning area at the job shop level; the data were obtained using surveys and face-to-face interviews. The following part of the research was carried out in two application fields that are highly relevant for our society: manufacturing and health care. The first step was to study the interactions and develop a mathematical model for the replenishment of car assembly, combining the vehicle routing problem with inventory management. The next step added the car production scheduling (car sequencing) decision and used metaheuristics such as ant colony optimization and genetic algorithms to test whether the behaviour holds across problem instances of different sizes. A similar approach is presented for the production of semiconductors and aviation parts, where a hoist has to move from one station to another to perform the work and a job schedule has to be produced; for this problem, however, simulation was used for experimentation. In parallel, the scheduling of operating rooms was studied: surgeries were allocated to surgeons and the scheduling of operating rooms was analysed. The first part of this research was carried out in a teaching hospital, and in the second part the interaction of uncertainty was added. 
Once the previous problems had been analysed, a general framework to characterize the instances was built. A general conclusion is presented in the final chapter. FINDINGS AND PRACTICAL IMPLICATIONS The first contribution is an updated literature review on decision making, together with an analysis of the possible savings resulting from a change in the decision process. The survey results are then presented, which reveal a lack of consistency between what managers believe and the actual degree of integration of their decisions. The next stage of the thesis contributes to the body of knowledge of operations research with the joint solution of the replenishment, sequencing, and inventory problem on the assembly line, together with parallel work on operating room scheduling, where different solution approaches are presented. Beyond the contribution of the solution methods and the use of different techniques, the main contribution is the framework proposed to pre-evaluate a problem before choosing the techniques to solve it. There is, however, no straightforward answer as to whether joint or sequential solutions are better; following the proposed framework and evaluating factors such as the flexibility of the answer, the number of actors, and the tightness of the data gives important hints as to the most suitable direction for tackling the problem. RESEARCH LIMITATIONS AND AVENUES FOR FUTURE RESEARCH In the first part of the work it was very difficult to calculate the possible savings of different projects, since many papers do not report these quantities or base the impact on non-quantifiable benefits; another issue is the confidentiality of many projects, whose data cannot be presented. For the car assembly line problem, more computational power would allow bigger instances to be solved. For the operating room problem, there was a lack of historical data with which to perform a parallel analysis in the teaching hospital. To keep testing the decision framework it is necessary to apply it to more case studies, so as to generalize the results and make them more evident and less ambiguous. The health care field offers great opportunities: despite recent awareness of the need to improve the decision-making process, there is still much room for improvement, and, unlike in the automotive industry, recent improvements are not spread among all the actors. Future research will therefore focus more on collaboration between academia and the health care sector.
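
A toy illustration of the joint-versus-sequential dichotomy discussed above (the cost function, its interaction term, and all values are invented): optimizing two coupled decisions one after the other can lock in a choice that a joint search would avoid.

```python
from itertools import product

# Hypothetical coupled cost for two decisions a and b (say, a sequencing
# choice and a replenishment choice); the term -2*a*b couples them.
def cost(a: int, b: int) -> float:
    return (a - 3) ** 2 + (b - 2) ** 2 - 2 * a * b

A, B = range(5), range(5)

# Sequential: fix a by its stand-alone term, then choose b given a.
a_seq = min(A, key=lambda a: (a - 3) ** 2)
b_seq = min(B, key=lambda b: cost(a_seq, b))

# Joint: search both decisions together.
a_joint, b_joint = min(product(A, B), key=lambda ab: cost(*ab))

print("sequential:", (a_seq, b_seq), "cost =", cost(a_seq, b_seq))          # (3, 4), -20
print("joint     :", (a_joint, b_joint), "cost =", cost(a_joint, b_joint))  # (4, 4), -27
```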

Relevance: 100.00%

Publisher:

Abstract:

We are grateful to all those who helped with sample collection, including the MEDITS survey programme (IEO Mallorca) for Mediterranean samples. Portugal mainland samples were collected under the EU Data Collection Framework (DCF, PNAB). Azores specimens from the Department of Oceanography and Fisheries (DOP) of the University of the Azores (UAc) were collected under the project DEMERSAIS “Monitorização das espécies demersais dos Açores”, financed by the Azorean government, and the project DEECON “Unravelling population connectivity for sustainable fisheries in the Deep Sea”, approved by the European Science Foundation (ESF) under the EUROCORES programme (proposal No 06-EuroDEEP-FP-008 & SFRH-EuroDEEP/0002/2007). This study was funded by the UK Natural Environment Research Council (NERC) Oceans 2025 Strategic Research Programme Theme 6 (Science for Sustainable Marine Resources). REB was supported by the Fisheries Society of the British Isles (FSBI), BSP was funded by the Fundação para a Ciência e a Tecnologia (SFRH/BPD/72351/2010), and JL was supported by The Alasdair Downes Marine Conservation Fund.

Relevance: 100.00%

Publisher:

Abstract:

Acknowledgments The authors would like to thank M. N. Cueto and J. M. Antonio (ECOBIOMAR) for their excellent technical support, and Rodrigo López for making the map of the study area. We also thank the personnel of the IEO in Vigo for providing information about shad captures at sea, collected under the national programme (AMDES) included in the European Data Collection Framework (DCF) project. We are also grateful to the Comandancia Naval de Tui for providing fishing data. M. Bao is supported by a PhD grant from the University of Aberdeen and by financial support from the EU Project PARASITE (grant number 312068). This study was partially supported by a PhD grant from the Portuguese Foundation for Science and Technology (FCT; SFRH/BD/44892/2008) and partially supported by the European Regional Development Fund (ERDF) through the COMPETE Operational Competitiveness Programme and national funds through the Foundation for Science and Technology (FCT), under the project PEst-C/MAR/LA0015/2013. The authors thank the staff of the Station of Hydrobiology of the USC “Encoro do Con” for their participation in the surveys. This work has been partially supported by the project 10PXIB2111059PR of the Xunta de Galicia and the project MIGRANET of the Interreg IV B SUDOE (South-West Europe) Territorial Cooperation Programme (SOE2/P2/E288). D. J. Nachón is supported by a PhD grant from the Xunta de Galicia (PRE/2011/198).

Relevance: 100.00%

Publisher:

Abstract:

The use of unmanned aerial vehicles (UAVs) has become increasingly common, particularly in civil applications. In the military scenario, UAVs have focused on accomplishing specific missions that fall into two broad categories: remote sensing and transport of military materiel. This work concentrates on the remote sensing category. It focuses on defining a model and a reference architecture for the development of intelligent, mission-oriented sensors, the main objective of these missions being the generation of thematic maps, and it investigates processes and mechanisms that make this category of maps possible. In this context, the concept of a MOSA (Mission Oriented Sensor Array) is proposed and modeled. As case studies of the presented concepts, two automatic sound-source mapping systems are proposed, one for the civil case and one for the military case. The sources may be noise generated by large animals (including humans), by vehicle internal combustion engines, or by artillery activity (including hunters). The MOSAs modeled for this application are based on the integration of data from a thermal imaging sensor and a network of ground-based acoustic sensors. The integration of the positioning information provided by these sensors into a single cartographic base is one of the important aspects addressed in this work. The main contributions are the proposal of MOSA systems, including concepts, models, and architecture, and the reference implementation represented by the automatic sound-source mapping system.
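
A hedged sketch of the position-integration step described above, under our own simplifying assumptions: two independent Gaussian position estimates already expressed in a shared cartographic frame, fused by inverse-variance weighting. The thesis's actual method is not reproduced here; this only illustrates the idea of combining thermal and acoustic fixes on one map.

```python
import numpy as np

def fuse(mean_a, var_a, mean_b, var_b):
    """Inverse-variance (minimum-variance) fusion of two independent
    position estimates expressed in the same cartographic frame."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused_var = 1.0 / (w_a + w_b)
    fused_mean = fused_var * (w_a * np.asarray(mean_a) + w_b * np.asarray(mean_b))
    return fused_mean, fused_var

# Hypothetical estimates of one sound source (map coordinates in metres):
thermal = ([120.0, 45.0], 9.0)    # thermal-imaging fix, variance 9 m^2
acoustic = ([123.0, 44.0], 25.0)  # acoustic-array fix, variance 25 m^2

pos, var = fuse(*thermal, *acoustic)
print(pos, var)  # fused estimate lies closer to the lower-variance sensor
```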

Relevance: 100.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-06