869 results for NETWORK DESIGN PROBLEMS
Resumo:
In the past, the focus of drainage design was on sizing pipes and storages in order to provide sufficient network capacity. This traditional approach, together with computer software and technical guidance, was successful for many years. However, due to rapid population growth and urbanisation, the requirements of a “good” drainage design have also changed significantly. In addition to water management, other aspects such as environmental impacts, amenity values and carbon footprint have to be considered during the design process. Going forward, we need to address the key sustainability issues carefully and practically. The key challenge of moving from simple objectives (e.g. capacity and costs) to complicated objectives (e.g. capacity, flood risk, environment, amenity, etc.) is the difficulty of striking a balance between the various objectives and of justifying potential benefits and compromises. In order to assist decision makers, we developed a new decision support system for drainage design. The system consists of two main components – a multi-criteria evaluation framework for drainage systems and a multi-objective optimisation tool. The evaluation framework is used for the quantification of performance, life-cycle costs and benefits of different drainage systems. The optimisation tool can search for feasible combinations of design parameters, such as the sizes, order and type of drainage components, that maximise multiple benefits. In this paper, we discuss real-world applications of the decision support system. A number of case studies have been developed based on recent drainage projects in China. We use the case studies to illustrate how the evaluation framework highlights and compares the pros and cons of various design options. We also discuss how the design parameters can be optimised based on the preferences of decision makers. The work described here is the output of an EngD project funded by EPSRC and XP Solutions.
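The multi-objective search described above can be illustrated with a minimal Pareto-front filter. This is a hedged sketch, not the authors' optimisation tool: the candidate designs, their costs and their benefit scores are all invented for illustration.

```python
# Illustrative sketch: reduce a set of candidate drainage designs to its
# Pareto front over two hypothetical objectives, capital cost (minimise)
# and overall benefit score (maximise). All figures are invented.

def pareto_front(designs):
    """Keep only designs not dominated on (cost, benefit)."""
    front = []
    for d in designs:
        dominated = any(
            o["cost"] <= d["cost"] and o["benefit"] >= d["benefit"]
            and (o["cost"] < d["cost"] or o["benefit"] > d["benefit"])
            for o in designs
        )
        if not dominated:
            front.append(d)
    return front

# Hypothetical pipe-size / storage combinations.
candidates = [
    {"name": "large pipes",     "cost": 9.0, "benefit": 0.90},
    {"name": "small pipes",     "cost": 4.0, "benefit": 0.55},
    {"name": "pipes + storage", "cost": 6.5, "benefit": 0.85},
    {"name": "oversized both",  "cost": 9.5, "benefit": 0.88},  # dominated
]
front = pareto_front(candidates)
```

In a preference-driven variant, the decision makers' weightings would then select a single design from this front.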
Resumo:
The cluster provides a closer commercial relationship between the companies that comprise it. This encourages companies to adopt competitive structures that allow them to solve problems they could hardly tackle alone (Lubeck et al., 2011). Accordingly, this paper aims to describe the coopetition between companies operating in a planned commercial cluster, from the point of view of retailers, taking as a basis the theoretical models proposed by Bengtsson and Kock (1999) and Leon (2005), operationalized by means of Social Network Analysis (SNA). Data collection consisted of two phases: the first, exploratory, served to identify the actors, while the second was descriptive, aiming to characterize the coopetition among the enterprises. As a result, we identified the companies that cooperate and compete simultaneously (coopetition), firms that only compete, companies that only cooperate, and businesses that neither compete nor cooperate (coexistence).
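The four categories identified in the abstract follow directly from two binary relations per pair of firms. A minimal sketch (the firm names are invented; the actual study derives these relations from SNA data):

```python
# Classify each pair relation by whether the firms cooperate and/or compete,
# following the fourfold typology in the abstract. Firm names are invented.

def classify(cooperates: bool, competes: bool) -> str:
    if cooperates and competes:
        return "coopetition"
    if competes:
        return "competition"
    if cooperates:
        return "cooperation"
    return "coexistence"

relations = {
    ("FirmA", "FirmB"): (True, True),    # cooperate and compete
    ("FirmA", "FirmC"): (False, True),   # compete only
    ("FirmB", "FirmC"): (True, False),   # cooperate only
    ("FirmC", "FirmD"): (False, False),  # neither
}
labels = {pair: classify(c, k) for pair, (c, k) in relations.items()}
```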
Resumo:
This work emphasizes the inclusion of uncertainties in the evaluation of structural behaviour, aiming at a better representation of the system's characteristics and a quantification of the significance of these uncertainties in design. Comparisons are made between existing classical reliability analysis techniques, such as FORM, Direct Monte Carlo Simulation (MC) and Monte Carlo Simulation with Adaptive Importance Sampling (MCIS), and the approximate methods of Response Surface (RS) and Artificial Neural Networks (ANN). Where possible, the comparisons highlight the advantages and drawbacks of each technique in problems of increasing complexity. The formulations analysed range from explicit limit state functions to implicit formulations with spatial variability of loading and material properties, including stochastic fields. Particular attention is given to the reliability analysis of reinforced concrete structures, including the effect of the spatial variability of their properties. To this end, a finite element model is proposed for the representation of reinforced concrete that incorporates the main characteristics observed in this material. A model was also developed for the generation of non-Gaussian multidimensional stochastic fields for the material properties, independent of the finite element mesh, and techniques were implemented to accelerate the structural evaluations required by any of the techniques employed. For reliability assessment via the Response Surface technique, the algorithm developed by Rajashekhar et al. (1993) was implemented. For Artificial Neural Networks, codes were developed for the simulation of multilayer perceptron networks and radial basis function networks, which were then incorporated into the reliability assessment algorithm developed by Shao et al. (1997).
In general, the simulation techniques showed rather poor performance on the more complex problems, whereas the first-order technique FORM and the approximate Response Surface and Artificial Neural Network techniques stood out, although with accuracy impaired by the approximations involved.
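As a minimal illustration of the simplest technique compared above, direct Monte Carlo simulation estimates the failure probability of an explicit limit state g = R - S (resistance minus load) by sampling. The distribution parameters below are invented and the sketch is not the thesis' implementation:

```python
# Direct Monte Carlo reliability sketch on the explicit limit state
# g(R, S) = R - S: failure occurs when g < 0. Means and standard
# deviations are invented example values.
import random

random.seed(0)

def g(r, s):
    return r - s  # limit state: failure when negative

N = 100_000
failures = sum(
    1
    for _ in range(N)
    if g(random.gauss(mu=5.0, sigma=0.5), random.gauss(mu=3.0, sigma=0.5)) < 0
)
pf = failures / N  # estimated probability of failure
```

For this Gaussian case the exact answer is available in closed form, which is what makes it a useful check; adaptive importance sampling (MCIS) aims to reach the same estimate with far fewer samples.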
Resumo:
Electronic applications are currently developed under the reuse-based paradigm. This design methodology presents several advantages for the reduction of design complexity, but brings new challenges for the test of the final circuit. The access to embedded cores, the integration of several test methods, and the optimization of the several cost factors are just a few of the problems that need to be tackled during test planning. Within this context, this thesis proposes two test planning approaches that aim at reducing the test costs of a core-based system by means of hardware reuse and integration of the test planning into the design flow. The first approach considers systems whose cores are connected directly or through a functional bus. The test planning method consists of a comprehensive model that includes the definition of a multi-mode access mechanism inside the chip and a search algorithm for the exploration of the design space. The access mechanism model considers the reuse of functional connections as well as partial test buses, core transparency, and other bypass modes. The test schedule is defined in conjunction with the access mechanism so that good trade-offs among the costs of pins, area, and test time can be sought. Furthermore, system power constraints are also considered. This expansion of concerns makes an efficient, yet fine-grained, search possible in the huge design space of a reuse-based environment. Experimental results clearly show the variety of trade-offs that can be explored using the proposed model, and its effectiveness in optimizing the system test plan. Networks-on-chip are likely to become the main communication platform of systems-on-chip. Thus, the second approach presented in this work proposes the reuse of the on-chip network for the test of the cores embedded into the systems that use this communication platform.
A power-aware test scheduling algorithm aiming at exploiting the network characteristics to minimize the system test time is presented. The reuse strategy is evaluated considering a number of system configurations, such as different positions of the cores in the network, power consumption constraints and number of interfaces with the tester. Experimental results show that the parallelization capability of the network can be exploited to reduce the system test time, whereas area and pin overhead are strongly minimized. In this manuscript, the main problems of the test of core-based systems are firstly identified and the current solutions are discussed. The problems being tackled by this thesis are then listed and the test planning approaches are detailed. Both test planning techniques are validated for the recently released ITC’02 SoC Test Benchmarks, and further compared to other test planning methods of the literature. This comparison confirms the efficiency of the proposed methods.
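The power-aware scheduling idea can be sketched as a greedy packing of core tests into parallel sessions under a power budget. This is an illustrative simplification, not the thesis' algorithm; core names, test times and power figures are invented:

```python
# Greedy power-constrained test scheduling sketch: cores tested in the same
# session run in parallel, so their power adds up and must stay under the
# budget; the session takes as long as its longest core test.

def schedule(cores, power_budget):
    """cores: list of (name, test_time, test_power). Longest tests first;
    place each core in the first session with enough power headroom."""
    sessions = []  # each: {"cores": [...], "power": sum, "time": max}
    for name, t, p in sorted(cores, key=lambda c: -c[1]):
        for s in sessions:
            if s["power"] + p <= power_budget:
                s["cores"].append(name)
                s["power"] += p
                s["time"] = max(s["time"], t)
                break
        else:
            sessions.append({"cores": [name], "power": p, "time": t})
    total_time = sum(s["time"] for s in sessions)  # sessions run back to back
    return sessions, total_time

cores = [("cpu", 100, 40), ("dsp", 80, 35), ("mem", 60, 30), ("io", 20, 10)]
sessions, total_time = schedule(cores, power_budget=80)
```

The real problem is harder — access-mechanism sharing and network contention constrain which cores may overlap — but the power-budget packing above is the core trade-off.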
Resumo:
Over the past two decades there has been a profusion of empirical studies of organizational design and its relationship to the efficiency, productivity and flexibility of an organization. In parallel, there has been a wide range of studies about innovation management in different kinds of industries and firms. However, with some exceptions, the organizational and innovation management bodies of literature tend to examine the issues of organizational design and innovation management individually, mainly in the context of large firms operating at the technological frontier. There seems to be a scarcity of empirical studies that bring together organizational design and innovation and examine them empirically and over time in the context of small and medium-sized enterprises. This dissertation seeks to provide a small contribution in that direction. This dissertation examines the dynamic relationship between organizational design and innovation. This relationship is examined on the basis of a single-case design in a medium-sized mechanical engineering company in Germany. The time period covered ranges from 1958 to 2009, although the actual focus falls on the recent past. This dissertation draws on first-hand qualitative empirical evidence gathered through extensive field work. The main findings are: 1. There is always a bundle of organizational dimensions which impacts innovation. These main organizational design dimensions are: (1) Strategy & Leadership, (2) Resources & Capabilities, (3) Structure, (4) Culture, (5) Networks & Partnerships, (6) Processes and (7) Knowledge Management. However, the importance of the different organizational design dimensions changes over time. While, for example, a simple organizational design was appropriate for the production of simple, standardized parts, the company needed a more advanced organizational design in order to be able to produce customized, complex parts with high quality.
Hence the technological maturity of a company is related to its organizational maturity. 2. The introduction of innovations in the analyzed company was highly dependent on organizational conditions which enabled their introduction. The results of the long-term case study show that some innovations would not have been introduced successfully if organizational elements such as training and qualification, the building of networks and partnerships, or the acquisition of appropriate resources and capabilities had not been in place. Hence it can be concluded that organizational design is an enabling factor for innovation. These findings advance our understanding of the complex relationship between organizational design and innovation and highlight the growing importance of a comprehensive, innovation-stimulating organizational design of companies. The results suggest to managers that innovation is not dependent on a single organizational factor but on the appropriate, comprehensive design of the organization. Hence managers should consider reviewing the design of their organizations regularly in order to maintain an innovation-stimulating environment.
Resumo:
The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues which were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD Frameworks and object-oriented frameworks. The former concept was coined in the late 80's within the electronic design automation community and comprises a layered software environment which aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model to build extensible and reusable object-oriented software subsystems. In this work, we propose to create an object-oriented framework which includes extensible sets of design data primitives and design tool building blocks. Such an object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.
The implemented CAD Framework - named Cave2 - followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the use of the object-oriented framework foundations allowed a series of improvements which were not available in previous approaches: - object-oriented frameworks are extensible by design, so this should also be true of the implemented sets of design data primitives and design tool building blocks. This means that both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and that such extensions and adaptations will still inherit the architectural and functional aspects implemented in the object-oriented framework foundation; - the design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows for different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings; - the control of the consistency between semantics and visualization - a particularly important issue in a design environment with multiple views of a single design - is also included in the foundations of the object-oriented framework. Such mechanism is generic enough to be also used by further extensions of the design data model, as it is based on the inversion of control between view and semantics. The view receives the user input and propagates such event to the semantic model, which evaluates if a state change is possible. If positive, it triggers the change of state of both semantics and view.
Our approach took advantage of such inversion of control and included a layer between semantics and view to take into account the possibility of multi-view consistency; - to optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his/her respective design views. The information about each interaction is encapsulated inside an event object, which may be propagated to the design semantics - and thus to other possible views - according to the consistency policy which is being used. Furthermore, the use of event pools allows for a late synchronization between view and semantics in case of unavailability of a network connection between them; - the use of proxy objects significantly raised the abstraction of the integration of design automation resources, as either remote or local tools and services are accessed through method calls in a local object. The connection to remote tools and services using a look-up protocol also completely abstracted the network location of such resources, allowing for resource addition and removal during runtime; - the implemented CAD Framework is completely based on Java technology, so it relies on the Java Virtual Machine as the layer which grants the independence between the CAD Framework and the operating system. All such improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for the remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase the group awareness and allow a richer transfer of experiences among them, improving significantly the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first one uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second one extends the foundations of the implemented object-oriented framework to support interface-based design. Such extensions - design representation primitives and tool blocks - are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study regards the possibility of integration of multimedia metadata into the design data model. This possibility is explored in the context of an online educational and training platform.
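The inversion of control between view and semantics described above can be sketched with a small observer-style model: the view forwards user events to the semantic model, which validates the state change and, only if valid, updates itself and refreshes every attached view. Class and method names here are illustrative assumptions, not the Cave2 API:

```python
# Sketch of view/semantics inversion of control with multi-view consistency.
# The view never mutates itself directly; the semantic model decides.

class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def is_valid(self, key, value):
        return value is not None          # toy validation rule

    def handle_event(self, key, value):
        if self.is_valid(key, value):     # semantics evaluates the change
            self.state[key] = value       # update semantics first...
            for v in self.views:          # ...then propagate to every view
                v.refresh(key, value)
            return True
        return False

class View:
    def __init__(self, model):
        self.shown = {}
        model.attach(self)

    def user_input(self, model, key, value):
        # inversion of control: forward the event instead of self-updating
        return model.handle_event(key, value)

    def refresh(self, key, value):
        self.shown[key] = value

model = SemanticModel()
v1, v2 = View(model), View(model)
accepted = v1.user_input(model, "width", 10)   # valid: both views refresh
rejected = v1.user_input(model, "width", None) # invalid: nothing changes
```

Event pools for late synchronization would sit between `user_input` and `handle_event`, queueing event objects while the network link is down.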
Resumo:
The most important issues in auction design are the traditional concerns of competition policy: preventing collusive, predatory, and entry-deterring behaviour. Ascending and uniform-price auctions are particularly vulnerable to these problems, and the Anglo-Dutch auction, a hybrid of the sealed-bid and ascending auctions, may often perform better. Effective anti-trust policy is also critical. However, everything depends on the details of the context; the circumstances of the recent U.K. mobile-phone license auction made an ascending format ideal, but this author (and others) correctly predicted the same format would fail in the Netherlands and elsewhere. Auction design is not "one size fits all". We also discuss the 3G spectrum auctions in Germany, Italy, Austria and Switzerland, and football TV-rights, TV franchise and other radio-spectrum auctions, electricity markets, and takeover battles.
Resumo:
This dissertation sets out to map the sociotechnical networks of design in the field of management along the lines proposed by Actor-Network Theory, and to present the translation process that the term underwent as it entered the field. To this end, it surveyed and analysed articles published on the topic in the main journals of the organizations area. These texts show how, in recent decades, design has undergone an expansion of meaning and application towards management (or of management towards design), through the approaches known as design thinking, design science or design process. The research is justified since this subject is present in the main journals of management and organization studies as an important tool for solving problems that challenge organizational systems, such as change, entrepreneurship and innovation (Stephens & Boland, 2014). It is worth highlighting that design has increasingly been considered a decisive activity in the economic battle (Callon, 1986), in shaping current lifestyles and in the construction of our future world. In the field of organization studies, as this research has shown, design emerges as an approach that overcomes the dichotomy between positivism and the critical approach in organizational theory (Jelinek, Romme & Boland, 2008). Finally, this dissertation focused on the mapping of the sociotechnical networks and on the description of the four main phases of the translation process of design in the field of management, namely: (a) problematization, marked by the publication of The Sciences of the Artificial in 1969 by Herbert A. Simon, in which he argues for design as a basic skill for all professional specialities, including management (Simon, 1996); (b) interessement, with designers advocating the design of complex systems such as organizations; (c) enrolment, with designers and organization theorists together championing design in management as an alternative for overcoming the dichotomy between positivism and critical studies in administration; and (d) mobilization, in which organization theorists come to the defence of design in management as a way of accounting for contemporary organizational models with more permeable and constantly reshaped boundaries.
Resumo:
In this dissertation, different ways of combining neural predictive models or neural-based forecasts are discussed. The proposed approaches consider mostly Gaussian radial basis function networks, which can be efficiently identified and estimated through recursive/adaptive methods. Two different ways of combining are explored to get a final estimate – model mixing and model synthesis –, with the aim of obtaining improvements both in terms of efficiency and effectiveness. In the context of model mixing, the usual framework for linearly combining estimates from different models is extended, to deal with the case where the forecast errors from those models are correlated. In the context of model synthesis, and to address the problems raised by heavily nonstationary time series, we propose hybrid dynamic models for more advanced time series forecasting, composed of a dynamic trend regressive model (or, even, a dynamic harmonic regressive model), and a Gaussian radial basis function network. Additionally, using the model mixing procedure, two approaches for decision-making from forecasting models are discussed and compared: either inferring decisions from combined predictive estimates, or combining prescriptive solutions derived from different forecasting models. Finally, the application of some of the models and methods proposed previously is illustrated with two case studies, based on time series from finance and from tourism.
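For the model-mixing case, the classical minimum-variance linear combination of forecasts with correlated errors uses weights w = Σ⁻¹1 / (1ᵀΣ⁻¹1), where Σ is the forecast error covariance matrix. A sketch with an invented 2×2 covariance (this illustrates the standard result the dissertation extends, not its extended framework itself):

```python
# Minimum-variance combination of two forecasts whose errors are correlated.
# Weights are proportional to the row sums of the inverse covariance matrix.

def combine_weights_2x2(cov):
    """Closed-form Σ⁻¹ for a 2x2 covariance, then w = Σ⁻¹1 / (1ᵀΣ⁻¹1)."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    raw = [sum(row) for row in inv]   # Σ⁻¹ 1
    total = sum(raw)                  # 1ᵀ Σ⁻¹ 1
    return [r / total for r in raw]

cov = [[1.0, 0.3],     # invented error variances/covariance:
       [0.3, 2.0]]     # model 1 is more accurate than model 2
w = combine_weights_2x2(cov)
combined = w[0] * 10.2 + w[1] * 9.6  # combine two example point forecasts
```

As expected, the lower-variance model receives the larger weight; with uncorrelated errors this reduces to inverse-variance weighting.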
Resumo:
This study aims to analyze and compare the opinions of professionals, managers and users about mental health care in the Family Health Strategy (FHS). It is characterized as Operations Research or Health Systems Research with a cross-sectional design and a descriptive quantitative nature. The study was developed from the application of an Opinion Measurement Scale combined with observation techniques and structured interviews in the city of Parnamirim/RN. The sample consists of 409 subjects: 209 professionals of the Family Health Strategy, 30 of the Oral Health Strategy, 19 of the Family Health Support Center and 24 directors of Basic Health Units, plus 68 users with mental disorders and 59 caregivers, respecting the ethical parameters of Resolution 196/96 of the National Health Council (trial registration number CAAE 0003.0.051.000-11). Quantitative data were submitted to Epi Info 3.5.2 for analysis. The mental health network in Parnamirim involves the flow between the FHS, Psychosocial Care Centers, clinics and hospitals; its main barriers are the fragility of the referral and counter-referral system, of the municipal health conferences and of the FHS teams, owing to limitations in material and human resources, as well as the population's lack of knowledge about the organization of the mental health network, issues that affect integral care. Even though the FHS professionals recognize the importance of their actions, they question their role in mental health care, experiencing difficulties in accessing psychiatric services (76.5%). Although most agree that the mentally ill are better treated in the family than in hospital (65.2%), the community health workers, the professionals in closest contact with the family, were the category that most partially or totally disagreed with this statement (40.8%). Nevertheless, the caregivers miss the support of the FHS, as the main focus of attention is on prescription control.
The views of professionals, mental patients and caregivers converged on several statements, revealing the main weaknesses to be addressed by the city's mental health network, such as the perceptions that: (a) physical strength is needed to take care of mental patients because of their tendency to aggression, requiring them to stay in the sanatorium as a danger to society; (b) only a psychiatrist can help a person with emotional problems; (c) the user of alcohol and drugs does not necessarily develop mental illness; (d) there are access barriers and doubts about the quality of psychiatric services; and (e) caring for a mental health patient does not bring suffering to professionals. Therefore, the commitment to consensus building and the monitoring and evaluation of the network are important mechanisms for an effective management system, reflecting the importance of strengthening the health conferences and bringing different institutions closer together. The results reinforce the importance of strengthening primary care through continuing education programs focusing on the actions and functions of professionals in accordance with their competences and duties, which contributes to the organization and responsiveness of mental health care, favoring users' care and the promotion of family health.
Resumo:
This work proposes a hardware architecture, described in VHDL, developed to embed an Artificial Neural Network (ANN) of the Multilayer Perceptron (MLP) type. The architecture is conceived so that ANN applications can easily embed several different MLP topologies for the industrial field. The MLP topology in which the architecture is configured is defined by a simple, specific data input (instructions) that determines the number of layers and of perceptrons in the network. In order to support several MLP topologies, datapath components and a controller were developed to execute these instructions. Thus, a user defines a set of previously known instructions which determine the ANN characteristics. The system guarantees the MLP execution through the neural processors (perceptrons), the datapath components and the controller that were developed. On the other hand, the biases and weights must be static: the ANN to be embedded must have been trained previously, off-line. Knowledge of the system's internal characteristics or of the VHDL language is not required from the user. A reconfigurable FPGA device was used to implement, simulate and test the whole system, allowing application to several real everyday problems.
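Functionally, such an architecture evaluates a fixed-weight MLP forward pass: weights and biases are trained off-line, frozen, and the embedded datapath only computes inference. A software sketch of that computation (the topology, weights and sigmoid activation are invented examples; the actual design is in VHDL, not Python):

```python
# Fixed-weight MLP inference sketch: no training, only the forward pass,
# mirroring an embedded ANN whose weights were trained off-line.
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_forward(x, layers):
    """layers: list of (weight_matrix, bias_vector), trained off-line."""
    for W, b in layers:
        x = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x

# A 2-2-1 topology, analogous to instruction-defined layer/perceptron counts.
layers = [
    ([[ 4.0,  4.0], [-4.0, -4.0]], [-2.0,  6.0]),  # hidden layer
    ([[ 4.0,  4.0]],               [-6.0]),        # output layer
]
out = mlp_forward([1.0, 0.0], layers)[0]
```

In the hardware version, each perceptron's multiply-accumulate and activation would be a datapath unit, with the controller sequencing layers according to the topology instructions.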
Resumo:
This work was motivated by the importance of studying vehicle emissions in captive diesel-engine fleets, coupled with a predictive maintenance plan. This type of maintenance includes techniques designed to meet the growing market demand to reduce maintenance costs by increasing the reliability of diagnoses, which has increased interest in automated predictive maintenance of diesel engines, preventing routine problems from evolving into serious situations solvable only with complex and costly repairs. Reliability Centered Maintenance is the methodology through which this goal is pursued, besides keeping the vehicles regulated with respect to fuel consumption and emissions. To this end, technical improvements were estimated that are capable of penetrating the automotive market and of providing the fleet with opacity emission rates for the vehicles, which are directly related to the condition of the lubricating oil, thus contributing to reducing maintenance costs, significantly reducing pollutant emissions and improving the air in large cities.
This criterion was adopted and implemented in 241 buses and produced a diagnosis of possible failures through the correlation between the characterization of used lubricating oils and the opacity analysis, with the objective of aiding the detection and solution of failures for the maintenance of sub-systems according to design criteria. To this end, a deductive methodology for determining potential causes of failures was automated to implement a predictive maintenance system. Our study used a mobile unit equipped with an opacimeter and a kit for the collection and analysis of lubricating oil; for the construction of the diagnostic network, a computer program on the Microsoft Office Access 2007 platform was used, an indispensable tool for creating the database. This method has been used and successfully implemented in seven (7) bus companies in the city of Natal (RN), Brazil.
Resumo:
With the increase in water pollution in recent years, much progress has been made in research on the treatment of contaminated waters. For wastewaters containing highly toxic organic compounds, to which biological treatment cannot be applied, Advanced Oxidation Processes (AOP) are an alternative for the degradation of non-biodegradable and toxic organic substances, because these processes are based on the generation of the hydroxyl radical, a highly reactive species able to degrade practically all classes of organic compounds. In general, AOP require special ultraviolet (UV) lamps in the reactors. These lamps have a high electric power demand, which constitutes one of the largest obstacles to the application of these processes on an industrial scale. This work involves the development of a new photochemical reactor composed of 12 low-cost black-light fluorescent lamps (SYLVANIA, black light, 40 W) as the UV radiation source. The process studied was the photo-Fenton system, a combination of ferrous ions, hydrogen peroxide and UV radiation, which was employed for the degradation of a synthetic wastewater containing phenol, one of the main pollutants in the petroleum industry, as the model pollutant. Preliminary experiments were carried out to estimate the operational conditions of the reactor, as well as the effects of the intensity of the radiation source and of the lamp distribution in the reactor. Samples were collected during the experiments and analyzed to determine the dissolved organic carbon (DOC) content, using a Shimadzu 5000A TOC analyzer. High Performance Liquid Chromatography (HPLC) was also used for the identification of the catechol and hydroquinone formed during the degradation of the phenol. Actinometry indicated a photon flow of 9.06×10^18 photons·s⁻¹ for 12 active lamps.
A factorial experimental design was elaborated, from which it was possible to evaluate the influence of the reactant concentrations (Fe2+ and H2O2) and to determine the most favorable experimental conditions ([Fe2+] = 1.6 mM and [H2O2] = 150.5 mM). It was verified that increasing the ferrous ion concentration favours the process up to a limit, beyond which a further increase has a negative effect. H2O2 exhibited a positive effect, although at high concentrations the degradation rate reaches a maximum. The mathematical modeling of the process was accomplished using the artificial neural network technique.
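The way main effects are read off a two-level factorial design such as the one above can be sketched as follows; the coded runs and degradation responses are invented, not the paper's data:

```python
# Two-level (2^2) factorial design sketch: each factor is coded -1 (low) or
# +1 (high); the main effect of a factor is the mean response at its high
# level minus the mean response at its low level. Responses are invented.

runs = [
    {"fe": -1, "h2o2": -1, "y": 42.0},  # y: e.g. % DOC removal
    {"fe": +1, "h2o2": -1, "y": 58.0},
    {"fe": -1, "h2o2": +1, "y": 55.0},
    {"fe": +1, "h2o2": +1, "y": 75.0},
]

def main_effect(runs, factor):
    hi = [r["y"] for r in runs if r[factor] == +1]
    lo = [r["y"] for r in runs if r[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

fe_effect = main_effect(runs, "fe")      # effect of raising [Fe2+]
h2o2_effect = main_effect(runs, "h2o2")  # effect of raising [H2O2]
```

A full analysis would also estimate the Fe2+×H2O2 interaction and replicate runs to judge significance, which is how the favourable conditions quoted above would be selected.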
Resumo:
Several positioning techniques have been developed to exploit the GPS capability of providing precise coordinates in real time. However, a significant problem for all techniques is the ionospheric effect and the tropospheric refraction. Recent research in Brazil, at São Paulo State University (UNESP), has been trying to tackle these problems. In relation to the ionospheric effects, a model named Mod_Ion has been developed. Concerning tropospheric refraction, a Numerical Weather Prediction (NWP) model has been used to compute the zenithal tropospheric delay (ZTD). These two models have been integrated with two positioning methods, DGPS (Differential GPS) and network RTK (Real Time Kinematic), both under investigation at São Paulo State University (UNESP), Brazil. The in-house DGPS software has already been finalized and has provided very good results. The network RTK software is still under development; therefore, only preliminary results from this method, using the VRS (Virtual Reference Station) concept, are presented.
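The DGPS principle used by the in-house software can be sketched in toy form: a reference station at known coordinates measures the error in each satellite's pseudorange and broadcasts it as a correction that the rover subtracts. All numbers below are invented and the real processing chain (clocks, ephemerides, Mod_Ion, ZTD) is far richer:

```python
# Toy DGPS correction sketch. The base station knows its true geometric range
# to each satellite, so the difference from its measured pseudorange is the
# common error, broadcast to the rover as a per-satellite correction.

known_range = {"G01": 20_000_000.0, "G07": 21_500_000.0}    # true ranges (m)
base_measured = {"G01": 20_000_004.2, "G07": 21_500_003.1}  # base pseudoranges

corrections = {sv: base_measured[sv] - known_range[sv] for sv in known_range}

rover_measured = {"G01": 22_300_004.0, "G07": 19_800_002.9}
rover_corrected = {sv: rover_measured[sv] - corrections[sv]
                   for sv in rover_measured}
```

The assumption is that atmospheric and satellite errors are strongly correlated between base and rover, which holds over short baselines; network RTK extends the same idea with interpolated corrections from several stations (the VRS concept).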
Resumo:
Neural networks and the wavelet transform have recently been seen as attractive tools for developing efficient solutions to many real-world function approximation problems. Function approximation is a very important task in environments where computation has to be based on extracting information from data samples of real-world processes. Thus, mathematical modelling is a very important tool to guarantee the development of the neural network area. In this article we introduce a series of mathematical demonstrations that guarantee the wavelet properties of the PPS functions. As an application, we show the use of PPS-wavelets in pattern recognition problems of handwritten digits through function approximation techniques.
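As a generic illustration of wavelet-based function approximation (using the simple Haar wavelet, not the PPS construction discussed in the article): one decomposition step splits a sampled signal into coarse averages and details; keeping both reconstructs the samples exactly, while dropping the details yields a coarse approximation.

```python
# One-level Haar wavelet decomposition of a sampled signal. Averages carry the
# coarse approximation, details the fine structure; reconstruction is exact.

def haar_step(signal):
    avgs = [(a + b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    dets = [(a - b) / 2 for a, b in zip(signal[::2], signal[1::2])]
    return avgs, dets

def haar_inverse(avgs, dets):
    out = []
    for a, d in zip(avgs, dets):
        out += [a + d, a - d]   # inverts the average/detail split exactly
    return out

samples = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]  # invented samples
avgs, dets = haar_step(samples)
reconstructed = haar_inverse(avgs, dets)
approx = haar_inverse(avgs, [0.0] * len(dets))  # coarse approximation only
```

Families like the PPS-wavelets replace the Haar averaging/differencing pair with basis functions having better smoothness and approximation properties, but the decompose/approximate/reconstruct pattern is the same.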