983 results for Pascal


Relevance:

10.00%

Publisher:

Abstract:

Optimal location on the transport infrastructure is a key requirement for many decision-making processes. Most studies have focused on evaluating the performance of optimally locating p facilities by minimizing their distances to a geographically distributed demand (n) as p and n vary. The optimal locations are also sensitive to the geographical context, such as the road network, especially when demand is asymmetrically distributed in the plane. The influence of varying road network density is, however, not a well-studied problem, especially in a real-world context. This paper investigates how the density level of the road network affects finding optimal locations by solving the specific case of the p-median location problem. A denser network is found to be needed when a higher number of facilities are to be located. The best solution is not always obtained in the most detailed network but at an intermediate density level. The solutions do not improve, or improve only insignificantly, once the density exceeds 12,000 nodes; some solutions even deteriorate. The hierarchy of network densities can be used according to location and transportation purposes and can increase the efficiency of heuristic methods. The method in this study can be applied to other location-allocation problems in transportation analysis where the road network density can be differentiated.
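
As a point of reference for the p-median model discussed in this and the following abstracts, a standard formulation is given below; the notation (w_i, d_ij and the sets N, C) is assumed here for illustration and is not taken from the paper itself:

\min_{S \subseteq C,\; |S| = p} \; \sum_{i \in N} w_i \, \min_{j \in S} d_{ij}

where N is the set of demand points with weights w_i, C the set of candidate nodes in the road network, d_ij the network distance from demand point i to candidate node j, and S the chosen set of p facility locations. A denser road network changes both the candidate set C and the distances d_ij, which is why the density level affects the optimal solution.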

Relevance:

10.00%

Publisher:

Abstract:

The p-median problem is often used to locate p service facilities for a geographically distributed population. The distance measure is important for the performance of such a model, and it can vary if the accuracy of the road network varies. The first aim of this study is to analyze how the optimal location solutions vary, using the p-median model, when the road network is altered. It is hard to find an exact optimal solution for p-median problems; therefore, two heuristics are applied in this study: simulated annealing and a classic heuristic. The secondary aim is to compare the optimal location solutions obtained with different algorithms for a large p-median problem. The investigation is conducted by means of a case study in a rural region with an asymmetrically distributed population, Dalecarlia. The study shows that the use of more accurate road networks gives better solutions for optimal location, regardless of which algorithm is used and regardless of how many service facilities are optimized for. It is also shown that the simulated annealing algorithm is not only much faster than the classic heuristic used here, but in most cases also gives better location solutions.
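
The abstract does not include code, but the simulated annealing heuristic it refers to can be illustrated with a minimal swap-based sketch for the p-median objective. Everything below (function names, cooling schedule, parameter values) is an assumption for illustration, not the authors' implementation.

import math
import random

def pmedian_cost(facilities, demand, dist):
    # Sum of weighted distances from each demand point to its nearest open facility.
    return sum(w * min(dist[i][j] for j in facilities) for i, w in demand.items())

def simulated_annealing(candidates, demand, dist, p, t0=1.0, cooling=0.995, iters=20000):
    # Swap-based simulated annealing for the p-median objective (illustrative only).
    current = random.sample(candidates, p)
    cost = pmedian_cost(current, demand, dist)
    best, best_cost = list(current), cost
    t = t0
    for _ in range(iters):
        # Neighbour move: close one open facility, open one closed candidate.
        out = random.choice(current)
        inn = random.choice([c for c in candidates if c not in current])
        trial = [c for c in current if c != out] + [inn]
        trial_cost = pmedian_cost(trial, demand, dist)
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if trial_cost < cost or random.random() < math.exp((cost - trial_cost) / t):
            current, cost = trial, trial_cost
            if cost < best_cost:
                best, best_cost = list(current), cost
        t *= cooling  # geometric cooling schedule
    return best, best_cost

Here candidates is a list of candidate node ids, demand a mapping from demand point to weight, and dist a nested mapping of network distances; a "classic heuristic" in this setting would typically be a deterministic interchange procedure rather than the randomized acceptance rule above.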

Relevance:

10.00%

Publisher:

Abstract:

Good data quality with high complexity is often seen as important. Intuition says that the higher the accuracy and complexity of the data, the better the analytic solutions become, provided the increasing computing time can be handled. However, for most practical computational problems, high-complexity data means that computation times become too long or that the heuristics used to solve the problem have difficulties reaching good solutions. This is stressed even further as the size of the combinatorial problem increases. Consequently, we often need simplified data to deal with complex combinatorial problems. In this study we address the question of how the complexity and accuracy of a network affect the quality of heuristic solutions for different sizes of the combinatorial problem. We evaluate this question by applying the commonly used p-median model, which finds optimal locations in a network for p supply points serving n demand points. To do so, we vary both the accuracy (the number of nodes) of the network and the size of the combinatorial problem (p). The investigation is conducted by means of a case study in a region of Sweden with an asymmetrically distributed population (15,000 weighted demand points), Dalecarlia. To locate 5 to 50 supply points we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. To find the optimal location we start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000 (aggregated from the 1.5 million nodes). To find the optimal solution we use a simulated annealing algorithm with adaptive tuning of the temperature. The results show that there is only limited improvement in the optimal solutions when the accuracy of the road network increases and the combinatorial problem is simple (low p). When the combinatorial problem is complex (large p), the improvements from increasing the accuracy of the road network are much larger. The results also show that the choice of the best network accuracy depends on the complexity of the combinatorial problem (varying p).
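
The abstract mentions simulated annealing with adaptive tuning of the temperature but does not spell out the rule. One common choice, sketched below purely as an assumption and not as the paper's scheme, adjusts the temperature so that the acceptance rate of proposed swaps stays near a target value.

def adapt_temperature(t, accept_rate, target=0.3, factor=1.05):
    # Illustrative acceptance-rate feedback: warm up when too few proposed swaps
    # are accepted, cool down when too many are. Not the paper's exact scheme.
    if accept_rate < target:
        return t * factor   # too cold: allow more uphill moves
    return t / factor       # too hot: tighten the search

# Inside an annealing loop one might re-tune every `batch` iterations, e.g.
#   if step % batch == 0:
#       t = adapt_temperature(t, accepted_in_batch / batch)
#       accepted_in_batch = 0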

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we propose a new method for solving large-scale p-median problem instances based on real data. We compare different approaches in terms of runtime, memory footprint and quality of the solutions obtained. In order to test the different methods on real data, we introduce a new benchmark for the p-median problem based on real Swedish data. Because of the size of the problem addressed, up to 1938 candidate nodes, a number of algorithms, both exact and heuristic, are considered. We also propose an improved hybrid version of a genetic algorithm called impGA. Experiments show that impGA performs as well as other methods on the standard set of medium-size problems taken from Beasley's benchmark, and produces comparatively good results in terms of quality, runtime and memory footprint on our specific benchmark based on real Swedish data.
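
The details of impGA are not given in the abstract; as an illustration of the kind of hybrid genetic algorithm it refers to, the sketch below combines a standard GA over p-element facility sets with a vertex-substitution local-improvement step. All names and parameters are assumptions for illustration, not the authors' code.

import random

def ga_pmedian(candidates, demand, dist, p, pop_size=40, generations=200, mut_rate=0.2):
    # Illustrative hybrid GA for the p-median problem: solutions are p-element sets of
    # candidate nodes; crossover mixes parents, mutation swaps one facility, and a
    # greedy interchange step locally improves the best offspring in each generation.
    def cost(sol):
        return sum(w * min(dist[i][j] for j in sol) for i, w in demand.items())

    def crossover(a, b):
        pool = list(set(a) | set(b))
        return random.sample(pool, p)

    def mutate(sol):
        out = random.choice(sol)
        inn = random.choice([c for c in candidates if c not in sol])
        return [c for c in sol if c != out] + [inn]

    def local_improve(sol):
        # One pass of vertex substitution (Teitz-Bart style interchange).
        best, best_cost = sol, cost(sol)
        for out in sol:
            for inn in candidates:
                if inn in sol:
                    continue
                trial = [c for c in sol if c != out] + [inn]
                trial_cost = cost(trial)
                if trial_cost < best_cost:
                    best, best_cost = trial, trial_cost
        return best

    population = [random.sample(candidates, p) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=cost)
        elite = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            child = crossover(*random.sample(elite, 2))
            if random.random() < mut_rate:
                child = mutate(child)
            children.append(child)
        children[0] = local_improve(children[0])  # the "hybrid" local-search step
        population = elite + children
    return min(population, key=cost)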

Relevance:

10.00%

Publisher:

Abstract:

Regarding the location of a facility, the presumption in the widely used p-median model is that the customer opts for the shortest route to the nearest facility. However, this assumption is problematic in free markets, since the customer is presumed to gravitate to a facility according to both its distance and its attractiveness. The recently introduced gravity p-median model offers an extension of the p-median model that accounts for this. The model is therefore potentially interesting, although it had not yet been implemented and tested empirically. In this paper, we implement the model in an empirical problem of locating vehicle inspections, locksmiths, and retail stores of vehicle spare parts, in order to investigate its superiority to the p-median model. We find, however, that the gravity p-median model is of limited use for the problem of locating facilities, as it either gives solutions similar to the p-median model or gives unstable solutions due to a non-concave objective function.
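
For readers unfamiliar with the gravity p-median model, a common Huff-type statement of its objective is shown below; the exponential decay form and the notation are assumptions for illustration and may differ in detail from the formulation used in the paper:

\min_{S,\; |S| = p} \; \sum_{i \in N} w_i \sum_{j \in S} d_{ij} \, \frac{A_j \, e^{-\lambda d_{ij}}}{\sum_{k \in S} A_k \, e^{-\lambda d_{ik}}}

where w_i is the demand at customer point i, d_ij the distance to facility j, A_j the attractiveness of facility j, and \lambda a distance-decay parameter. With equal attractiveness and large \lambda each customer effectively patronises only the nearest facility and the objective collapses to the ordinary p-median; the weighted mixture over all open facilities is also what makes the objective non-concave and the solutions unstable, as noted above.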

Relevance:

10.00%

Publisher:

Abstract:

Imagine, if you will, the near-perfect state-funded program. Yes, I know, such words sound like an oxymoron, given the track record of both state- and federally-funded programs, past, present and undoubtedly in the future. Indeed, such words sound almost mythological in light of recent attempts by the federal government to spend us out of the current recession with still doubtful results (so far, a record deficit). Yet you're an imaginative individual and can put aside petty political persiflage and visualize such a program. Not only does this program do precisely what it said it would do, it does it so surprisingly well that, as a taxpayer, you're completely astonished and wholeheartedly impressed.

Relevance:

10.00%

Publisher:

Abstract:

While PASCAL meets all the requirements of a collaborative funding source, the Palmetto State has still starved it of funds.

Relevance:

10.00%

Publisher:

Abstract:

This study discusses aspects related to the choice of the first programming language in computer science curricula, with particular interest in Pascal and Java. The former is widely adopted to teach programming to novices, while the latter is gaining popularity as a modern, comprehensive language that can be used in many courses throughout an undergraduate computing degree, as a tool for teaching everything from basic programming constructs to more advanced topics. Although several problems with teaching Java as the first programming language can be pointed out, we consider Java a good choice, since (a) it supports important conceptual and technological issues, and (b) it is possible to work around some of the complexities of the Java language and platform to make them more suitable for beginning students. Moreover, considering the great popularity of Pascal in computing curricula, an eventual adoption of Java leads to another problem: the lack of instructors able to teach object-oriented programming. We suggest that this migration from Pascal to Java be addressed through simplification of the program development environment, the use of a package of classes that facilitate input and output, and the development of a comparative catalogue of programs implemented in both languages. This study also presents JEduc, a very simple IDE whose goal is to support teaching the object-oriented Java programming language to novices. It offers components developed in Java that integrate editing, compiling, and running Java programs. Beyond the functionality common to an IDE, JEduc was designed to act as a pedagogical tool: it simplifies most compiler messages and JRE errors, allows the insertion of command skeletons, and incorporates special packages to hide some undesirable syntactic and semantic details.

Relevance:

10.00%

Publisher:

Abstract:

The success of the Internet as a platform for distributing information systems encourages organizations to make the services of their legacy systems available in that environment. Part of these systems was developed in the early phase of client/server database application development, using visual environments with WIMP-style graphical interfaces, implemented under the procedural/structured paradigm, based on objects and events. As a consequence, the resulting legacy systems are difficult to maintain, evolve, and adapt to new technologies and architectures, since the projects usually did not follow the sound precepts and modern practices advocated in Software Engineering. The goal of this work is to propose a methodology for migrating legacy systems with the characteristics described above to the Web platform. The proposed migration process highlights two strategies: the elaboration of conceptual class models of the application, and the treatment given to the user interface, both to be used in the reconstruction of a new application. The process is based on reverse engineering techniques and methods that aim to obtain abstractions through static and dynamic analysis of the application. In the dynamic analysis, a mechanism stands out for recovering aspects of the functional requirements of the legacy system and representing them in a tool called UC/Re (Use Case for Reengineering). All artifacts generated during the process can be stored in a repository representing the metamodels built in the methodology. To delimit and exemplify the process, the Delphi environment (with the Object Pascal language) was chosen as the programming language domain of the legacy software. A CASE environment is also proposed, describing the operation of a prototype that automates a large part of the functionality discussed in the process steps. Some third-party tools are employed in the redocumentation of the legacy system and in the elaboration of the UML models of the new system. A case study, presenting a specific functionality of a system developed in Delphi under the procedural paradigm, is used to demonstrate the prototype and serves as an example for validating the process. As a result of the process using the prototype, the conceptual class model of the new application is obtained in XMI format (the standard format for exporting UML models), along with HTML page templates representing the visual components of the original interface on the Web platform.

Relevance:

10.00%

Publisher:

Abstract:

In 2003, the Brazilian government (the Lula administration) began a new phase in its housing history by intensifying the construction of social housing in Brazil. This growth had repercussions both in the cities and in the countryside, and in Rio Grande do Norte it was marked by the large-scale production of housing estates under government programmes. To make these transformations viable, political, financial, and management instruments were articulated jointly, using the repetition of a building typology as a model, accompanied by the reproduction of a single morphology in social housing construction. To understand this process, we introduce an urbanistic and socio-economic investigation of the social housing problem in Brazil, seeking to relate the technical aspects to the complementary historical, professional, and cultural questions. Our analysis seeks to identify how official management and financing policies (administered for the most part by the Caixa Econômica Federal, CEF) influence the design process, producing the typological and morphological repetitions already mentioned. Based on direct observation of two different experiences of rural social housing in Rio Grande do Norte, we also show certain limitations and possibilities of the social actors, faced with the official agents and policies for social housing in Brazil, in proposing alternatives to the standardized solutions that characterize the outcome of the projects financed and managed by the CEF. Our main theoretical and methodological references are Nabil Bonduki (1998), David Harvey (2009, 1982), Henri Lefebvre (1970), Ermínia Maricato (2010, 2009, 2000, 1987) and Raquel Rolnik (2010, 2009, 2008, 1997).

Relevance:

10.00%

Publisher:

Abstract:

An automatic procedure with a high current-density anodic electrodissolution unit (HDAE) is proposed for the determination of aluminium, copper and zinc in non-ferrous alloys by flame atomic absorption spectrometry, based on direct solid analysis. It consists of solenoid valve-based commutation in a flow-injection system for on-line sample electrodissolution and calibration with one multi-element standard, an electrolytic cell equipped with two electrodes (a silver needle acts as cathode and the sample as anode), and an intelligent unit. The latter is assembled in a PC-compatible microcomputer for instrument control and for data acquisition and processing. General management of the process is achieved with software written in Pascal. Electrolyte compositions, flow rates, commutation times, applied current and electrolysis time were investigated. A 0.5 mol l(-1) HNO3 solution was selected as electrolyte and 300 A/cm(2) as the continuous current pulse. The performance of the proposed system was evaluated by analysing aluminium in Al-alloy samples, and copper/zinc in brass and bronze samples, respectively. The system handles about 50 samples per hour. Results are precise (R.S.D. < 2%) and in agreement with those obtained by ICP-AES and spectrophotometry at the 95% confidence level.