964 results for "script-driven test program generation process"
Abstract:
Polymeric microdrops of low-viscosity elastic fluids have been generated in T-shaped microfluidic devices using a cross-flow shear-induced drop generation process. Dilute (c/c* ≈ 0.5) aqueous solutions of polyethylene oxide (PEO) of various molecular weights (3 × 10^5 to 2 × 10^6 g/mol) were used as the drop phase fluids whilst silicone oils (5 mPa s
Abstract:
Very large spatially referenced datasets, for example those derived from satellite-based sensors that sample across the globe or from large monitoring networks of individual sensors, are becoming increasingly common and more widely available for use in environmental decision making. In large or dense sensor networks, huge quantities of data can be collected over small time periods. In many applications the generation of maps, or predictions at specific locations, from the data in (near) real time is crucial. Geostatistical operations such as interpolation are vital in this map-generation process, and in emergency situations the resulting predictions need to be available almost instantly, so that decision makers can make informed decisions and define risk and evacuation zones. It is also helpful in less time-critical applications, for example when interacting directly with the data for exploratory analysis, for the algorithms to be responsive within a reasonable time frame. Performing geostatistical analysis on such large spatial datasets can present a number of problems, particularly in the case where maximum likelihood estimation is used. Although the storage requirements scale only linearly with the number of observations in the dataset, the computational costs in terms of memory and speed scale quadratically and cubically, respectively. Most modern commodity hardware has at least two processor cores, if not more, and other mechanisms for parallel computation, such as Grid-based systems, are also becoming increasingly commonly available. However, there currently seems to be little interest in exploiting this extra processing power within the context of geostatistics. In this paper we review the existing parallel approaches for geostatistics.
By recognising that different natural parallelisms exist and can be exploited depending on whether the dataset is sparsely or densely sampled with respect to the range of variation, we introduce two contrasting novel implementations of parallel algorithms based on approximating the data likelihood, extending the methods of Vecchia [1988] and Tresp [2000]. Using parallel maximum likelihood variogram estimation and parallel prediction algorithms, we show that computational time can be significantly reduced. We demonstrate this with both sparsely and densely sampled data on a variety of architectures, ranging from the common dual-core processor found in many modern desktop computers to large multi-node supercomputers. To highlight the strengths and weaknesses of the different methods we employ synthetic datasets, and go on to show how the methods allow maximum likelihood based inference on the exhaustive Walker Lake dataset.
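The cubic cost of exact Gaussian maximum likelihood, and the way a Vecchia [1988]-style approximation splits the likelihood into independently computable (hence parallelisable) terms, can be sketched as below. The exponential covariance, the coordinate ordering, and all function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def exp_cov(x, variance=1.0, range_=1.0):
    """Exponential covariance matrix for 1-D locations x."""
    d = np.abs(x[:, None] - x[None, :])
    return variance * np.exp(-d / range_)

def exact_loglik(y, x, variance=1.0, range_=1.0):
    """Exact zero-mean Gaussian log-likelihood: O(n^3) time, O(n^2) memory."""
    K = exp_cov(x, variance, range_)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, y)          # y' K^{-1} y = alpha' alpha
    return (-0.5 * (alpha @ alpha)
            - np.log(np.diag(L)).sum()     # -0.5 * log|K|
            - 0.5 * len(y) * np.log(2 * np.pi))

def vecchia_loglik(y, x, m=5, variance=1.0, range_=1.0):
    """Vecchia-style approximation: condition each observation on at most
    m previous neighbours.  Each term touches only an (m+1)x(m+1) matrix,
    and the terms are independent, so the sum parallelises naturally."""
    order = np.argsort(x)                  # simple coordinate ordering
    y, x = y[order], x[order]
    total = 0.0
    for i in range(len(y)):
        j = slice(max(0, i - m), i)        # conditioning set
        K = exp_cov(np.append(x[j], x[i]), variance, range_)
        if i == 0:
            mu, var = 0.0, K[0, 0]
        else:
            Kjj, kji = K[:-1, :-1], K[:-1, -1]
            w = np.linalg.solve(Kjj, kji)
            mu = w @ y[j]
            var = K[-1, -1] - w @ kji
        total += -0.5 * ((y[i] - mu) ** 2 / var + np.log(2 * np.pi * var))
    return total
```

With m at least n - 1 the full chain rule is recovered and the approximation equals the exact likelihood; smaller m trades accuracy for per-term cost.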
Abstract:
Fast pyrolysis of biomass produces a liquid bio-oil that can be used for electricity generation. Bio-oil can be stored and transported, so it is possible to decouple the pyrolysis process from the generation process, allowing each process to be optimised separately. An understanding of the transport costs involved is necessary in order to carry out techno-economic assessments of combinations of remote pyrolysis plants and generation plants. Published fixed and variable costs for freight haulage have been used to calculate the transport cost for trucks running between field stores and a pyrolysis plant. It was found that the key parameter for estimating these costs was the number of round trips a day a truck could make, rather than the distance covered. This zone-costing approach was used to estimate the transport costs for a range of pyrolysis plant sizes for willow woodchips and baled miscanthus. The possibility of saving transport costs by producing bio-oil near the field stores and transporting the bio-oil to a central plant was investigated; it was found that this would only be cost-effective for large generation plants.
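The round-trip accounting described above (daily trips capped by shift length, so short hauls are dominated by fixed daily cost rather than distance) can be sketched as follows. Every parameter value (payload, speed, shift length, cost rates) is an illustrative assumption, not a figure from the study.

```python
def cost_per_tonne(distance_km, payload_t=24.0, avg_speed_kmh=60.0,
                   load_unload_h=1.0, shift_h=9.0,
                   fixed_cost_per_day=350.0, variable_cost_per_km=1.2):
    """Haulage cost per tonne delivered from a field store to a plant.

    The driver's shift caps the number of round trips per day, so the
    dominant term for short hauls is the daily fixed cost spread over
    (trips * payload) rather than the distance-proportional cost."""
    round_trip_h = 2 * distance_km / avg_speed_kmh + load_unload_h
    trips_per_day = max(1, int(shift_h // round_trip_h))
    tonnes_per_day = trips_per_day * payload_t
    daily_cost = (fixed_cost_per_day
                  + variable_cost_per_km * 2 * distance_km * trips_per_day)
    return daily_cost / tonnes_per_day
```

Because `trips_per_day` is an integer, the cost per tonne rises in steps as distance grows, which is the "round trips per day" effect the abstract highlights.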
Abstract:
A large number of studies have been devoted to modeling the contents of, and interactions between, users on Twitter. In this paper, we propose a method inspired by Social Role Theory (SRT), which assumes that a user behaves differently in different roles during the generation of Twitter content. We consider the two most distinctive social roles on Twitter: the originator, who posts original messages, and the propagator, who retweets or forwards messages from others. In addition, we also consider role-specific social interactions, especially implicit interactions between users who share common interests. All of these elements are integrated into a novel regularized topic model. We evaluate the proposed method on real Twitter data. The results show that our method is more effective than existing ones that do not distinguish social roles. Copyright 2013 ACM.
Abstract:
The project “Reference in Discourse” deals with the selection of a specific object from a visual scene in a natural language situation. The goal of this research is to explain this everyday discourse reference task in terms of a concept generation process based on subconceptual visual and verbal information. The system OINC (Object Identification in Natural Communicators) aims at solving this problem in a psychologically adequate way. The system’s difficulties occurring with incomplete and deviant descriptions correspond to the data from experiments with human subjects. The results of these experiments are reported.
Abstract:
Krasimir Manev, Neli Maneva, Haralambi Haralambiev - The business rules (BR) approach was introduced at the end of the last century to ease the specification of enterprise software and to let it better meet the needs of the respective business. Today most of the goals of the approach have been achieved, but the efforts, in both research and practice, to establish a "formal basis for the reverse extraction of BR from existing systems" continue. This paper presents an approach for extracting BR from program code based on static code analysis methods. Some advantages and disadvantages of such an approach are outlined.
Abstract:
Focuses on continuing education and its relation to the professional training of librarians. Its objective is to reflect on how to integrate continuous training into the education of this professional in the Brazilian context, since theoretical and practical considerations point to the need for greater incorporation of methodological procedures into library practice. This is because empirical operations have become insufficient to guarantee the quality of the information generated within information systems. Based on action and reflection in this context, it was identified that programs enabling the in-service training of this professional need to be adopted, in order to better qualify the process of generating, transferring, and using information.
Abstract:
The dynamic interaction of vehicles and bridges induces live loads in bridges that are greater than the vehicles' static weight. To limit this dynamic effect, the Iowa Department of Transportation (DOT) currently requires that permitted trucks slow to five miles per hour and straddle the roadway centerline when crossing bridges. However, this practice has other negative consequences, such as the potential for crashes, impracticality for bridges with high traffic volumes, and higher fuel consumption. The main objective of this work was to provide information and guidance on the allowable speeds for permitted vehicles and loads on bridges. A field test program was implemented on five bridges (i.e., two steel girder bridges, two pre-stressed concrete girder bridges, and one concrete slab bridge) to investigate the dynamic response of bridges due to vehicle loadings. The important factors taken into account during the field tests included vehicle speed, entrance conditions, vehicle characteristics (i.e., empty dump truck, full dump truck, and semi-truck), and bridge geometric characteristics (i.e., long span and short span). Three entrance conditions were used: as-is, plus Level 1 and Level 2, which simulated rough entrance conditions with a fabricated ramp placed 10 feet from the joint between the bridge end and approach slab, and directly next to the joint, respectively. The researchers analyzed the field data to derive the dynamic impact factors (DIFs) for all gauges installed on each bridge under the different loading scenarios.
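The summary above does not give the exact DIF formula used in the report; a common definition, the peak dynamic response divided by the peak quasi-static (crawl-speed) response at the same gauge, can be sketched as follows. The example readings are invented for illustration.

```python
def dynamic_impact_factor(dynamic_response, static_response):
    """DIF = peak gauge response under a truck at speed divided by the
    peak response of the same truck crossing at crawl speed.
    Values above 1.0 quantify the dynamic amplification of live load."""
    peak_dyn = max(abs(v) for v in dynamic_response)
    peak_stat = max(abs(v) for v in static_response)
    return peak_dyn / peak_stat

# e.g. strain readings (microstrain) from one gauge for one crossing:
dif = dynamic_impact_factor([0.0, 118.0, 92.0], [0.0, 100.0, 85.0])  # 1.18
```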
Abstract:
Continuous delivery (CD) is a software engineering approach that focuses on creating a short delivery cycle by automating parts of the deployment pipeline, which includes the build, deploy, test, and release processes. CD is based on the principle that, during development, it should always be possible to automatically generate a release from the source code in its current state. One of CD's many advantages is that continuous releases provide a quick feedback loop, leading to faster and more efficient implementation of new functions while errors are fixed at the same time. Although CD has many advantages, there are also several challenges a maintenance management project must handle in the transition to CD. These challenges may differ depending on the maturity level of the maintenance management project and on the project's strengths and weaknesses. Our research question was: "What challenges can a maintenance management project face in the transition to Continuous delivery?" The purpose of this study is to describe Continuous delivery and the challenges a maintenance management project may face during such a transition. A descriptive case study was carried out, with interviews and documents as data collection methods. Based on the collected data, a situation analysis was created in the shape of a process model representing the maintenance management project's release process. The process model was used as the basis for a SWOT analysis and for an analysis using Rehn et al.'s maturity model. From these analyses we identified challenges a maintenance management project may face in the transition to CD. The challenges concern customers' and management's attitudes towards a transition to CD, but the biggest challenge is the automation of the deployment pipeline steps.
Abstract:
Software testing is performed to see whether a system fulfils specified requirements and to find defects. It is an important part of systems development and includes, among other things, regression testing. Regression tests are run to ensure that a change in the system does not negatively affect other parts of the system. Document management systems often handle sensitive organisational data, which places high demands on security. Permissions in the system must therefore be tested thoroughly to ensure that data does not end up in the wrong hands. Document management systems make it possible for several organisations to pool their resources and knowledge to reach common goals. Shared work processes are supported by workflows that contain a number of different states, and different permissions apply in each state. When a permission is changed, regression tests are required to ensure that the change has not affected other permissions. This study was carried out as a qualitative case study whose purpose was to describe the challenges of regression testing roles and permissions in document workflows in document management systems. Interviews and an observation revealed that a major challenge with these tests is that the workflow states follow a predetermined sequence. Completing this sequence involves an enormous number of permissions that must be tested, which makes the testing effort very extensive in terms of, among other things, time and cost. The study targeted the document management system ProjectWise, which is maintained by Trafikverket (the Swedish Transport Administration). A decision basis was produced for a technical solution for automated regression testing of roles and permissions in workflows for ProjectWise. Based on a requirements-gathering exercise, the decision basis involved Team Foundation Server (TFS), Coded UI, and a keyword-driven test method as the technical solution. Finally, the differences this technical solution could make compared with manual testing were examined.
Based on literature, a document study, and first-hand experience, test automation was shown to make a difference in a number of identified problem areas, including time and cost.
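A keyword-driven test method of the kind named in the decision basis can be illustrated with a minimal runner: test steps are data (a keyword plus arguments) dispatched to registered actions, so new permission checks are added as table rows rather than code. The keywords, the toy permission rule, and all names below are illustrative assumptions; this is not ProjectWise, TFS, or Coded UI code.

```python
# Registry mapping keyword names to action functions.
KEYWORDS = {}

def keyword(name):
    """Decorator registering a function as a test keyword."""
    def register(fn):
        KEYWORDS[name] = fn
        return fn
    return register

@keyword("set_state")
def set_state(doc, state):
    doc["state"] = state

@keyword("check_permission")
def check_permission(doc, role, allowed):
    # Toy rule: only an "editor" may act while a document is in "draft";
    # in any other state every role is allowed.
    actual = (role == "editor") if doc["state"] == "draft" else True
    assert actual == (allowed == "yes"), \
        f"role {role!r} in state {doc['state']!r}: expected allowed={allowed}"

def run(steps):
    """Execute a list of (keyword, *args) steps against a fresh document."""
    doc = {"state": "draft"}
    for kw, *args in steps:
        KEYWORDS[kw](doc, *args)
    return doc
```

A regression suite for a workflow then becomes a plain list of steps, e.g. `run([("set_state", "review"), ("check_permission", "viewer", "yes")])`, which non-programmers can review and extend.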
Abstract:
Structuring integrated social-ecological systems (SES) research remains a core challenge for achieving sustainability. Numerous concepts and frameworks exist, but there is a lack of mutual learning and orientation of knowledge between them. We focus on two approaches in particular: the ecosystem services concept and Elinor Ostrom’s diagnostic SES framework. We analyze the strengths and weaknesses of each and discuss their potential for mutual learning. We use knowledge types in sustainability research as a boundary object to compare the contributions of each approach. Sustainability research is conceptualized as a multi-step knowledge generation process that includes system, target, and transformative knowledge. A case study of the Southern California spiny lobster fishery is used to comparatively demonstrate how each approach contributes a different lens and knowledge when applied to the same case. We draw on this case example in our discussion to highlight potential interlinkages and areas for mutual learning. We intend for this analysis to facilitate a broader discussion that can further integrate SES research across its diverse communities.
Abstract:
Many existing encrypted Internet protocols leak information through packet sizes and timing. Though seemingly innocuous, prior work has shown that such leakage can be used to recover part or all of the plaintext being encrypted. The prevalence of encrypted protocols as the underpinning of such critical services as e-commerce, remote login, and anonymity networks, together with the increasing feasibility of attacks on these services, represents a considerable risk to communications security. Existing mechanisms for preventing traffic analysis focus on re-routing and padding. These prevention techniques have considerable resource and overhead requirements; furthermore, padding is easily detectable and, in some cases, can introduce its own vulnerabilities. To address these shortcomings, we propose embedding real traffic in synthetically generated encrypted cover traffic. Novel to our approach is our use of realistic network protocol behavior models to generate cover traffic. The observable traffic we generate also has the benefit of being indistinguishable from other real encrypted traffic, further thwarting an adversary's ability to target attacks. In this dissertation, we introduce the design of a proxy system called TrafficMimic that implements realistic cover-traffic tunneling and can be used alone or integrated with the Tor anonymity system. We describe the cover traffic generation process, including the subtleties of implementing a secure traffic generator. We show that TrafficMimic cover traffic can fool a complex protocol classification attack with 91% of the accuracy of real traffic. TrafficMimic cover traffic is also not detected by a binary classification attack specifically designed to detect TrafficMimic. We evaluate the performance of tunneling with independent cover traffic models and find that they are comparable to, and in some cases more efficient than, generic constant-rate defenses.
We then use simulation and analytic modeling to understand the performance of cover traffic tunneling more deeply. We find that we can take measurements from real or simulated traffic with no tunneling and use them to estimate parameters for an accurate analytic model of the performance impact of cover traffic tunneling. Once validated, we use this model to better understand how delay, bandwidth, tunnel slowdown, and stability affect cover traffic tunneling. Finally, we take the insights from our simulation study and develop several biasing techniques that we can use to match the cover traffic to the real traffic while simultaneously bounding external information leakage. We study these bias methods using simulation and evaluate their security using a Bayesian inference attack. We find that we can safely improve performance with biasing while preventing both traffic analysis and defense detection attacks. We then apply these biasing methods to the real TrafficMimic implementation and evaluate it on the Internet. We find that biasing can provide 3-5x improvement in bandwidth for bulk transfers and 2.5-9.5x speedup for Web browsing over tunneling without biasing.
Abstract:
Flow controllers, usually implemented in Supervisory Control and Data Acquisition (SCADA) systems, are very important in the automatic control of irrigation canal systems. To ensure that flow controllers are reliable across the entire operating range (free or submerged flow and the transitions between them), experimental results were compared with different flow-measurement methods for gates and/or weirs. The test program was conducted in the laboratory flume and in the automatic canal of the University of Évora. Tests were carried out on vertical sluice gates and on Waterways Experiment Station (WES) type weirs, controlled or not by vertical sluice gates; in both cases free and submerged flow conditions were analysed. The results show that: a) for the sluice gates, the method of Rajaratnam and Subramanya (1967a) leads to good results, with a mean absolute percentage error (MAPE) < 1% for free flow and MAPE < 4% for submerged flow, and the transition between flows is correctly identified by this method; b) for the uncontrolled weirs, good results were obtained for free flow with the USACE (1987) method, with MAPE < 2%, and for submerged flow with the Alves and Martins (2011) method, with MAPE < 5%, and the transition between flows can be adequately defined by the experimental curve of Grace (1963); c) for the gate-controlled weirs, good results were achieved for free flow with the small-orifice equation, with MAPE < 1.5%, and for submerged flow with the fully submerged orifice equation, with MAPE < 1.6%; in both cases calibration of the discharge coefficient was needed, and the transition between flows was adequately handled by the Grace (1963) method.
Based on the obtained results, it was possible to define a generalized flow algorithm for gates and/or weirs that allows flow determination for free and submerged flow conditions including the transition between flows.
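A generalized algorithm of this kind, for a single sluice gate, can be sketched as below. The discharge coefficients and the simplified free/submerged switch are illustrative assumptions; the thesis selects the regime via the Grace (1963) experimental curve and uses calibrated coefficients.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def gate_discharge(b, a, h1, h2, cd_free=0.61, cd_sub=0.61):
    """Discharge (m^3/s) through a vertical sluice gate of width b (m)
    and opening a (m), with upstream depth h1 and downstream depth h2 (m).

    Free flow uses the orifice equation driven by the upstream head;
    submerged flow uses the head difference across the gate.  The
    regime switch here is a simplified criterion (the jump is taken as
    drowned once the tailwater exceeds the contracted jet depth)."""
    contracted = 0.61 * a                 # approximate jet depth Cc * a
    if h2 <= contracted:                  # free-flow regime
        return cd_free * b * a * math.sqrt(2 * G * h1)
    return cd_sub * b * a * math.sqrt(2 * G * (h1 - h2))  # submerged
```

For the same upstream depth, raising the tailwater past the switch point reduces the driving head and hence the computed discharge, which mirrors the free-to-submerged transition the thesis characterises experimentally.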
Abstract:
Law 34/2006 and its implementing regulation, approved by Royal Decree 775/2011, introduced an essential reform of access to the legal profession in Spain. Whereas a law degree (Licenciatura) had traditionally been sufficient to practise, postgraduate study and external internships became required to obtain professional qualification, ultimately assessed through a unified state examination. After obtaining ANECA verification in the modality of "training delivered jointly by universities and schools of legal practice", the Master's Degree in Law Practice (Máster Universitario en Abogacía) of the University of Oviedo, with the participation of the Oviedo and Gijón bar associations, was launched in 2012/2013. Now that the first cohort has graduated, it is appropriate to reflect on the achievement of its objectives and to propose whatever changes are needed to secure notable improvements in the future; these proposals may be transferable to other postgraduate programmes of this type. That is the aim of the present work, which, drawing on the author's experience as a member of the Academic Committee and secretary of the Quality Committee of the aforementioned Master's programme, analyses the experience, covering aspects ranging from teaching planning and coordination to the assessment of competences.