935 results for System test complexity


Relevance:

30.00%

Publisher:

Abstract:

This work deals with the numerical simulation of the air stripping process used in the pre-treatment of groundwater for human consumption. The steady-state model has an exponential solution that is used, together with the Tau Method, to obtain a spectral approximation to the solution of the system of partial differential equations associated with the transient-state model.
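The abstract does not give the model equations, so purely as an illustration of the technique it names, here is a minimal sketch of the Lanczos Tau Method applied to a toy problem, u'(x) + u(x) = 0 with u(0) = 1 on [0, 1], whose exact solution is the exponential e^(-x), echoing the exponential steady-state solution mentioned above. The toy equation and the perturbation term are assumptions, not the paper's model.

```python
# Minimal Tau Method sketch: solve u'(x) + u(x) = 0, u(0) = 1 on [0, 1]
# with a degree-n polynomial u_n plus a tau * T_n(x) residual term.
# (Strictly one would use the shifted Chebyshev T_n(2x - 1) on [0, 1];
# plain T_n keeps the sketch short.)
import numpy as np
from numpy.polynomial import chebyshev

n = 8                                        # polynomial degree (illustrative)
t = chebyshev.cheb2poly(np.eye(n + 1)[n])    # power-basis coeffs of T_n(x)

# Unknowns: a_0..a_n (coeffs of u_n) and tau  ->  n + 2 unknowns.
# Equations: match powers x^0..x^n of u_n' + u_n - tau*T_n = 0, plus u_n(0) = 1.
A = np.zeros((n + 2, n + 2))
b = np.zeros(n + 2)
for j in range(n + 1):
    A[j, j] = 1.0                            # a_j from the u_n term
    if j + 1 <= n:
        A[j, j + 1] = j + 1                  # (j+1) a_{j+1} from the u_n' term
    A[j, n + 1] = -t[j]                      # -tau * (coeff of x^j in T_n)
A[n + 1, 0] = 1.0                            # boundary condition u_n(0) = a_0 = 1
b[n + 1] = 1.0

coeffs = np.linalg.solve(A, b)[: n + 1]      # drop tau, keep a_0..a_n
x = np.linspace(0.0, 1.0, 5)
print(np.polyval(coeffs[::-1], x))           # compare with np.exp(-x)
```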

Relevance:

30.00%

Publisher:

Abstract:

This article discusses the development of an Intelligent Distributed Environmental Decision Support System, built upon the association of a Multi-agent Belief Revision System with a Geographical Information System (GIS). The inherently multidisciplinary nature of the expertise involved in environmental management, the need to define clear policies that allow the synthesis of divergent perspectives and their systematic application, and the reduction of costs and time that results from this integration are the main reasons motivating this project. The paper is organised in two parts: in the first part we present and discuss the Distributed Belief Revision Test-bed (DiBeRT) that we developed; in the second part we analyse its application to the environmental decision support domain, with special emphasis on the interface with a GIS.

Relevance:

30.00%

Publisher:

Abstract:

The wide use of antibiotics in aquaculture has led to the emergence of resistant microbial species. This should be avoided or minimized by controlling the amount of drug employed in fish farming. For this purpose, the present work proposes test-strip papers for the detection and semi-quantitative determination of organic drugs by visual comparison of color changes, in an analytical procedure similar to pH monitoring with universal pH paper. This is done by establishing suitable chemical changes upon the cellulose, giving the paper the ability to react with the organic drug and produce a color change. Quantitative data is also obtained by taking a picture and applying a suitable mathematical treatment to the color coordinates given by the HSL system used by Windows. As proof of concept, this approach was applied to oxytetracycline (OXY), one of the antibiotics frequently used in aquaculture. A bottom-up modification of the paper was established, starting with the reaction of the glucose moieties on the paper with 3-triethoxysilylpropylamine (APTES). The amine layer so formed allowed binding of a metal ion by coordination chemistry, and the metal ion in turn reacted with the drug to produce a colored compound. The most suitable metals for this modification were selected by bulk studies, and the several stages of the paper modification were optimized to produce an intense color change with the concentration of the drug. The paper strips were applied to the analysis of spiked environmental water, allowing quantitative determination of OXY at concentrations as low as 30 ng/mL. Overall, this work provides a simple method to screen for and discriminate tetracycline drugs in aquaculture, and a promising tool for local, quick and cheap monitoring of drugs.
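As a rough illustration of the quantitative step, not the authors' actual processing pipeline, the sketch below converts an averaged RGB patch reading from a photographed strip to HSL-style coordinates with Python's standard colorsys module and interpolates a concentration from a hue calibration curve; the calibration points and patch values are invented for the example.

```python
# Illustrative sketch: strip color -> HSL coordinates -> concentration estimate.
import colorsys
import numpy as np

def rgb_to_hsl(r, g, b):
    """Convert 8-bit RGB to (hue, saturation, lightness), each in [0, 1]."""
    h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
    return h, s, l

# Hypothetical calibration: hue measured for spiked standards (ng/mL).
cal_conc = np.array([0.0, 30.0, 100.0, 300.0, 1000.0])
cal_hue = np.array([0.08, 0.10, 0.14, 0.20, 0.29])     # invented values

def estimate_concentration(rgb_patch):
    """Average an RGB patch and interpolate concentration from the hue curve."""
    r, g, b = np.asarray(rgb_patch, dtype=float).reshape(-1, 3).mean(axis=0)
    hue, _, _ = rgb_to_hsl(r, g, b)
    return float(np.interp(hue, cal_hue, cal_conc))

print(estimate_concentration([[120, 96, 60], [118, 98, 62]]))
```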

Relevance:

30.00%

Publisher:

Abstract:

Oceans - San Diego, 2013

Relevance:

30.00%

Publisher:

Abstract:

This work presents an automatic calibration method for a vision-based external underwater ground-truth positioning system. Such systems are a relevant tool for benchmarking and assessing the quality of research in underwater robotics applications. In suitable environments, such as test tanks or clear water conditions, a stereo vision system can provide accurate positioning with low cost and flexible operation. In this work we present a two-step extrinsic camera parameter calibration procedure that reduces the setup time and provides accurate results. The proposed method uses a planar homography decomposition to determine the relative camera poses, and the vanishing points of lines detected in the image to obtain the global pose of the stereo rig in the reference frame. The method was applied to our external vision-based ground-truth system at the INESC TEC/Robotics test tank. Results are presented in comparison with a precise calibration performed using points obtained from an accurate 3D LIDAR model of the environment.
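The paper's own two-step procedure is not reproduced here, but as a hedged sketch of the homography-decomposition idea it mentions: given point correspondences of a planar target seen by two cameras and an intrinsic matrix K, OpenCV can recover candidate relative poses. The function below and the 3x3 intrinsics are placeholders, not the authors' calibration code.

```python
# Sketch: recover candidate relative camera poses from a planar target.
import numpy as np
import cv2

K = np.array([[800.0, 0.0, 320.0],          # placeholder intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

def relative_pose_candidates(pts_cam1, pts_cam2):
    """Homography from plane correspondences, decomposed into (R, t, n) sets."""
    H, _ = cv2.findHomography(pts_cam1, pts_cam2, cv2.RANSAC)
    num, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
    # Up to four mathematically valid solutions; physically impossible ones
    # (e.g., points behind a camera) must be pruned with extra constraints.
    return list(zip(rotations, translations, normals))
```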

Relevance:

30.00%

Publisher:

Abstract:

The liberalization of electricity markets and the growing integration of distributed energy resources into distribution networks, namely distributed generation units, load control through demand response programs, storage systems and electric vehicles, have brought an evolution in the operation and management paradigm of power systems. This new operation paradigm requires the development of new management and control methodologies that allow the efficient and sustainable integration of all the new technologies. The main contribution of this work is the development of methodologies for energy resource management in the context of smart grids, covering three distinct time horizons (24 hours, 1 hour and 5 minutes). The methodologies take into account previous schedules as well as updated forecasts in order to improve overall system performance and consequently increase the profitability of aggregator agents. The proposed methodologies were integrated into a simulation tool that will support the decision making of an aggregator entity known as a virtual power player. Three distinct management algorithms are proposed, namely for the second (1 hour) and third (5 minutes) phases of the management tool, differing in the influence that the preceding and following periods have on the period being scheduled. Another relevant aspect presented in this document is the testing and validation of the proposed models on a commercial simulation platform. Beyond the proposed methodologies, this application made it possible to validate the models of the equipment considered, namely at the level of the distribution networks and the distributed energy resources. Three case studies are presented in this dissertation, each with different scenarios of future operating conditions. These case studies are important to verify the feasibility of implementing the proposed methodologies and algorithms. Additionally, the proposed methodologies are compared in terms of the results obtained, the management complexity in the simulation environment for the different phases of the proposed tool, and the benefits and drawbacks of using the tool.
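The dissertation's algorithms are not detailed in the abstract; purely as an assumed illustration of the three-horizon idea (day-ahead, hour-ahead, 5-minute), here is a skeleton of a rolling-horizon re-dispatch loop in which each shorter horizon starts from the previous schedule and corrects it with updated forecasts. All names and the blending rule are hypothetical.

```python
# Hypothetical rolling-horizon re-dispatch skeleton (not the thesis algorithms).
from dataclasses import dataclass

@dataclass
class Schedule:
    dispatch: dict        # resource name -> scheduled power (kW)

def day_ahead(forecast_24h) -> Schedule:
    """Phase 1: schedule all resources for the next 24 hours."""
    return Schedule(dispatch=dict(forecast_24h))

def reschedule(previous: Schedule, updated_forecast) -> Schedule:
    """Phases 2/3: correct the previous schedule with fresher forecasts."""
    corrected = {}
    for resource, planned in previous.dispatch.items():
        fresh = updated_forecast.get(resource, planned)
        corrected[resource] = 0.5 * planned + 0.5 * fresh   # toy blending rule
    return Schedule(dispatch=corrected)

plan = day_ahead({"pv": 50.0, "ev_fleet": -20.0, "storage": 10.0})
plan = reschedule(plan, {"pv": 42.0})                    # hour-ahead correction
plan = reschedule(plan, {"pv": 40.0, "storage": 12.0})   # 5-minute correction
print(plan.dispatch)
```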

Relevance:

30.00%

Publisher:

Abstract:

Thesis presented as a partial requirement for obtaining the degree of Doctor in Information Management

Relevance:

30.00%

Publisher:

Abstract:

In the last few years we have observed an exponential increase in information systems, and parking information is one more example. Reliable and up-to-date information on parking slot availability is very important for the goal of traffic reduction, and parking slot prediction is a new topic that has already started to be applied; San Francisco in the USA and Santander in Spain are examples of projects carried out to obtain this kind of information. The aim of this thesis is the study and evaluation of methodologies for parking slot prediction and their integration in a web application, where all kinds of users can see the current parking status as well as future status according to the model predictions. The source of the data is ancillary in this work, but it still needs to be understood in order to understand parking behaviour. Many modelling techniques are used for this purpose, such as time series analysis, decision trees, neural networks and clustering. In this work the author explains the techniques that worked best, analyzes the results and points out the advantages and disadvantages of each one. The model learns the periodic and seasonal patterns of the parking status, and with this knowledge it can predict future status values for a given date. The data come from the Smart Park Ontinyent project and consist of parking occupancy status together with timestamps, stored in a database. After data acquisition, data analysis and pre-processing were needed before the models could be implemented. The first test used a boosting ensemble classifier over a set of decision trees, created with the C5.0 algorithm from a set of training samples, to assign a prediction value to each object. In addition to the predictions, this work reports measurement errors that indicate how reliable the predictions are. The second test used the function-fitting seasonal exponential smoothing TBATS model. The last test tried a model that combines the previous two, to see the result of the combination (see the sketch below). The results were quite good for all of them, with average errors of 6.2, 6.6 and 5.4 vacancies in the predictions for the three models respectively; for a car park of 47 places this means roughly a 10% average error in parking slot predictions. The results could be even better with more data available. In order to make this kind of information visible and reachable by anyone with an internet-connected device, a web application was built. Besides displaying the data, the application also offers different functions that make searching for parking easier. Apart from parking prediction, the new functions were:
- Park distances from the user's location: the distances from the user's current location to the different car parks in the city.
- Geocoding: matching a literal description or an address to a concrete location.
- Geolocation: positioning the user.
- Parking list panel: not a service or a function as such, just a better visualisation and handling of the information.
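As a hedged sketch of the combination idea, not the thesis code: below, a gradient-boosted tree ensemble and a seasonal exponential smoothing model from widely available Python libraries stand in for the C5.0 boosting and TBATS models, and their forecasts are averaged to predict vacancies. The synthetic data, feature choices, and plain averaging rule are illustrative assumptions.

```python
# Sketch: average a tree-ensemble forecast and a seasonal-smoothing forecast.
# sklearn's gradient boosting and statsmodels' Holt-Winters stand in for the
# C5.0 boosting and TBATS models used in the thesis.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(0)
hours = np.arange(24 * 28)                          # four weeks, hourly
vacancies = (20 + 15 * np.sin(2 * np.pi * hours / 24)
             + rng.normal(0, 2, hours.size))        # toy daily pattern

train, test = vacancies[:-24], vacancies[-24:]

# Model 1: trees on calendar features (hour of day, day of week).
X = np.column_stack([hours % 24, (hours // 24) % 7])
trees = GradientBoostingRegressor().fit(X[:-24], train)
pred_trees = trees.predict(X[-24:])

# Model 2: seasonal exponential smoothing with a 24-hour cycle.
smooth = ExponentialSmoothing(train, seasonal="add", seasonal_periods=24).fit()
pred_smooth = smooth.forecast(24)

combined = 0.5 * pred_trees + 0.5 * pred_smooth     # toy combination rule
print(np.mean(np.abs(combined - test)))             # mean absolute error
```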

Relevance:

30.00%

Publisher:

Abstract:

The catastrophic disruption of the USA financial system in the wake of the financial crisis prompted the Federal Reserve to launch a Quantitative Easing (QE) programme in late 2008. In line with Pesaran and Smith (2014), I use a policy effectiveness test to assess whether this massive asset purchase programme was effective in stimulating economic activity in the USA. Specifically, I employ an Autoregressive Distributed Lag (ARDL) model to obtain a counterfactual for the USA real GDP growth rate. Using data from 1983Q1 to 2009Q4, the results show that the beneficial effects of QE appear to be weak and rather short-lived. The null hypothesis of policy ineffectiveness is not rejected, which suggests that QE did not have a meaningful impact on output growth.
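A hedged sketch of the counterfactual idea, not the author's exact specification: fit an ARDL-style regression of GDP growth on its own lags and lags of a control variable over the pre-intervention sample, then iterate the fitted equation forward as a no-policy projection and compare it with realized growth. Variable names and lag orders are illustrative.

```python
# Sketch of an ARDL-style counterfactual (illustrative specification only).
import numpy as np
import statsmodels.api as sm

def lagmat(x, lags):
    """Columns x[t-1], ..., x[t-lags], aligned by dropping the first rows."""
    return np.column_stack([x[lags - k:-k] for k in range(1, lags + 1)])

def counterfactual(y, z, split, lags=2):
    """Fit growth y on lags of itself and of control z before `split`,
    then iterate the fitted equation forward as a no-policy projection."""
    Y = y[lags:]
    X = sm.add_constant(np.hstack([lagmat(y, lags), lagmat(z, lags)]))
    fit = sm.OLS(Y[: split - lags], X[: split - lags]).fit()
    path = list(y[:split])
    for t in range(split, len(y)):
        x_t = np.r_[1.0, path[t - 1:t - lags - 1:-1], z[t - 1:t - lags - 1:-1]]
        path.append(float(fit.params @ x_t))
    return np.array(path[split:])   # compare against realized y[split:]
```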

Relevance:

30.00%

Publisher:

Abstract:

Neurological disorders are a major concern in modern societies, with increasing prevalence mainly related to higher life expectancy. Most of the currently available therapeutic options can only control and ameliorate the patients' symptoms, often becoming refractory over time. Therapeutic breakthroughs and advances have been hampered by the lack of accurate central nervous system (CNS) models. The development of these models allows the study of disease onset/progression mechanisms and the preclinical evaluation of novel therapeutics. This has traditionally relied on genetically engineered animal models that often diverge considerably from the human phenotype (developmentally, anatomically and physiologically) and on 2D in vitro cell models, which fail to recapitulate the characteristics of the target tissue (cell-cell and cell-matrix interactions, cell polarity). The in vitro recapitulation of CNS phenotypic and functional features requires advanced culture strategies that mimic the in vivo structural and molecular complexity. Models based on the differentiation of human neural stem cells (hNSC) in 3D cultures have great potential as complementary tools in preclinical research, bridging the gap between human clinical studies and animal models. This thesis aimed at the development of novel human 3D in vitro CNS models by integrating agitation-based culture systems and a wide array of characterization tools. Neural differentiation of hNSC as 3D neurospheres was explored in Chapter 2. Here, it was demonstrated that human midbrain-derived neural progenitor cells of fetal origin (hmNPC) can generate complex tissue-like structures containing functional dopaminergic neurons, as well as astrocytes and oligodendrocytes. Chapter 3 focused on the development of cellular characterization assays for cell aggregates based on light-sheet fluorescence imaging systems, which resulted in increased spatial resolution both for fixed samples and live imaging. The applicability of the developed human 3D cell model to preclinical research was explored in Chapter 4, evaluating the potential of a viral vector candidate for gene therapy. The efficacy and safety of helper-dependent CAV-2 (hd-CAV-2) for gene delivery in human neurons were evaluated, demonstrating increased neuronal tropism, efficient transgene expression and minimal toxicity. The potential of human 3D in vitro CNS models to mimic brain functions was further addressed in Chapter 5. Exploring the use of 13C-labeled substrates and Nuclear Magnetic Resonance (NMR) spectroscopy tools, neural metabolic signatures were evaluated, showing lineage-specific metabolic specialization and the establishment of neuron-astrocytic shuttles upon differentiation. Chapter 6 focused on transferring the knowledge and strategies described in the previous chapters to the implementation of a scalable and robust process for the 3D differentiation of hNSC derived from human induced pluripotent stem cells (hiPSC). Here, software-controlled perfusion stirred-tank bioreactors were used as the technological system to sustain cell aggregation and differentiation. The work developed in this thesis provides practical and versatile new in vitro approaches to model the human brain. Furthermore, the culture strategies described herein can be extended to other sources of neural phenotypes, including patient-derived hiPSC. The combination of this 3D culture strategy with the implemented characterization methods represents a powerful complementary tool applicable in drug discovery, toxicology and disease modeling.

Relevance:

30.00%

Publisher:

Abstract:

Wireless mesh networks present an attractive communication solution for various research and industrial projects. However, in many cases appropriate preliminary calculations that allow the network behavior to be predicted have to be made before the actual deployment. For such purposes, network simulation environments emulating real network operation are often used. In this paper, the behavior of a real wireless mesh network (based on the 802.11s amendment) is compared with that of a simulated one. The main objective of this work is to measure the performance parameters of a real 802.11s wireless mesh network (average UDP throughput and average one-way delay) and to compare the results with the characteristics of a wireless mesh network created with the NS-3 network simulation tool. The results derived from the simulation model and the real-world test-bed show that the behavior of both networks is similar. This confirms that the NS-3 simulation model is accurate and can be used in further research studies.
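As a hedged illustration of the two reported metrics, not the paper's measurement tooling: the sketch below computes average UDP throughput and average one-way delay from per-packet send/receive records. The record format is an assumption, and real one-way delay measurement additionally requires synchronized clocks on both hosts.

```python
# Illustrative computation of the two metrics from per-packet logs.
# Each record: (send_time_s, recv_time_s, payload_bytes); format is assumed.
def udp_metrics(records):
    if not records:
        return 0.0, 0.0
    total_bytes = sum(size for _, _, size in records)
    first_send = min(t for t, _, _ in records)
    last_recv = max(t for _, t, _ in records)
    throughput_bps = 8 * total_bytes / (last_recv - first_send)
    avg_delay_s = sum(rx - tx for tx, rx, _ in records) / len(records)
    return throughput_bps, avg_delay_s

# Example: three 1470-byte datagrams (clocks assumed synchronized).
packets = [(0.000, 0.004, 1470), (0.010, 0.015, 1470), (0.020, 0.023, 1470)]
print(udp_metrics(packets))   # (bits per second, seconds)
```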

Relevance:

30.00%

Publisher:

Abstract:

Unraveling the effects of selection vs. drift on the evolution of quantitative traits is commonly achieved by one of two methods: either population differentiation estimates for genetic markers and quantitative traits are contrasted (the Qst-Fst contrast), or multivariate methods are used to study the covariance between sets of traits. In particular, many studies have focused on the genetic variance-covariance matrix (the G matrix). However, both drift and selection can cause changes in G. To understand their joint effects, we recently combined the two methods into a single test (see the accompanying article by Martin et al.), which we apply here to a network of 16 natural populations of the freshwater snail Galba truncatula. Using this new neutrality test, extended to hierarchical population structures, we studied the multivariate equivalent of the Qst-Fst contrast for several life-history traits of G. truncatula. We found strong evidence of selection acting on multivariate phenotypes. Selection was homogeneous among populations within each habitat and heterogeneous between habitats. We found that the G matrices were relatively stable within each habitat, with proportionality between the among-population (D) and within-population (G) covariance matrices. The effect of habitat heterogeneity is to break this proportionality because of selection towards habitat-dependent optima. Individual-based simulations mimicking our empirical system confirmed that these patterns are expected under the inferred selective regime. We show that homogenizing selection can mimic some effects of drift on the G matrix (G and D almost proportional), but that incorporating information from molecular markers (multivariate Qst-Fst) allows the two effects to be disentangled.
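For readers unfamiliar with the univariate contrast that the multivariate test generalizes, here is a minimal sketch of the standard Qst estimator from between- and within-population additive variance components, compared against a marker-based Fst. The variance components and Fst value are invented inputs; the paper's actual test is the multivariate, hierarchical extension.

```python
# Minimal sketch of the univariate Qst-Fst contrast (illustrative inputs).
def qst(var_between, var_within_additive):
    """Standard Qst for an outbred diploid:
    Qst = Vb / (Vb + 2 * Va_within)."""
    return var_between / (var_between + 2.0 * var_within_additive)

# Invented variance components for one life-history trait:
q = qst(var_between=0.8, var_within_additive=1.5)
fst_markers = 0.12          # invented neutral-marker differentiation

# Qst >> Fst suggests diversifying selection; Qst << Fst suggests
# homogenizing selection; Qst ~ Fst is consistent with drift alone.
print(f"Qst = {q:.3f} vs Fst = {fst_markers:.3f}")
```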

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: To date, there is no quality assurance program that correlates patient outcome with the perfusion service provided during cardiopulmonary bypass (CPB). A score was devised incorporating objective parameters likely to influence patient outcome. The purpose was to create a new method for evaluating the quality of care the perfusionist provides during CPB procedures and to determine whether it predicts patient morbidity and mortality. METHODS: We analysed 295 consecutive elective patients. We chose 10 parameters: fluid balance, blood transfused, Hct, ACT, PaO2, PaCO2, pH, BE, potassium and CPB time. Distribution analysis was performed using the Shapiro-Wilk test. These parameters made up the PerfSCORE, and we looked for correlations with mortality rate, length of ICU stay and duration of mechanical ventilation. Univariate analysis (UA) using linear regression was performed for each parameter, with statistical significance set at p < 0.05. Multivariate analysis (MA) was performed with the same parameters. RESULTS: The mean age was 63.8 +/- 12.6 years, and 70% of the patients were male. There were 180 CABG, 88 valve, and 27 combined CABG/valve procedures. Mean PerfSCORE was 6.6 +/- 2.4 (range 0-20), mortality 2.7% (8/295), CPB time 100 +/- 41 min (19-313), ICU stay 52 +/- 62 hrs (7-564) and mechanical ventilation 10.5 +/- 14.8 hrs (0-564). CPB time, fluid balance, PaO2, PerfSCORE and blood transfused were significantly correlated with mortality (UA, p < 0.05). CPB time, blood transfused and PaO2 were also predictors of mortality (MA, p < 0.01). Only pH was significantly correlated with ICU stay (UA). Ultrafiltration (UF) and CPB time were significantly correlated (UA, p < 0.01), while UF (p < 0.05) was the only parameter predicting the duration of mechanical ventilation (MA). CONCLUSIONS: CPB time, blood transfused and PaO2 are independent risk factors for mortality. Fluid balance, blood transfusion, PaO2, PerfSCORE and CPB time are independent parameters for predicting morbidity. The PerfSCORE is a quality-of-perfusion measure that objectively quantifies perfusion performance.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of the work was to realize a high-speed digital data transfer system for the RPC muon chambers in the CMS experiment at CERN's new LHC accelerator. This large-scale system took many years and many stages of prototyping to develop, and required the participation of tens of people. The system interfaces to the Front-end Boards (FEB) at the 200,000-channel detector and to the trigger and readout electronics in the control room of the experiment. The distance between the two is about 80 metres, and the speed required of the optical links was pushing the limits of available technology when the project was started. Here, as in many other aspects of the design, it was assumed that the features of readily available commercial components would improve in the course of the design work, just as they did. By choosing a high speed it was possible to multiplex the data from some of the chambers onto the same fibres and so reduce the number of links needed. A further reduction was achieved by employing zero suppression and data compression, so that a total of only 660 optical links were needed. Another requirement, which conflicted somewhat with choosing the components as late as possible, was that the design needed to be radiation tolerant to an ionizing dose of 100 Gy and to have a moderate tolerance to Single Event Effects (SEEs). This required several radiation test campaigns, and eventually led to ASICs being chosen for some of the critical parts. The system was made as reconfigurable as possible. The reconfiguration needs to be done remotely, as the electronics is not accessible except during some short and rare service breaks once the accelerator starts running. Therefore reconfigurable logic is used extensively, and the firmware development for the FPGAs constituted a sizable part of the work. Some special techniques were needed there too, to achieve the required radiation tolerance. The system has been demonstrated to work in several laboratory and beam tests, and now we are waiting to see it in action when the LHC starts running in autumn 2008.
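As a hedged illustration of the zero-suppression idea mentioned above, not the actual CMS link firmware: the sketch below encodes a mostly-zero channel readout as a list of (channel, value) pairs and lets an all-zero readout be skipped entirely. The frame format is an invented stand-in.

```python
# Illustrative zero suppression for sparse detector readout (invented format).
def zero_suppress(channels):
    """Keep only non-zero channels as (index, value) pairs.
    An empty list means the whole readout can be skipped."""
    return [(i, v) for i, v in enumerate(channels) if v != 0]

def expand(pairs, n_channels):
    """Inverse operation: rebuild the full channel array."""
    full = [0] * n_channels
    for i, v in pairs:
        full[i] = v
    return full

readout = [0, 0, 0, 7, 0, 0, 3, 0]       # mostly quiet channels
packed = zero_suppress(readout)           # [(3, 7), (6, 3)]
assert expand(packed, len(readout)) == readout
```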

Relevance:

30.00%

Publisher:

Abstract:

The "Java Intelligent Tutoring System" (JITS) research project focused on designing, constructing, and determining the effectiveness of an Intelligent Tutoring System for beginner Java programming students at the postsecondary level. The participants in this research were students in the School of Applied Computing and Engineering Sciences at Sheridan College. This research involved consistently gathering input from students and instructors using JITS as it developed. The cyclic process involving designing, developing, testing, and refinement was used for the construction of JITS to ensure that it adequately meets the needs of students and instructors. The second objective in this dissertation determined the effectiveness of learning within this environment. The main findings indicate that JITS is a richly interactive ITS that engages students on Java programming problems. JITS is equipped with a sophisticated personalized feedback mechanism that models and supports each student in his/her learning style. The assessment component involved 2 main quantitative experiments to determine the effectiveness of JITS in terms of student performance. In both experiments it was determined that a statistically significant difference was achieved between the control group and the experimental group (i.e., JITS group). The main effect for Test (i.e., pre- and postiest), F( l , 35) == 119.43,p < .001, was qualified by a Test by Group interaction, F( l , 35) == 4.98,p < .05, and a Test by Time interaction, F( l , 35) == 43.82, p < .001. Similar findings were found for the second experiment; Test by Group interaction revealed F( 1 , 92) == 5.36, p < .025. In both experiments the JITS groups outperformed the corresponding control groups at posttest.