991 results for Tool path computing


Relevance: 20.00%

Publisher:

Abstract:

Illicit drug analyses usually focus on the identification and quantitation of questioned material to support the judicial process. In parallel, more and more laboratories develop physical and chemical profiling methods from a forensic intelligence perspective. The analysis of the large databases resulting from this approach not only yields tactical and operational intelligence, but may also contribute to a strategic overview of drug markets. In Western Switzerland, the chemical analysis of illicit drug seizures is centralised in a laboratory hosted by the University of Lausanne. For over 8 years, this laboratory has analysed 5,875 cocaine and 2,728 heroin specimens, coming from 1,138 and 614 seizures, respectively, made by police, border guards or customs. Chemical (major and minor alkaloids, purity, cutting agents, chemical class), physical (packaging and appearance) and circumstantial (criminal case number, mass of drug seized, date and place of seizure) information is collated in a dedicated database for each specimen. The study capitalises on this extended database and defines several indicators to characterise the structure of drug markets, to follow their evolution and to compare the cocaine and heroin markets. Relational, spatial, temporal and quantitative analyses of the data reveal the emergence and importance of distribution networks. They make it possible to evaluate the cross-jurisdictional character of drug trafficking, the observation time of drug batches, and the quantity of drugs entering the market every year. Results highlight the stable nature of drug markets over the years despite the very dynamic flows of distribution and consumption. This research illustrates how the systematic analysis of forensic data can elicit knowledge about criminal activities at a strategic level. In combination with information from other sources, such knowledge can help to devise intelligence-based preventive and repressive measures and to discuss the impact of countermeasures.
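A minimal sketch of how market indicators of this kind might be derived from a specimen database, assuming a hypothetical pandas table whose column names (drug, case_id, canton, seizure_date, mass_g, chemical_class) are illustrative and not taken from the original study:

```python
import pandas as pd

# Hypothetical specimen table; column names and values are illustrative only.
specimens = pd.DataFrame({
    "drug": ["cocaine", "cocaine", "heroin", "heroin"],
    "case_id": [101, 101, 202, 203],
    "canton": ["VD", "GE", "VD", "NE"],
    "seizure_date": pd.to_datetime(["2010-03-02", "2010-05-17",
                                    "2011-01-09", "2011-02-11"]),
    "mass_g": [120.0, 35.5, 48.0, 250.0],
    "chemical_class": ["A", "A", "B", "C"],
})

# Quantity of drug seized per year and per market (one strategic indicator).
yearly_mass = (specimens
               .groupby([specimens["seizure_date"].dt.year, "drug"])["mass_g"]
               .sum())

# Observation time of a chemical class (a proxy for a drug batch): time span
# between its first and last appearance in the database.
obs_time = (specimens.groupby("chemical_class")["seizure_date"]
            .agg(lambda d: d.max() - d.min()))

# Cross-jurisdictional character: number of distinct cantons per chemical class.
spread = specimens.groupby("chemical_class")["canton"].nunique()

print(yearly_mass, obs_time, spread, sep="\n\n")
```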

Relevance: 20.00%

Publisher:

Abstract:

This thesis addresses the problem of computing the minimal and maximal diameter of the Cayley graph of Coxeter groups. We first present the relevant parts of polytope theory and related Coxeter theory. After this, a method for computing the orthogonal projections of a polytope from R^d onto R^2 and R^3, d ≥ 3, is presented. This method is the Equality Set Projection (ESP) algorithm, which requires a constant number of linear programming problems per facet of the projection in the absence of degeneracy. The ESP algorithm also allows us to compute projected geometric diameters of high-dimensional polytopes. A representative set of projected polytopes is presented to illustrate the methods adopted in this thesis.
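Not the ESP algorithm itself, but a minimal sketch of the notion of a projected geometric diameter: project the vertices of a d-dimensional polytope onto a 2-D subspace and take the largest pairwise distance between hull vertices of the projection. The use of numpy/scipy and the vertex-based approach are assumptions; ESP works facet by facet with linear programs instead.

```python
import numpy as np
from itertools import combinations
from scipy.spatial import ConvexHull

def projected_diameter(vertices, directions):
    """Diameter of the orthogonal projection of a polytope (given by its
    vertices, one per row) onto the subspace spanned by `directions`."""
    # Orthonormalise the projection directions and project every vertex.
    Q, _ = np.linalg.qr(np.asarray(directions, dtype=float).T)
    projected = vertices @ Q                      # shape (n_vertices, k)
    pts = projected[ConvexHull(projected).vertices]
    # Geometric diameter = largest pairwise distance between hull vertices.
    return max(np.linalg.norm(p - q) for p, q in combinations(pts, 2))

# Example: a 4-dimensional hypercube projected onto a random plane in R^2.
rng = np.random.default_rng(0)
cube4 = np.array(np.meshgrid(*[[0, 1]] * 4)).reshape(4, -1).T   # 16 vertices
print(projected_diameter(cube4, rng.normal(size=(2, 4))))
```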

Relevance: 20.00%

Publisher:

Abstract:

COD discharges from processes have increased in line with rising brightness demands for mechanical pulp and papers. The share of lignin-like substances in COD discharges is on average 75%. In this thesis, a plant dynamic model was created and validated as a means to predict the COD load and the discharges out of a mill. The assays were carried out in an integrated paper mill producing mechanical printing papers. The objective of the plant dynamic modeling was to predict day averages of the COD load and of the discharges out of the mill. Online data, such as (1) the levels of the large storage towers for pulp and white water, (2) pulp dosages, (3) production rates and (4) internal white water flows and discharges, were used to create transients in the solids and white water balances, referred to as "plant dynamics". A conversion coefficient between TOC and COD was verified and used to convert the predicted TOC flows into the COD flows entering the waste water treatment plant. The COD load was modeled with an uncertainty similar to that of the reference TOC sampling. The water balance of the waste water treatment was validated against reference COD concentrations, and the difference between COD predictions and references was within the same deviation as for the TOC predictions. The modeled yield losses and TOC retention values in the pulping and bleaching processes, and the modeled fixing of colloidal TOC to solids between the pulping plant and the aeration basin of the waste water treatment plant, were similar to references presented in the literature. The valid water balances of the waste water treatment plant and the reduction model for lignin-like substances produced a valid prediction of COD discharges out of the mill. A 30% increase in the release of lignin-like substances in connection with production problems was observed in the pulping and bleaching processes, and the same increase was observed in the COD discharges out of the waste water treatment. In predicting the annual COD discharge, it was noticed that the reduction of lignin varies widely from year to year and from one mill to another. This made it difficult to compare the COD discharge parameters validated in the plant dynamic simulation with those of another mill producing mechanical printing papers. However, the trend of COD discharges increasing when moving from unbleached towards high-brightness TMP remained valid.
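A minimal sketch of the conversion and discharge step, assuming a constant TOC-to-COD conversion coefficient and a fixed fractional reduction of lignin-like COD in the treatment plant; the numerical values are placeholders, not the coefficients validated in the thesis.

```python
# Placeholder values, for illustration only.
K_TOC_TO_COD = 3.0      # kg COD per kg TOC (assumed conversion coefficient)
COD_REDUCTION = 0.45    # assumed fractional removal of lignin-like COD in treatment

def cod_discharge(toc_load_kg_per_day):
    """Predict the COD discharge out of the mill from a predicted TOC load."""
    cod_load = K_TOC_TO_COD * toc_load_kg_per_day   # load entering treatment
    return cod_load * (1.0 - COD_REDUCTION)         # discharge after treatment

# Day-average TOC loads to the waste water treatment plant (kg/day, made up).
daily_toc = [950, 1020, 1310, 990]
print([round(cod_discharge(t), 1) for t in daily_toc])
```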

Relevance: 20.00%

Publisher:

Abstract:

Laser scanning is becoming an increasingly popular method for measuring 3D objects in industrial design. Laser scanners produce a cloud of 3D points. For CAD software to be able to use such data, however, this point cloud needs to be turned into a vector format. A popular way to do this is to triangulate the assumed surface of the point cloud using alpha shapes. Alpha shapes start from the convex hull of the point cloud and gradually refine it towards the true surface of the object. It is often nontrivial to decide when to stop this refinement; one criterion is to stop when the homology of the object stops changing, which is captured by the persistent homology of the object. The goal of this thesis is to develop a way to compute the homology of a given point cloud as it is processed with alpha shapes, and to infer from it when the persistent homology has been reached. In practice, the computation of this characteristic of the target might be applied, for example, to power line tower span analysis.
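A minimal sketch of computing the persistence of a point cloud's homology across the alpha-shape filtration, using the third-party GUDHI library (an assumption; the thesis does not name this library). Long-lived intervals correspond to features that survive the refinement, short ones to noise.

```python
import numpy as np
import gudhi  # assumed third-party library, not used in the thesis itself

# A noisy circle: its persistent homology should show one long-lived 1-cycle.
rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
points = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(scale=0.05, size=(200, 2))

# Build the alpha complex (the filtration behind alpha shapes) and compute
# the persistence of its homology classes across all alpha values.
alpha_complex = gudhi.AlphaComplex(points=points)
simplex_tree = alpha_complex.create_simplex_tree()
diagram = simplex_tree.persistence()

# Long bars = persistent features; short bars = refinement noise.
for dim, (birth, death) in diagram:
    if death - birth > 0.1:
        print(f"dimension {dim}: born at {birth:.4f}, dies at {death:.4f}")
```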

Relevance: 20.00%

Publisher:

Abstract:

Coverage Path Planning (CPP) is the task of determining a path that passes over all points of an area or volume of interest while avoiding obstacles. This task is integral to many robotic applications, such as vacuum cleaning robots, painter robots, autonomous underwater vehicles creating image mosaics, demining robots, lawn mowers, automated harvesters, window cleaners and the inspection of complex structures, to name just a few. A considerable body of research has addressed the CPP problem; however, no updated survey of CPP reflecting recent advances in the field has been presented in the past ten years. In this paper, we present a review of the most successful CPP methods, focusing on the achievements made in the past decade, and we discuss reported field applications of the described methods. This work aims to serve as a starting point for researchers who are initiating their endeavors in CPP and to provide a comprehensive review of recent breakthroughs in the field, with links to the most interesting and successful works.
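A toy sketch of the CPP task on an occupancy grid: a boustrophedon (lawnmower) sweep that visits every free cell while skipping obstacle cells. This is illustration only; the grid layout and cell encoding are assumptions, and a real CPP method must also guarantee coverage of regions shadowed by obstacles.

```python
def boustrophedon_path(grid):
    """Return a visit order over the free cells of a 2-D occupancy grid.

    grid[r][c] == 1 marks an obstacle cell, 0 a free cell.  The sweep goes
    left-to-right on even rows and right-to-left on odd rows.
    """
    path = []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else range(len(row) - 1, -1, -1)
        path.extend((r, c) for c in cols if row[c] == 0)
    return path

# 4 x 5 area with a small obstacle block in the middle.
grid = [
    [0, 0, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]
print(boustrophedon_path(grid))
```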

Relevance: 20.00%

Publisher:

Abstract:

Objective. Significant advances have recently been made in the early diagnosis of Alzheimer's disease (AD) from EEG; however, choosing suitable measures remains a challenging task. Among other measures, frequency relative power and loss of complexity have been used with promising results. In the present study we investigate the early diagnosis of AD using synchrony measures and frequency relative power on EEG signals, examining the changes found in different frequency ranges. Approach. We first explore the use of a single feature for computing the classification rate, looking for the best frequency range. We then present a multiple-feature classification system that outperforms all previous results using a feature selection strategy. These two approaches are tested on two different databases, one containing MCI and healthy subjects (patient age: 71.9 ± 10.2; healthy subject age: 71.7 ± 8.3), and the other containing Mild AD and healthy subjects (patient age: 77.6 ± 10.0; healthy subject age: 69.4 ± 11.5). Main results. Using a single feature to compute classification rates, we achieve a performance of 78.33% for the MCI data set and 97.56% for Mild AD. Results are clearly improved with the multiple-feature classification, where a classification rate of 95% is obtained for the MCI data set using 11 features, and 100% for the Mild AD data set using 4 features. Significance. The new feature selection method described in this work may be a reliable tool that could help to design a realistic system that does not require prior knowledge of a patient's status. With that aim, we explore the standardization of features across the MCI and Mild AD data sets, with promising results.
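A minimal sketch of one of the single features mentioned, relative power in a frequency band, computed with scipy's Welch spectral estimator. The use of scipy, the band limits and the synthetic signal are assumptions; the study's own preprocessing, synchrony measures and feature selection are not reproduced here.

```python
import numpy as np
from scipy.signal import welch

def relative_power(signal, fs, band, total=(0.5, 45.0)):
    """Power in `band` divided by power in the `total` range."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 4 * fs))
    in_band = (freqs >= band[0]) & (freqs < band[1])
    in_total = (freqs >= total[0]) & (freqs < total[1])
    return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd[in_total], freqs[in_total])

# Synthetic 30 s EEG-like signal sampled at 256 Hz with a strong alpha component.
fs = 256
t = np.arange(0, 30, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)
print(relative_power(eeg, fs, band=(8.0, 12.0)))   # alpha-band relative power
```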

Relevance: 20.00%

Publisher:

Abstract:

Cognitive neuroscientists have devised various experimental setups which suggest that our body representation is surprisingly flexible: the brain can easily be tricked into the illusion that a rubber hand is your hand or that a manikin body is your body. These multisensory illusions also work well in immersive virtual reality (IVR). What is even more surprising is that such embodiment induces perceptual, attitudinal and behavioural changes that are concomitant with the displayed body type. Here we outline some recent findings in this field and suggest that they offer a powerful tool for neuroscience and psychology, and a new path for IVR.

Relevance: 20.00%

Publisher:

Abstract:

Controlling the quality variables (such as basis weight, moisture, etc.) is a vital part of making top-quality paper or board. In this thesis, an advanced data assimilation tool is applied to the quality control system (QCS) of a paper or board machine. The functionality of the QCS is based on quality observations measured with a traversing scanner following a zigzag path. The basic idea is the following: the measured quality variable has to be separated into its machine direction (MD) and cross direction (CD) variations, because the QCS controls MD and CD separately. Traditionally this is done simply by assuming one scan of the zigzag path to be the CD profile and its mean value to be one point of the MD trend. In this thesis, a more advanced method is introduced. The fundamental idea is to use the signal's frequency components to represent the variation in both CD and MD. To reach the frequency domain, the Fourier transform is utilized, and the Fourier components are then used as the state vector in a Kalman filter. The Kalman filter is a widely used data assimilation tool for combining noisy observations with a model; here the observations are the quality measurements and the model is given by the Fourier frequency components. By implementing the two-dimensional Fourier transform in the Kalman filter, we obtain an advanced tool for separating the CD and MD components of the total variation or, more generally, for data assimilation. A piece of a paper roll is analyzed and the tool is applied to model the dataset. The results show that the Kalman filter algorithm is able to reconstruct the main features of the dataset from a zigzag path. Although the results are obtained with a very short sample of a paper roll, the method shows great potential for later use as part of the quality control system.
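A compact sketch of the idea, assuming a small set of separable cosine basis functions in MD and CD whose coefficients form the state, and scalar scanner readings along a zigzag path as the observations. The state dimension, basis, noise levels and scan geometry below are illustrative, not those used in the thesis.

```python
import numpy as np

# Basis: products of low-order cosines in machine direction (md) and cross
# direction (cd); the coefficients of these terms form the Kalman state.
MD_MODES, CD_MODES = 3, 3
N_STATE = MD_MODES * CD_MODES

def basis_row(md, cd):
    """Observation row h so that field(md, cd) = h @ state."""
    return np.array([np.cos(np.pi * i * md) * np.cos(np.pi * j * cd)
                     for i in range(MD_MODES) for j in range(CD_MODES)])

def kalman_update(x, P, h, z, r):
    """One scalar-measurement Kalman update (static state model)."""
    y = z - h @ x                      # innovation
    s = h @ P @ h + r                  # innovation variance
    k = P @ h / s                      # Kalman gain
    return x + k * y, P - np.outer(k, h @ P)

# Synthetic "true" sheet and a zigzag scan over it (coordinates in [0, 1]).
rng = np.random.default_rng(0)
true_state = rng.normal(size=N_STATE)
x, P, r = np.zeros(N_STATE), np.eye(N_STATE) * 10.0, 0.01

for step in range(400):
    md = step / 400.0                          # sheet moves in MD
    cd = abs((step % 40) / 20.0 - 1.0)         # scanner traverses back and forth
    h = basis_row(md, cd)
    z = h @ true_state + rng.normal(scale=np.sqrt(r))
    x, P = kalman_update(x, P, h, z, r)

print(np.round(x - true_state, 3))             # estimation error, should be small
```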

Relevance: 20.00%

Publisher:

Abstract:

Nonnative brook trout Salvelinus fontinalis are abundant in Pine Creek and its main tributary, Bogard Spring Creek, California. These creeks historically provided the most spawning and rearing habitat for endemic Eagle Lake rainbow trout Oncorhynchus mykiss aquilarum. Three-pass electrofishing removal was conducted in 2007–2009 over the entire 2.8-km length of Bogard Spring Creek to determine whether brook trout removal was a feasible restoration tool and to document the life history characteristics of brook trout in a California meadow stream. After the first 2 years of removal, brook trout density and biomass were severely reduced from 15,803 to 1,192 fish/ha and from 277 to 31 kg/ha, respectively. Average removal efficiency was 92–97%, and most of the remaining fish were removed in the third year. The lack of a decrease in age-0 brook trout abundance between 2007 and 2008, after the removal of more than 4,000 adults in 2007, suggests compensatory reproduction by mature fish that survived and higher survival of age-0 fish. However, recruitment was greatly reduced after 2 years of removal and is likely to be even more depressed after the third year of removal, assuming that immigration of fish from outside the creek continues to be minimal. Brook trout condition, growth, and fecundity indicated a stunted population at the start of the study, but all three features increased significantly every year, demonstrating compensatory effects. Although highly labor intensive, the use of electrofishing to eradicate brook trout may be feasible in Bogard Spring Creek and similar small streams if removal and monitoring are continued annually and if other control measures (e.g., construction of barriers) are implemented. Our evidence shows that if brook trout control measures continue and if only Eagle Lake rainbow trout are allowed access to the creek, then a self-sustaining population of Eagle Lake rainbow trout can become reestablished.

Relevance: 20.00%

Publisher:

Abstract:

Among increasingly used pharmaceutical products, β-blockers have commonly been reported at low concentrations in rivers and littoral waters of Europe and North America. Little is known about the toxicity of these chemicals in freshwater ecosystems, while their presence may lead to chronic pollution. Hence, in this study the acute toxicity of three β-blockers (metoprolol, propranolol and atenolol) on fluvial biofilms was assessed using several biomarkers. Some were indicative of potential alterations in biofilm algae (photosynthetic efficiency), and others in biofilm bacteria (peptidase activity, bacterial mortality). Propranolol was the most toxic β-blocker, mostly affecting the algal photosynthetic process: exposure to 531 μg/L of propranolol inhibited photosynthesis by 85% after 24 h. Metoprolol was particularly toxic to bacteria. Although the estimated no-effect concentrations (NEC) were similar to environmental concentrations, higher concentrations of the toxicant (503 μg/L metoprolol) caused a 50% increase in bacterial mortality. Atenolol was the least toxic of the three tested β-blockers; effects greater than 50% were only observed at a very high concentration (707 mg/L). The higher toxicity of metoprolol and propranolol might be due to better absorption of these two chemicals within biofilms. Since β-blockers are mainly found as mixtures in rivers, their differential toxicity could have potentially relevant consequences for the interactions between algae and bacteria within river biofilms.

Relevance: 20.00%

Publisher:

Abstract:

One of the techniques used to detect faults in dynamic systems is analytical redundancy. An important difficulty in applying this technique to real systems is dealing with the uncertainties associated with the system itself and with the measurements. In this paper, this uncertainty is taken into account by using intervals for the model parameters and for the measurements. The proposed method checks the consistency between the system's behavior, obtained from the measurements, and the model's behavior; if they are inconsistent, there is a fault. The fault detection problem is stated as a quantified real constraint satisfaction problem, which can be solved using modal interval analysis (MIA). MIA is used because it provides powerful tools to extend calculations over real functions to intervals. To improve the fault detection results, the simultaneous use of several sliding time windows is proposed. The result of implementing this method is semiqualitative tracking (SQualTrack), a fault-detection tool that is robust in the sense that it does not generate false alarms, i.e., if there are false alarms, they indicate either that the interval model does not represent the system adequately or that the interval measurements do not represent the true values of the variables adequately. SQualTrack is currently being used to detect faults in real processes. Some of these applications, using real data, have been developed within the European project Advanced Decision Support System for Chemical/Petrochemical Manufacturing Processes and are also described in this paper.
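A toy sketch of the consistency test, not of MIA itself: interval parameters of a simple discrete-time model are propagated one step ahead, and a fault is flagged only when the measured interval fails to intersect the predicted envelope over a whole sliding window. The model, parameter intervals and window length are illustrative assumptions.

```python
from collections import deque

def interval_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def interval_mul(k, a):
    # Envelope of all endpoint products (valid even with negative endpoints).
    products = [k[0] * a[0], k[0] * a[1], k[1] * a[0], k[1] * a[1]]
    return (min(products), max(products))

def predict(x_meas, u_meas, a, b):
    """One-step envelope for x[k+1] = a*x[k] + b*u[k] with interval a and b."""
    return interval_add(interval_mul(a, x_meas), interval_mul(b, u_meas))

def intersects(p, q):
    return p[0] <= q[1] and q[0] <= p[1]

A, B = (0.78, 0.82), (0.18, 0.22)     # interval model parameters (illustrative)
window = deque(maxlen=3)              # sliding window of consistency checks

def check(x_now, u_now, x_next):
    """Return True if a fault is detected over the sliding window."""
    window.append(not intersects(predict(x_now, u_now, A, B), x_next))
    return all(window) and len(window) == window.maxlen   # persistent inconsistency

# One consistent step, then three inconsistent ones; the fault is flagged only
# once the inconsistency persists over the whole window.
print(check((1.0, 1.1), (0.5, 0.6), (0.85, 1.0)))     # False
for _ in range(3):
    print(check((1.0, 1.1), (0.5, 0.6), (2.5, 2.6)))  # False, False, True
```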

Relevance: 20.00%

Publisher:

Abstract:

The present article stems from a doctoral thesis on the digital learner portfolio, an innovative methodology from the perspective of the European Higher Education Area. First, the educational concept of the eportfolio is described in terms of its procedure and structure, supported technologically by a virtual campus platform. Second, the pedagogical model of an eportfolio is presented, which adapts subjects of an instrumental character to an organization based on tasks and reflections. This virtual learning environment design is based on a teaching-learning methodology grounded in student activity, which aims to support students in managing their own learning and assessment processes. Finally, the article describes the implementation of the first digital learner portfolios at the University of Barcelona and the Autonomous University of Barcelona, reflecting on the pedagogical consequences that this technologically supported assessment model has in a traditional higher education institution.

Relevance: 20.00%

Publisher:

Abstract:

The objective of this Master's thesis (TFM) is to explore the possibilities of the mathematical program MATLAB and its Graphical User Interface Development Environment (GUIDE) tool by developing an image-analysis program for metallographic specimens that can be used in laboratory sessions of the Tecnologia de Materials course of the Grau en Enginyeria Mecatrònica degree taught at the Universitat de Vic. The areas of interest of the work are virtual instrumentation, MATLAB programming and metallographic image-analysis techniques. The report places special emphasis on the design of the interface and of the measurement procedures. The final result is a program that satisfies all the requirements set out in the initial proposal. The program's interface is clear and clean, devoting most of the space to the image being analysed. The structure and layout of the menus and controls make the program easy and intuitive to use. The program has been structured so that it can easily be extended with additional measurement routines or with the automation of the existing ones. Since the program works as a measuring instrument, a full chapter of the report is devoted to the procedure for calculating the errors that arise during its use, in order to know their order of magnitude and to be able to recalculate them if the conditions of use change. Regarding programming, although MATLAB is not a classical programming environment, it does include tools for building applications of moderate complexity, oriented mainly towards graphics or images. The GUIDE tool simplifies the creation of the user interface, although it has trouble handling somewhat complex designs. Moreover, the code generated by GUIDE is not accessible, which prevents modifying the interface manually in the cases where GUIDE has problems. Despite these minor issues, MATLAB's computing power more than compensates for these shortcomings.

Relevance: 20.00%

Publisher:

Abstract:

Tablet weaving is a versatile technique for weaving, for example, patterned bands, but designing new pattern motifs, or, for a beginner, reproducing existing pattern designs for which no instructions exist, can easily become laborious because of the characteristics of the technique. The goal of this work was to develop a software tool to help with these problems by automating the search for a weaving instruction matching a target pattern specified by the user. A genetic algorithm was chosen as the basis of the solution method, so the central research problem of the work was to map the interdependencies between the parameters of the algorithm's basic operations and the complexity of the target pattern, sufficiently to give working parameter-value recommendations for the program's future practical use. The work did not develop evolution operations tailored to the application domain, but concentrated on creating, from well-known elements, a foundation that can later be developed further.
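Not the thesis program itself, but a minimal generic genetic-algorithm skeleton of the kind it builds on: candidate "weaving instructions" are encoded as bit strings, and tournament selection, one-point crossover and bit-flip mutation drive the fitness (here simply the match against a target pattern) upward. The encoding and all parameter values are illustrative assumptions.

```python
import random

random.seed(0)

TARGET = [random.randint(0, 1) for _ in range(40)]   # stand-in for a target pattern
POP_SIZE, GENERATIONS, MUTATION_RATE, TOURNAMENT = 60, 200, 0.02, 3

def fitness(candidate):
    """Number of pattern cells matching the target (to be maximised)."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def tournament_select(population):
    return max(random.sample(population, TOURNAMENT), key=fitness)

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(candidate):
    return [1 - g if random.random() < MUTATION_RATE else g for g in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    population = [mutate(crossover(tournament_select(population),
                                   tournament_select(population)))
                  for _ in range(POP_SIZE)]
    best = max(population, key=fitness)
    if fitness(best) == len(TARGET):
        break

print(f"best match after {generation + 1} generations: {fitness(best)}/{len(TARGET)}")
```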