974 results for Jaw Fixation Techniques
Resumo:
The aim of this thesis was to design and develop spatially adaptive denoising techniques with edge and feature preservation, for images corrupted with additive white Gaussian noise and for SAR images affected by speckle noise. Image denoising is a well-researched topic with multifaceted applications in day-to-day life. Denoising based on multiresolution analysis using the wavelet transform has received considerable attention in recent years. The directionlet-based denoising schemes presented in this thesis are effective at preserving image-specific features such as edges and contours during denoising. The scope of this research remains open in areas such as further optimization in terms of speed and extension of the techniques to related areas such as colour-image and video denoising; such studies would further augment the practical use of these techniques.
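The wavelet-thresholding idea behind such denoising schemes can be sketched in a few lines. This is a generic illustration only (a single-level 1-D Haar transform with soft thresholding), not the spatially adaptive directionlet schemes of the thesis; the signal and threshold value are made up.

```python
def haar_step(x):
    """One level of the orthonormal Haar transform: averages and details."""
    s = 2 ** 0.5
    avg = [(a + b) / s for a, b in zip(x[0::2], x[1::2])]
    det = [(a - b) / s for a, b in zip(x[0::2], x[1::2])]
    return avg, det

def inv_haar_step(avg, det):
    """Inverse of haar_step."""
    s = 2 ** 0.5
    out = []
    for a, d in zip(avg, det):
        out += [(a + d) / s, (a - d) / s]
    return out

def soft(coeffs, t):
    """Soft thresholding: shrink each coefficient toward zero by t."""
    return [max(abs(v) - t, 0.0) * (1 if v >= 0 else -1) for v in coeffs]

def denoise(x, t):
    """Threshold the detail coefficients, keep the smooth part."""
    avg, det = haar_step(x)
    return inv_haar_step(avg, soft(det, t))

noisy = [1.0, 1.1, 0.9, 1.0, 5.0, 5.1, 4.9, 5.0]
print(denoise(noisy, 0.1))  # small oscillations are removed, the edge at 1->5 survives
```

Note how the large jump between the two halves of the signal lives in the averages, not the details, so thresholding suppresses the noise while leaving the edge intact; this is the edge-preservation property that the thesis develops much further with directionlets.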
Resumo:
We present a new algorithm, called TITANIC, for computing concept lattices. It is based on data-mining techniques for computing frequent itemsets. The algorithm is evaluated experimentally and compared with B. Ganter's Next-Closure algorithm.
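As a loose illustration of the frequent-itemset machinery such algorithms build on, here is a generic Apriori-style levelwise search. This is not TITANIC itself, which additionally computes key sets and their closures to build the concept lattice; the transactions below are made up.

```python
from itertools import combinations

def frequent_itemsets(transactions, minsupp):
    """Return every itemset whose support count is >= minsupp (levelwise search)."""
    items = sorted({i for t in transactions for i in t})
    candidates = [frozenset([i]) for i in items]
    result, k = {}, 1
    while candidates:
        # Count the support of each candidate in one pass over the transactions.
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= minsupp}
        result.update(frequent)
        keys = list(frequent)
        # Join step: keep only (k+1)-sets all of whose k-subsets are frequent.
        candidates = list({a | b for a in keys for b in keys
                           if len(a | b) == k + 1
                           and all(frozenset(s) in frequent
                                   for s in combinations(a | b, k))})
        k += 1
    return result

tx = [{"a", "b"}, {"a", "c"}, {"a", "b", "c"}]
print(frequent_itemsets(tx, 2))
```

The pruning in the join step is the key efficiency idea: any superset of an infrequent itemset is itself infrequent, so whole branches of the search space are never counted.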
Resumo:
The surge in the urban population evident in most developing countries is a worldwide phenomenon, often the result of drought, conflicts, poverty and the lack of educational opportunities. In parallel with the growth of the cities grows the need for food, which leads to the burgeoning expansion of urban and peri-urban agriculture (UPA). In this context, urban agriculture (UA) contributes significantly to supplying local markets with both vegetable and animal produce. As an income-generating activity, UA also contributes to the livelihoods of poor urban dwellers. In order to evaluate the nutrient status of urban soils in relation to garden management, this study assessed nutrient fluxes (inputs and outputs) in gardens on urban Gerif soils on the banks of the River Nile in Khartoum, the capital city of Sudan. To achieve this objective, a preliminary baseline survey was carried out to describe the structure of the existing garden systems. In cooperation with the author of another PhD thesis (Ms. Ishtiag Abdalla), alternative uses of cow dung in brick-making kilns in urban Khartoum were also examined: the socio-economic characteristics of the brick-kiln owners or agents, the economic and plant-nutritional value of animal dung, and the gaseous emissions related to brick-making activities were assessed. A total of 40 household heads were interviewed using a semi-structured questionnaire to collect information on the demographic, socio-economic and migratory characteristics of the household members, the gardening systems used and the problems encountered in urban gardening. Based on the results of this survey, gardens were divided into three groups: mixed vegetable-fodder gardens, mixed vegetable-subsistence livestock gardens and pure vegetable gardens. The results revealed that UA is the exclusive domain of men, 80% of them non-native to Khartoum. The harvested produce in all gardens was market oriented and represented the main source of income for 83% of the gardeners.
Fast-growing leafy vegetables such as Jew's mallow (Corchorous olitorius L.), purslane (Portulaca oleracea L.) and rocket (Eruca sativa Mill.) were the dominant cultivated species. Most of the gardens (95%) were continuously cultivated throughout the year without any fallow period, unless they were flooded. Gardeners were generally unaware of the importance of crop diversity, which may help them overcome the strongly fluctuating market prices for their produce and thereby strengthen the contribution of UA to the overall productivity of the city. To measure nutrient fluxes, four gardens were selected and all plots in each garden were monitored to quantify nutrient input and output flows. To determine soil chemical fertility parameters in each of the studied gardens, soil samples were taken from three selected plots at the beginning of the study in October 2007 (gardens L1, L2 and H1) and in April 2008 (garden H2), and at the end of the study period in March 2010. Additional soil sampling occurred in May 2009 to assess changes in the soil nutrient status after the River Nile flood of 2008 had receded. Samples of rain and irrigation water (river and well water) were analyzed for nitrogen (N), phosphorus (P), potassium (K) and carbon (C) content to determine their nutrient inputs. Catchment traps were installed to quantify the sediment yield from the River Nile flood. To quantify the nutrient inputs of sediments, samples were analyzed for N, P, K and organic carbon (Corg) content, cation exchange capacity (CEC) and particle size distribution. The total nutrient inputs were calculated by multiplying the sediment nutrient content by the total sediment deposits on individual gardens. Nutrient output in the form of harvested yield was quantified at the harvest of each crop. Plant samples from each field were dried and analyzed for their N, P, K and Corg content.
Cumulative leaching losses of mineral N and P were estimated in a single plot in garden L1 from 1 December 2008 to 1 July 2009 using 12 ion-exchange resin cartridges. Nutrients were extracted and analyzed for nitrate (NO3-N), ammonium (NH4-N) and phosphate (PO4-P). Changes in the soil nutrient balance were assessed as inputs minus outputs. The results showed that across gardens, soil N and P concentrations increased from 2007 to 2009, while particle size distribution remained unchanged. Sediment loads and their respective contents of N, P and Corg decreased significantly (P < 0.05) from the gardens of the downstream lowlands (L1 and L2) to the gardens of the upstream highlands (H1 and H2). No significant difference was found in K deposits. None of the gardens received organic fertilizers, and the only mineral fertilizer applied was urea (46-0-0), which accounted for 29, 30, 54 and 67% of total N inputs to gardens L1, L2, H1 and H2, respectively. Sediment deposits of the River Nile floods contributed on average 67, 94, 6 and 42% of the total N, P, K and C inputs in lowland gardens and 33, 86, 4 and 37% of total N, P, K and C inputs in highland gardens. Irrigation water and rainfall contributed substantially to K inputs, representing 96, 92, 94 and 96% of total K influxes in gardens L1, L2, H1 and H2, respectively. Following the same order, total annual DM yields in the gardens were 26, 18, 16 and 1.8 t ha-1. Annual leaching losses were estimated to be 0.02 kg NH4-N ha-1 (SE = 0.004), 0.03 kg NO3-N ha-1 (SE = 0.002) and 0.005 kg PO4-P ha-1 (SE = 0.0007). Differences between nutrient inputs and outputs indicated negative nutrient balances for P and K and positive balances for N and C in all gardens. The negative balances in P and K call for the adoption of new agricultural techniques such as regular manure additions or mulching, which may enhance the soil organic matter status.
A quantification of fluxes not measured in our study, such as N2 fixation, dry deposition and gaseous emissions of C and N, would be necessary to comprehensively assess the sustainability of these intensive gardening systems. The second part of the survey dealt with the brick-making kilns. A total of 50 brick-kiln owners or agents were interviewed from July to August 2009, using a semi-structured questionnaire. The data collected included general information such as age, family size, education, land ownership, number of kilns managed and/or owned, number of months the kilns were in operation, quantity of inputs (cow dung and fuel wood) used, and prices of inputs and products across the production season. Information related to the share value of the land on which the kilns were built, the annual income of urban farmers and the annual returns from dung for the animal raisers was also collected. Using descriptive statistics, budget calculations and the Gini coefficient, the results indicated that renting the land to brick-making kilns yields a 5-fold higher return than renting it for agriculture. The Gini coefficient showed that the kiln owners had a more equal income distribution than the farmers. To estimate emissions of greenhouse gases (GHGs) and losses of N, P, K, Corg and DM from cow dung used in brick making, samples of cow dung (loose and compacted) were collected from different kilns and analyzed for their N, P, K and Corg content. The procedure modified by the Intergovernmental Panel on Climate Change (IPCC, 1994) was used to estimate the gaseous emissions of cow dung and fuel wood. The amount of deforested wood was estimated according to the default values for wood density given by Dixon et al. (1991) and the expansion ratio for branches and small trees given by Brown et al. (1989). The data showed that the monetary value of the N and P added via cow dung was lower than that of mineral fertilizers.
Annual consumption of compacted dung (381 t DM) as biomass fuel by far exceeded the consumption of fuel wood (36 t DM). Gaseous emissions from cow dung and fuel wood were dominated by CO2, CO and CH4. Considering that Gerif land in urban Khartoum supports a multifunctional land-use system, efficient use of natural resources (forest, dung, land and water) will enhance the sustainability of UA and brick-making activities. Adoption of new kilns with higher energy efficiency would reduce the amount of biomass fuel (cow dung and wood) used, the amount of GHGs emitted and the threat to the few remaining forests.
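The balance computation used in the garden study above (inputs minus outputs, per nutrient) is straightforward to sketch. All figures below are illustrative placeholders, not measurements from the study.

```python
def nutrient_balance(inputs, outputs):
    """inputs/outputs: {source: {nutrient: kg_per_ha}} -> {nutrient: balance}."""
    nutrients = {n for flows in (inputs, outputs) for f in flows.values() for n in f}
    total_in = {n: sum(f.get(n, 0.0) for f in inputs.values()) for n in nutrients}
    total_out = {n: sum(f.get(n, 0.0) for f in outputs.values()) for n in nutrients}
    return {n: total_in[n] - total_out[n] for n in sorted(nutrients)}

# Hypothetical annual flows (kg/ha) for one garden.
inputs = {"urea": {"N": 60.0},
          "sediment": {"N": 40.0, "P": 15.0, "K": 2.0},
          "irrigation": {"K": 50.0}}
outputs = {"harvest": {"N": 80.0, "P": 20.0, "K": 60.0},
           "leaching": {"N": 0.05, "P": 0.005}}
print(nutrient_balance(inputs, outputs))
# e.g. N: (60 + 40) - 80.05 = +19.95, while P and K come out negative,
# mirroring the sign pattern reported in the abstract.
```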
Resumo:
The increasing interconnection of information and communication systems leads to ever greater complexity and thus to a growing number of security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer adequate protection against intrusions into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a highly effective instrument for protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in optimally processing the vast amounts of network data and in developing an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of the normal network state (NNB) and an update model. Within OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analyzed and converted into connection vectors.
To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. This dissertation proposes and discusses several approaches: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growth topology is increased by novel approaches for initializing the weight vectors and by strengthening the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously up to date. Furthermore, the main task of the NNB model is to examine the unknown connections detected by the EGHSOM more closely and to verify whether they are in fact normal. However, network traffic data change constantly due to the concept-drift phenomenon, which leads to non-stationary network data in real time; this phenomenon is handled by the update model. The EGHSOM model can effectively detect new anomalies, and the NNB model adapts well to changes in the network data. In the experimental evaluation, the framework showed promising results. In the first experiment, the framework was evaluated in offline mode: OptiFilter was assessed with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy. In the second experiment, the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the vast amount of network data into structured connection vectors, and the adaptive classifier classified them precisely.
A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all of them. This can be attributed to the following key points: the processing of the collected network data, the best overall performance achieved (e.g. overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
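The anomaly-detection idea behind a SOM-based classifier can be illustrated with a deliberately tiny sketch (ours, not the dissertation's EGHSOM): a small self-organizing map is trained on vectors describing normal traffic, and a connection is flagged when its quantization error exceeds a margin threshold. All data and parameters are made up.

```python
def dist2(u, x):
    """Squared Euclidean distance between a unit's weight vector and an input."""
    return sum((a - b) ** 2 for a, b in zip(u, x))

def train_som(data, n_units=4, epochs=50, lr=0.5):
    """Train a tiny 1-D SOM on vectors describing normal connections."""
    # Deterministic init: spread the units over the training data.
    units = [list(data[k * (len(data) - 1) // (n_units - 1)]) for k in range(n_units)]
    for e in range(epochs):
        rate = lr * (1 - e / epochs)          # decaying learning rate
        for x in data:
            b = min(range(n_units), key=lambda i: dist2(units[i], x))
            for i in range(n_units):
                # Winner moves fully, 1-D neighbours move half as much.
                h = 1.0 if i == b else (0.5 if abs(i - b) == 1 else 0.0)
                units[i] = [w + rate * h * (xj - w) for w, xj in zip(units[i], x)]
    return units

def is_anomalous(units, x, margin):
    """Flag x when its quantization error exceeds the margin threshold."""
    return min(dist2(u, x) for u in units) ** 0.5 > margin

normal = [[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [1.0, 1.0], [1.1, 0.9], [0.9, 1.1]]
units = train_som(normal)
print(is_anomalous(units, [10.0, 10.0], 1.0))  # far from all learned prototypes
```

A growing hierarchical SOM such as EGHSOM additionally grows and nests maps to match the data's structure; the margin-threshold test above corresponds only to the classification-confidence idea.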
Resumo:
Unidad didáctica de Inglés elaborada a partir de un tema transversal: el racismo. Este tema crea un clima positivo de respeto y colaboración que facilita el trabajo en equipo. Se resalta el papel que la lengua extranjera tiene como instrumento de comunicación y de cooperación entre los distintos paises y pueblos. La unidad cubre las cuatro detrezas comunicativas: listening, speaking, reading y writing para los niveles superiores de la Enseñanza Secundaria, a través de materiales audiovisuales reales, no creados por el profesorado para la ocasión (publicación audiovisual Speak-up, revista TIME, etc.).
Resumo:
Signalling off-chip requires significant current. As a result, a chip's power-supply current changes drastically during certain output-bus transitions. These current fluctuations cause a voltage drop between the chip and circuit board due to the parasitic inductance of the power-supply package leads. Digital designers often go to great lengths to reduce this "transmitted" noise. Cray, for instance, carefully balances output signals using a technique called differential signalling to guarantee a chip has constant output current. Transmitted-noise reduction costs Cray a factor of two in output pins and wires. Coding achieves similar results at smaller costs.
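One concrete way coding can hold the output current constant (our illustration; the paper's actual codes may differ) is a constant-weight code: every codeword drives the same number of lines high, so bus transitions never change the total supply current, at a cost of a few extra wires rather than Cray's factor of two.

```python
from itertools import combinations

def constant_weight_code(n, w):
    """All n-bit words with exactly w ones; each word drives the same
    number of lines high, so the output current is constant."""
    return sorted(sum(1 << i for i in ones) for ones in combinations(range(n), w))

# Hypothetical sizing: 8 wires at weight 4 give C(8,4) = 70 codewords,
# enough to carry 6 bits (64 symbols) per transfer -- 8 wires for 6 bits,
# versus 12 wires for 6 bits with fully differential signalling.
code = constant_weight_code(8, 4)
print(len(code))
```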
Resumo:
This paper presents an image-based rendering system using algebraic relations between different views of an object. The system uses pictures of an object taken from known positions. Given such example images, it can generate "virtual" ones, showing the object as it would look from any position near those from which the input images were taken. The extrapolation from the example images can be up to about 60 degrees of rotation. The system is based on the trilinear constraints that bind any three views of an object. As a side result, we propose two new methods for camera calibration; we developed and used one of them. We implemented the system and tested it on real images of objects and faces. We also show experimentally that even when only two images taken from unknown positions are given, the system can be used to render the object from other viewpoints, as long as we have a good estimate of the internal parameters of the camera used and we are able to find good correspondences between the example images. In addition, we present the relation between these algebraic constraints and a factorization method for shape and motion estimation, and as a result we propose a method for motion estimation in the special case of orthographic projection.
Resumo:
The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases is the use of so-called ipsative items, which require the respondent to make between-scale comparisons within each item; the selected option determines the scale to which the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items, every respondent is allotted an equal amount, i.e. the same total score, which each can distribute differently over the scales. This type of response format therefore yields data that can be considered compositional from its inception. Methodologically oriented psychologists have heavily criticized this item format, since the resulting data are marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline for when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, which were administered to 502 first-year psychology students according to a balanced within-subjects design. Previous research only compared the direct ipsative scale scores with the derived ipsative scale scores; the use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of this second comparison not only offers the advantage of a better-balanced research strategy; in principle, it also allows for parametric testing in the evaluation.
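Compositional data analysis treats such constant-sum scores as points on a simplex; a minimal sketch of its two standard tools (closure to a constant sum, and the centred log-ratio transform that maps compositions into real space where ordinary multivariate statistics apply), with made-up scale scores:

```python
from math import exp, log

def closure(scores, total=1.0):
    """Rescale ipsative scale scores to a constant sum (a point on the simplex)."""
    s = sum(scores)
    return [total * v / s for v in scores]

def clr(composition):
    """Centred log-ratio transform: log of each part over the geometric mean.
    The result sums to zero and lives in ordinary real space."""
    g = exp(sum(log(v) for v in composition) / len(composition))
    return [log(v / g) for v in composition]

raw = [30.0, 20.0, 50.0]        # hypothetical points allotted to three scales
print(clr(closure(raw)))        # identical to clr(raw): clr is scale-invariant
```

The scale invariance shown in the last line is exactly why ratio-based methods make ipsative (forced total) and normative scores comparable.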
Resumo:
In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under this unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002-03. The aim of these surveys is to understand human behavior and people's lifestyles. Time allocation data are compositional in origin, that is, they are subject to non-negativity and constant-sum constraints, so standard multivariate techniques cannot be applied to them directly. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where each object is allocated to a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Concretely, the probabilistic fuzzy c-means algorithm is adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn. Key words: time use data; fuzzy clustering; FCM; simplex space; Aitchison distance
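A minimal fuzzy c-means sketch illustrates the membership idea described above. This is generic FCM with Euclidean distance, not the adapted, Aitchison-distance version of the paper; the data and initialization are made up.

```python
def dist(a, b):
    """Euclidean distance (the paper adapts this to the Aitchison distance)."""
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def fcm(data, c=2, m=2.0, iters=100):
    """Probabilistic fuzzy c-means: returns cluster centres and memberships."""
    # Deterministic init: spread the initial centres over the data.
    centres = [list(data[k * (len(data) - 1) // (c - 1)]) for k in range(c)]
    u = []
    for _ in range(iters):
        # Membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1)).
        u = [[1.0 / sum((max(dist(x, centres[i]), 1e-12)
                         / max(dist(x, centres[j]), 1e-12)) ** (2.0 / (m - 1.0))
                        for j in range(c))
              for i in range(c)] for x in data]
        # Centre update: mean of the data weighted by u^m.
        centres = [[sum(u[k][i] ** m * data[k][d] for k in range(len(data)))
                    / sum(u[k][i] ** m for k in range(len(data)))
                    for d in range(len(data[0]))] for i in range(c)]
    return centres, u

pts = [[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]]
centres, u = fcm(pts)
print(sorted(v[0] for v in centres))  # one centre per cluster
```

Each row of `u` sums to one, so borderline Communities can carry appreciable membership in several activity-pattern groups instead of being forced into one.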
Resumo:
In order to obtain a high-resolution Pleistocene stratigraphy, eleven continuously cored boreholes, 100 to 220 m deep, were drilled in the northern part of the Po Plain by Regione Lombardia in the last five years. Quantitative provenance analysis (QPA; Weltje and von Eynatten, 2004) of Pleistocene sands was carried out using multivariate statistical analysis (principal component analysis, PCA, and similarity analysis) on an integrated data set, including high-resolution bulk-petrography and heavy-mineral analyses of the Pleistocene sands and of sands from 250 major and minor modern rivers draining the southern flank of the Alps from west to east (Garzanti et al., 2004, 2006). Prior to the onset of major Alpine glaciations, metamorphic and quartzofeldspathic detritus from the Western and Central Alps was carried from the axial belt to the Po basin by a trunk river flowing longitudinally, parallel to the South-Alpine belt (Vezzoli and Garzanti, 2008). This scenario changed rapidly during marine isotope stage 22 (0.87 Ma), with the onset of the first major Pleistocene glaciation in the Alps (Muttoni et al., 2003). PCA and similarity analysis of the core samples show that the longitudinal trunk river was at this time shifted southward by the rapid southward and westward progradation of transverse alluvial river systems fed from the Central and Southern Alps. Sediments were transported southward by braided river systems, while glacial sediments carried by Alpine valley glaciers invaded the alluvial plain. Key words: detrital modes; modern sands; provenance; principal component analysis; similarity; Canberra distance; palaeodrainage
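The PCA step in such provenance work can be sketched generically; here, the first principal component is found by power iteration on the covariance matrix. The data are illustrative toy numbers, not the petrographic data set.

```python
def first_pc(data, iters=200):
    """First principal component via power iteration on the covariance matrix."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    X = [[row[j] - means[j] for j in range(p)] for row in data]     # centre
    cov = [[sum(X[k][i] * X[k][j] for k in range(n)) / (n - 1)
            for j in range(p)] for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]        # renormalize each iteration
    return v

# Toy "compositional" samples dominated by variation in the first variable.
samples = [[0.0, 0.0], [1.0, 0.1], [2.0, 0.2], [3.0, 0.3], [4.0, 0.4]]
print(first_pc(samples))  # unit vector pointing along the dominant trend
```

Projecting samples onto this axis is what lets core sands be compared with the modern-river end members; the paper pairs this with a Canberra-distance similarity analysis.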
Resumo:
The practical performance of analytical redundancy for fault detection and diagnosis is often decreased by uncertainties prevailing not only in the system model but also in the measurements. In this paper, the problem of fault detection is stated as a constraint satisfaction problem over continuous domains with a large number of variables and constraints. This problem can be solved using modal interval analysis and consistency techniques. Consistency techniques are shown to be particularly efficient for checking the consistency of the analytical redundancy relations (ARRs) while dealing with uncertain measurements and parameters. Through the work presented in this paper, it can be observed that consistency techniques can be used to increase the performance of a robust fault detection tool based on interval arithmetic. The proposed method is illustrated using a nonlinear dynamic model of a hydraulic system.
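The core interval-arithmetic test can be sketched very simply: an ARR residual is evaluated over the measurement and parameter intervals, and a fault is signalled only when the residual interval excludes zero. The model, gain and intervals below are illustrative, not from the paper (which uses modal intervals and consistency filtering on top of this basic idea).

```python
class Interval:
    """Plain closed interval with the arithmetic needed for a simple ARR."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, o):
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o):
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        ps = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(ps), max(ps))
    def contains(self, v):
        return self.lo <= v <= self.hi

def consistent(y_meas, u_meas, gain):
    """Static ARR r = y - gain*u: consistent iff the residual interval contains 0."""
    r = y_meas - gain * u_meas
    return r.contains(0.0)

y = Interval(1.9, 2.1)       # uncertain output measurement
u = Interval(0.9, 1.1)       # uncertain input measurement
g = Interval(1.8, 2.2)       # uncertain model parameter
print(consistent(y, u, g))   # no fault detectable within the uncertainty
```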
Resumo:
This work provides a general description of the multi-sensor data fusion concept, along with a new classification of currently used sensor fusion techniques for unmanned underwater vehicles (UUVs). Unlike previous proposals that base the classification on the sensors involved in the fusion, we propose a synthetic approach focused on the techniques involved and their applications in UUV navigation. We believe that our approach is better oriented towards the development of sensor fusion systems, since a sensor fusion architecture should first of all be focused on its goals and only then on the fused sensors.
Resumo:
Obtaining an automatic 3D profile of objects is one of the most important issues in computer vision. With this information, a large number of applications become feasible: from visual inspection of industrial parts to 3D reconstruction of the environment for mobile robots. In order to obtain 3D data, range finders can be used. The coded structured light approach is one of the most widely used techniques for retrieving the 3D information of an unknown surface. An overview of the existing techniques, as well as a new classification of patterns for structured light sensors, is presented. Systems of this kind belong to the group of active triangulation methods, which are based on projecting a light pattern and imaging the illuminated scene from one or more points of view. Since the patterns are coded, correspondences between points of the image(s) and points of the projected pattern can easily be found. Once correspondences are found, a classical triangulation strategy between the camera(s) and the projector device leads to the reconstruction of the surface. Advantages and constraints of the different patterns are discussed.
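Once a correspondence has been decoded, the triangulation reduces to intersecting a camera ray with a projector ray. A minimal planar sketch (geometry simplified to 2-D; function and parameter names are ours):

```python
from math import radians, tan

def triangulate(baseline, cam_angle_deg, proj_angle_deg):
    """Active triangulation in the plane: camera at the origin, projector at
    x = baseline, both angles measured from the optical (z) axis toward each
    other. The camera ray x = z*tan(cam) and the projector ray
    x = baseline - z*tan(proj) intersect at the illuminated surface point."""
    tc = tan(radians(cam_angle_deg))
    tp = tan(radians(proj_angle_deg))
    z = baseline / (tc + tp)     # depth of the point
    return z * tc, z             # (lateral position, depth)

# Symmetric 45-degree rays over a unit baseline meet at (0.5, 0.5).
print(triangulate(1.0, 45.0, 45.0))
```

The coding of the projected pattern is what supplies `proj_angle_deg` for each pixel without any search; that is the correspondence problem the surveyed patterns solve.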
Resumo:
The absolute necessity of obtaining 3D information of structured and unknown environments in autonomous navigation considerably reduces the set of usable sensors. The position of the mobile robot with respect to the scene must be known at every moment, and this information must be obtained with minimal computing time. Stereo vision is an attractive and widely used method, but it is rather limited for making fast 3D surface maps due to the correspondence problem. The spatial and temporal correspondence among images can be alleviated using a method based on structured light: the relationship can be found directly by codifying the projected light, so that each imaged region of the projected pattern carries the information needed to solve the correspondence problem. We present the most significant techniques of recent years concerning the coded structured light method.