862 results for multi-system
Abstract:
This paper describes a system for the construction of visual maps ("mosaics") and motion estimation for a fleet of AUVs (Autonomous Underwater Vehicles). Each robot is equipped with a down-looking camera, which is used to estimate its motion with respect to the seafloor and to build an online mosaic. As the mosaic grows in size, a systematic bias is introduced in its alignment, resulting in an erroneous output. The theoretical concepts associated with the use of an Augmented State Kalman Filter (ASKF) were applied to optimally estimate both the visual map and the fleet's position.
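To make the state-augmentation idea concrete, here is a minimal planar sketch in Python (numpy): the state holds the vehicle position plus the positions of images anchored in the mosaic, dead reckoning inflates uncertainty, and re-registering a stored image corrects the accumulated drift. The 2-D linear model, noise values and loop trajectory are invented for illustration; the paper's ASKF estimates full poses for a whole fleet.

    import numpy as np

    # Minimal planar ASKF sketch: state = [vehicle x, y | image_0 x, y | ...].
    # All noise values and motions below are invented for illustration.

    def predict(x, P, u, q=1e-3):
        # Dead reckoning: only the vehicle block moves and gains process noise.
        x = x.copy()
        x[:2] += u
        Q = np.zeros_like(P)
        Q[:2, :2] = q * np.eye(2)
        return x, P + Q

    def augment(x, P):
        # Anchor the current view in the mosaic: append a copy of the vehicle
        # position to the state, with fully correlated covariance.
        n = x.size
        J = np.vstack([np.eye(n), np.eye(2, n)])
        return J @ x, J @ P @ J.T

    def update(x, P, z, k, r=1e-4):
        # Registration against stored image k measures the relative offset
        # z = x_vehicle - x_image_k + noise.
        n = x.size
        H = np.zeros((2, n))
        H[:, :2] = np.eye(2)
        H[:, 2 + 2 * k:4 + 2 * k] = -np.eye(2)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + r * np.eye(2))
        return x + K @ (z - H @ x), (np.eye(n) - K @ H) @ P

    # Drive a short leg, then re-observe the anchored first image.
    x, P = np.zeros(2), 1e-6 * np.eye(2)
    x, P = augment(x, P)                          # image 0 stored at the origin
    for _ in range(10):
        x, P = predict(x, P, u=np.array([0.1, 0.0]))
    x, P = update(x, P, z=np.array([1.05, -0.02]), k=0)
    print("corrected vehicle position:", np.round(x[:2], 3))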
Abstract:
The Internet of Things (IoT) is still in its infancy and has attracted much interest in many industrial sectors, including medical fields, logistics tracking, smart cities and automobiles. However, as a paradigm, it is susceptible to a range of significant intrusion threats. This paper presents a threat analysis of the IoT and uses an Artificial Neural Network (ANN) to combat these threats. A multi-layer perceptron, a type of supervised ANN, is trained on internet packet traces and then assessed on its ability to thwart Distributed Denial of Service (DDoS/DoS) attacks. This paper focuses on the classification of normal and threat patterns on an IoT network. The ANN procedure is validated against a simulated IoT network. The experimental results demonstrate 99.4% accuracy, and the system can successfully detect various DDoS/DoS attacks.
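As a rough illustration of this pipeline, the following Python sketch trains a small multi-layer perceptron (scikit-learn) to separate normal from flood-like traffic. The synthetic features, network topology and train/test split are assumptions, not the paper's setup.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler

    # Synthetic stand-in for packet-trace features (e.g. rates, sizes,
    # flag counts); the paper's actual features are not reproduced here.
    rng = np.random.default_rng(0)
    X_normal = rng.normal(loc=0.0, scale=1.0, size=(500, 6))
    X_attack = rng.normal(loc=2.5, scale=1.5, size=(500, 6))  # flood-like traffic
    X = np.vstack([X_normal, X_attack])
    y = np.array([0] * 500 + [1] * 500)          # 0 = normal, 1 = DDoS/DoS

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    scaler = StandardScaler().fit(X_tr)

    clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
    clf.fit(scaler.transform(X_tr), y_tr)
    print("test accuracy:", clf.score(scaler.transform(X_te), y_te))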
Abstract:
Selecting a subset of requirements from all the requirements previously defined by customers is an important process, repeated at the beginning of each development step when an incremental or agile software development approach is adopted. The set of selected requirements will be developed during the current iteration. This selection problem can be reformulated as a search problem, allowing its treatment with metaheuristic optimization techniques. This paper studies how to apply Ant Colony Optimization algorithms to requirements selection. First, we describe the problem formally, extending an earlier formulation, and introduce a method based on the Ant Colony System to find a variety of efficient solutions. The performance of the Ant Colony System is compared with that of the Greedy Randomized Adaptive Search Procedure and the Non-dominated Sorting Genetic Algorithm, by means of computational experiments carried out on two instances of the problem constructed from data provided by experts.
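A minimal single-objective sketch of the approach (Python): ants pick requirements under a cost budget with probability proportional to pheromone and a value-per-cost heuristic, and pheromone is evaporated and then reinforced on the best-so-far solution. The instance data, parameters and the reduction to one objective are simplifications of the multi-objective Ant Colony System studied in the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 12
    value = rng.uniform(1, 10, n)       # customer-assigned scores (invented)
    cost = rng.uniform(1, 8, n)         # development costs (invented)
    budget = 25.0

    tau = np.ones(n)                    # pheromone per requirement
    eta = value / cost                  # heuristic: value per unit cost
    alpha, beta, rho, n_ants = 1.0, 2.0, 0.1, 20
    best_set, best_val = [], -np.inf

    for it in range(50):
        for _ in range(n_ants):
            chosen, spent = [], 0.0
            cand = list(range(n))
            while cand:
                feas = [i for i in cand if spent + cost[i] <= budget]
                if not feas:
                    break
                w = np.array([tau[i] ** alpha * eta[i] ** beta for i in feas])
                i = rng.choice(feas, p=w / w.sum())   # probabilistic choice
                chosen.append(i); spent += cost[i]; cand.remove(i)
            val = value[chosen].sum() if chosen else 0.0
            if val > best_val:
                best_val, best_set = val, sorted(chosen)
        tau *= (1 - rho)                         # evaporation
        tau[best_set] += best_val / 100.0        # reinforce best solution

    print("selected requirements:", best_set, "value:", round(best_val, 2))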
Abstract:
This paper focuses on teaching boys, male teachers and the question of gendered pedagogies in neoliberal and postfeminist times marked by the proliferation of new forms of capitalism, multi-mediated technologies and the influence of globalization. It illustrates how a politics of re-masculinization and its reconstitution needs to be understood as set against changing economic and social conditions in which gender equity comes to be re-focused on boys as the 'new disadvantaged'. This re-framing of gender equity, it is argued, has been fuelled both by a media-inspired backlash discourse about 'failing boys' and by a neo-positivist emphasis on numbers derived primarily from standardized testing regimes at both global and national levels. A media-focused analysis of the proliferation of discourses about 'failing boys' vis-à-vis the problem of encroaching feminization in the school system is provided to illuminate how certain truths about the influence of male teachers come to define how the terms of ensuring gender equity are delimited and reduced to a question of gendered pedagogies as grounded in sexed bodies. Historical accounts of the feminization of teaching in the North American context are also provided as a basis for building a more informed understanding of the present, particularly as it relates to the contextualization of policy articulation and enactment regarding the problem of teaching boys. In light of such historically informed and critical media analysis, it is argued that what is needed is a more informed, evidence-based policy articulation of the problem of teaching boys and a more gender-sensitive reflection on the politics of masculinities in postfeminist times.
Abstract:
A multi-residue gas chromatography–mass spectrometry method was developed to evaluate the presence of 39 pesticides of different chemical families (organophosphorus, triazines, imidazoles, organochlorines), as well as some of their transformation products, in surface water samples from Ria de Aveiro. Ria de Aveiro is an estuarine coastal lagoon, located in the north-western region of Portugal, which receives inputs from agricultural, urban and industrial activities. The analytical method was developed and validated according to international guidelines and showed good linearity, with correlation coefficients higher than 0.9949 for all compounds, adequate precision and accuracy, and high sensitivity. Pesticides were chosen from the priority pollutants list of Directive 2008/105/EC of the European Parliament and of the Council (on environmental quality standards in the field of water policy), or were selected because of their common use in agricultural practices. Some of these 39 pesticides are, or are suspected to be, endocrine-disrupting compounds (EDCs), capable of altering the endocrine system of wildlife and humans, causing malfunctions and ultimately health problems. Even those pesticides that are not EDCs are known to be highly toxic and have a recognised impact on human health. The aquatic environment is particularly susceptible to pollution due to the intentional and accidental release of chemicals to water [3]. Pesticide contamination of surface water is a national issue, as surface water is often used for drinking. This concern is especially important in rural agricultural areas, where the population uses small private water supplies, frequently without any laboratory surveillance. The study was performed at seven sampling points, and the results showed pesticide contamination of concern in all samples.
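As an illustration of the linearity criterion cited above (correlation coefficients above 0.9949), a least-squares calibration check for a single pesticide might look like the following Python sketch; the concentration levels and peak areas are invented placeholders, not the method's real calibration data.

    import numpy as np

    # Hypothetical five-point calibration for one pesticide: spiked
    # concentration (ug/L) vs. instrument response (peak area).
    conc = np.array([0.05, 0.10, 0.50, 1.00, 5.00])
    area = np.array([1.02e3, 2.10e3, 1.03e4, 2.05e4, 1.01e5])

    slope, intercept = np.polyfit(conc, area, 1)    # least-squares line
    r = np.corrcoef(conc, area)[0, 1]               # correlation coefficient

    print(f"y = {slope:.1f} x + {intercept:.1f}, r = {r:.4f}")
    print("meets linearity criterion (r > 0.9949):", r > 0.9949)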
Abstract:
Optical full-field measurement methods such as Digital Image Correlation (DIC) provide a new opportunity for measuring deformations and vibrations with high spatial and temporal resolution. However, application to full-scale wind turbines is not trivial: elaborate preparation of the experiment is vital, and sophisticated post-processing of the DIC results is essential. In the present study, a rotor blade of a 3.2 MW wind turbine is equipped with a random black-and-white dot pattern at four different radial positions. Two cameras are located in front of the wind turbine, and the response of the rotor blade is monitored using DIC under different turbine operating conditions. In addition, a Light Detection and Ranging (LiDAR) system is used to measure the wind conditions. Wind fields are created based on the LiDAR measurements and used to perform aeroelastic simulations of the wind turbine by means of advanced multibody codes. The results from the optical DIC system appear plausible when checked against common and expected results. In addition, the comparison of relative out-of-plane blade deflections shows good agreement between the DIC results and the aeroelastic simulations.
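The core DIC matching step can be sketched in a few lines of Python: a speckle subset from a reference frame is matched in a deformed frame by maximising the zero-normalised cross-correlation (ZNCC) over an integer-pixel search window. The synthetic images, subset size and search window are illustrative assumptions; production DIC adds sub-pixel interpolation and stereo calibration.

    import numpy as np

    def zncc(a, b):
        # Zero-normalised cross-correlation between two equal-size subsets.
        a = a - a.mean(); b = b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

    rng = np.random.default_rng(0)
    frame0 = rng.random((120, 120))            # reference speckle image
    shift = (3, -2)                            # "true" motion in pixels
    frame1 = np.roll(frame0, shift, axis=(0, 1))

    y0, x0, s = 50, 50, 15                     # subset centre and half-size
    ref = frame0[y0 - s:y0 + s, x0 - s:x0 + s]

    best, best_d = -2.0, (0, 0)
    for dy in range(-5, 6):                    # integer-pixel search window
        for dx in range(-5, 6):
            cur = frame1[y0 + dy - s:y0 + dy + s, x0 + dx - s:x0 + dx + s]
            c = zncc(ref, cur)
            if c > best:
                best, best_d = c, (dy, dx)

    print("estimated displacement:", best_d)   # expect (3, -2)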
Abstract:
The Pianosa Contourite Depositional System (CDS) is located in the Corsica Trough (Northern Tyrrhenian Sea), a confined basin dominated by mass transport and contour currents on its eastern flank and by turbidity currents on its western flank. The morphologic and stratigraphic characterisation of the Pianosa CDS is based on multibeam bathymetry, seismic reflection data (multi-channel high-resolution mini GI gun, single-channel sparker and CHIRP), sediment cores and ADCP data. The Pianosa CDS is located at shallow to intermediate water depths (170 to 850 m) and is formed under the influence of the Levantine Intermediate Water (LIW). It is 120 km long, has a maximum width of 10 km and is composed of different types of muddy sediment drifts: plastered drift, separated mounded drift, sigmoid drift and multicrested drift. The reduced tectonic activity in the Corsica Trough since the early Pliocene makes it possible to recover a sedimentary record of the contourite depositional system that is influenced only by climate fluctuations. Contourites started to develop in the Middle-Late Pliocene, but their growth was enhanced from the Middle Pleistocene Transition (0.7–0.9 Ma) onwards. Although the general circulation of the LIW, flowing northwards in the Corsica Trough, remained active throughout the history of the system, contourite drift formation changed, controlled by sediment influx and bottom-current velocity. During periods of sea-level fall, fast bottom currents often eroded the drift crest on the middle and upper slope. At those times, the proximity of the coast to the shelf edge favoured the formation of bioclastic sand deposits winnowed by bottom currents. Higher accumulation of mud in the drifts occurred during periods of fast bottom currents and high sediment availability (i.e. high activity of turbidity currents), coincident with sea-level lowstands. Condensed sections formed during sea-level highstands, when bottom currents were more sluggish and the turbidite system was disconnected, resulting in a lower sediment influx.
Abstract:
Background: Among other causes, the long-term outcome of hip prostheses in dogs is determined by aseptic loosening. Prosthesis complications can be prevented by optimizing the tribological system, which ultimately results in improved implant duration. In this context, a computerized model for calculating hip joint loadings during different motions would be of benefit. As a first step in the development of such an inverse dynamic multi-body simulation (MBS) model, we present the setup of a canine hind limb model applicable to the calculation of ground reaction forces. Methods: The anatomical geometries of the MBS model were established using computed tomography (CT) and magnetic resonance imaging (MRI) data. The CT data were collected from the pelvis, femora, tibiae and pads of a mixed-breed adult dog. Geometric information about 22 muscles of the pelvic extremity of 4 mixed-breed adult dogs was determined using MRI. Kinematic and kinetic data obtained by motion analysis of a clinically healthy dog during a gait cycle (1 m/s) on an instrumented treadmill were used to drive the model in the multi-body simulation. Results and Discussion: The vertical ground reaction forces (z-direction) calculated by the MBS system show a maximum deviation from the treadmill measurements of 1.75 %BW (percent body weight) for the left and 4.65 %BW for the right hind limb. The calculated peak ground reaction forces in the z- and y-directions were found to be comparable to the treadmill measurements, whereas the curve characteristics of the forces in the y-direction were not in complete alignment. Conclusion: It could be demonstrated that the developed MBS model is suitable for simulating the ground reaction forces of dogs during walking. In forthcoming investigations the model will be developed further to calculate the forces and moments acting on the hip joint during different movements, which can be of help in the in silico development and testing of hip prostheses.
Abstract:
Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for progressing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts in work with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes down to the actual application of these guidelines, considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented. They are often not very useful for answering speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just the state of the system at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level. Structures emerge at the macro level as a result of the actions of the agents and their interactions with other agents and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated; a toy sketch of this style of model follows.
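The toy sketch below (Python) illustrates the micro-to-macro idea for the break-time question: staff agents either follow a fixed break rota or set their own breaks based on an energy level, and the served-customer count emerges from their joint behaviour. All rules and numbers are invented placeholders; the actual retail model is far richer.

    import random

    random.seed(42)

    class StaffAgent:
        def __init__(self, self_set_breaks):
            self.energy = 1.0
            self.self_set_breaks = self_set_breaks
            self.on_break = False

        def step(self, t):
            if self.self_set_breaks:
                self.on_break = self.energy < 0.3          # agent decides itself
            else:
                self.on_break = t % 60 in range(30, 40)    # fixed rota
            if self.on_break:
                self.energy = min(1.0, self.energy + 0.05)
            else:
                self.energy = max(0.0, self.energy - 0.01)

    def run(self_set_breaks, hours=8, staff_n=5):
        staff = [StaffAgent(self_set_breaks) for _ in range(staff_n)]
        served = 0
        for t in range(hours * 60):                        # one tick per minute
            for s in staff:
                s.step(t)
            arrivals = random.randint(0, 3)                # customers this minute
            capacity = sum(s.energy for s in staff if not s.on_break)
            served += min(arrivals, int(capacity))
        return served

    print("fixed breaks:    ", run(False))
    print("self-set breaks: ", run(True))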
Abstract:
When designing systems that are complex, dynamic and stochastic in nature, simulation is generally recognised as one of the best design support technologies, and a valuable aid in the strategic and tactical decision making process. A simulation model consists of a set of rules that define how a system changes over time, given its current state. Unlike analytical models, a simulation model is not solved but run, and the changes of system states can be observed at any point in time. This provides an insight into system dynamics, rather than just predicting the output of a system based on specific inputs. Simulation is not a decision making tool but a decision support tool, allowing better informed decisions to be made.

Due to the complexity of the real world, a simulation model can only be an approximation of the target system. The essence of the art of simulation modelling is abstraction and simplification: only those characteristics that are important for the study and analysis of the target system should be included in the simulation model. The purpose of simulation is either to better understand the operation of a target system, or to make predictions about a target system’s performance. It can be viewed as an artificial white-room which allows one to gain insight and to test new theories and practices without disrupting the daily routine of the focal organisation. What you can expect to gain from a simulation study is very well summarised by FIRMA (2000): if the theory that has been framed about the target system holds, and if this theory has been adequately translated into a computer model, you can answer some of the following questions:
· Which kind of behaviour can be expected under arbitrarily given parameter combinations and initial conditions?
· Which kind of behaviour will a given target system display in the future?
· Which state will the target system reach in the future?

The required accuracy of the simulation model very much depends on the type of question one is trying to answer. To respond to the first question, the simulation model needs to be an explanatory model, which requires less data accuracy. In comparison, the simulation model required to answer the latter two questions has to be predictive in nature and therefore needs highly accurate input data to achieve credible outputs. Such predictions involve showing trends, rather than giving precise and absolute predictions of the target system’s performance.

The numerical results of a simulation experiment on their own are most often not very useful and need to be rigorously analysed with statistical methods. These results then need to be considered in the context of the real system and interpreted in a qualitative way to make meaningful recommendations or compile best practice guidelines. One needs a good working knowledge of the behaviour of the real system to be able to fully exploit the understanding gained from simulation experiments.

The goal of this chapter is to prepare the newcomer for what we think is a valuable addition to the toolset of analysts and decision makers. We give a summary of information we have gathered from the literature and of the experience we have gained first-hand during the last five years, while obtaining a better understanding of this exciting technology. We hope that this will help you to avoid some pitfalls that we unwittingly encountered.
Section 2 is an introduction to the different types of simulation used in Operational Research and Management Science with a clear focus on agent-based simulation. In Section 3 we outline the theoretical background of multi-agent systems and their elements to prepare you for Section 4 where we discuss how to develop a multi-agent simulation model. Section 5 outlines a simple example of a multi-agent system. Section 6 provides a collection of resources for further studies and finally in Section 7 we will conclude the chapter with a short summary.
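To illustrate the point made above that a simulation model is run rather than solved, here is a minimal stochastic queue in Python whose state trajectory, observable at any point in time, is itself the output; the arrival and service probabilities are arbitrary examples.

    import random

    random.seed(7)
    queue_len, history = 0, []
    for minute in range(480):                    # one simulated 8-hour day
        if random.random() < 0.6:                # arrival probability per minute
            queue_len += 1
        if queue_len and random.random() < 0.5:  # service completion probability
            queue_len -= 1
        history.append(queue_len)                # observe the state at any time

    print("mean queue:", sum(history) / len(history), "peak:", max(history))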
Abstract:
Objectives: In contrast to other countries, surgery still represents the standard invasive treatment for varicose veins in Germany. However, radiofrequency ablation, e.g. ClosureFast, is becoming increasingly popular in other countries owing to potentially better results and reduced side effects. This treatment option may incur lower follow-up costs and is a more convenient procedure for patients, which could justify its introduction into the statutory benefits catalogue. We therefore aim to calculate the budget impact of a general reimbursement of ClosureFast in Germany. Methods: To assess the budget impact of including ClosureFast in the German statutory benefits catalogue, we developed a multi-cohort Markov model and compared the costs of a “World with ClosureFast” with those of a “World without ClosureFast” over a time horizon of five years. To address the uncertainty of the input parameters, we conducted three different types of sensitivity analysis (one-way, scenario, probabilistic). Results: In the base-case scenario, the introduction of the ClosureFast system for the treatment of varicose veins saves about €19.1 million over a time horizon of five years in Germany. However, the results scatter widely in the sensitivity analyses owing to limited evidence for some key input parameters. Conclusions: The results of the budget impact analysis indicate that a general reimbursement of ClosureFast has the potential to be cost-saving in the German Statutory Health Insurance.
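The structure of such a multi-cohort Markov comparison can be sketched as follows (Python): each year a new cohort enters, moves through health states, and the total costs of the two "worlds" are compared over the five-year horizon. The health states, transition probabilities, costs and cohort size below are invented placeholders, not the published model's inputs.

    import numpy as np

    states = ["post_treatment", "recurrence", "well"]
    P = np.array([[0.80, 0.15, 0.05],        # annual transition probabilities
                  [0.00, 0.70, 0.30],
                  [0.00, 0.10, 0.90]])
    annual_cost = {"with":    np.array([300.0, 900.0, 0.0]),
                   "without": np.array([400.0, 1100.0, 0.0])}
    procedure_cost = {"with": 1500.0, "without": 1700.0}
    cohort_size, horizon = 10_000, 5

    def total_cost(world):
        cost, cohorts = 0.0, []
        for year in range(horizon):
            cohorts.append(np.array([cohort_size, 0.0, 0.0]))  # new cohort enters
            cost += cohort_size * procedure_cost[world]
            cohorts = [c @ P for c in cohorts]                 # one-year transition
            cost += sum(c @ annual_cost[world] for c in cohorts)
        return cost

    budget_impact = total_cost("without") - total_cost("with")
    print(f"5-year budget impact (savings if positive): {budget_impact:,.0f} EUR")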
Abstract:
Observing system experiments (OSEs) are carried out over a 1-year period to quantify the impact of Argo observations on the Mercator Ocean 0.25° global ocean analysis and forecasting system. The reference simulation assimilates sea surface temperature (SST), SSALTO/DUACS (Segment Sol multi-missions dALTimetrie, d'orbitographie et de localisation précise / Data unification and Altimeter combination system) altimeter data, and Argo and other in situ observations from the Coriolis data center. Two other simulations are carried out in which all, or half of, the Argo data are withheld. Assimilating Argo observations has a significant impact on analyzed and forecast temperature and salinity fields at different depths. Without Argo data assimilation, large errors occur in the analyzed fields, as estimated from the differences with respect to in situ observations. For example, in the 0–300 m layer, RMS (root mean square) differences between analyzed fields and observations reach 0.25 psu and 1.25 °C in the western boundary currents, and 0.1 psu and 0.75 °C in the open ocean. The impact of the Argo data in reducing observation–forecast differences is also significant from the surface down to a depth of 2000 m: differences between in situ observations and forecast fields are reduced by 20 % in the upper layers and by up to 40 % at a depth of 2000 m when Argo data are assimilated. At depth, the most affected regions in the global ocean are the Mediterranean outflow, the Gulf Stream region and the Labrador Sea. A significant degradation can be observed when only half of the data are assimilated. Argo observations therefore matter for constraining the model solution, even in an eddy-permitting model configuration. The impact of assimilating the Argo floats' data on other model variables is briefly assessed: the improvement of the fit to Argo profiles does not lead globally to unphysical corrections of sea surface temperature and sea surface height. The main conclusion is that the performance of the Mercator Ocean 0.25° global data assimilation system depends heavily on the availability of Argo data.
Abstract:
This paper proposes a new hybrid system for multi-class sentiment analysis based on the General Inquirer (GI) dictionary and a hierarchical arrangement of Logistic Model Tree (LMT) classifiers. The system comprises three layers: the bipolar layer (BL), consisting of one LMT (LMT-1) for sentiment polarity classification, and the intensity layer (IL), comprising two LMTs (LMT-2 and LMT-3) that separately detect three intensities of positive sentiment and three intensities of negative sentiment. Only during the construction phase, a grouping layer (GL) is used to cluster the positive and the negative instances, employing two k-means runs, respectively. In the pre-processing phase, the texts are segmented into words, which are tagged, stemmed and finally looked up in the GI dictionary in order to count and label only the verbs, nouns, adjectives and adverbs with 24 markers, which are then used to compute the feature vectors. In the sentiment classification phase, the feature vectors are first fed to LMT-1, then grouped in the GL according to their class label, these groups are labelled manually, and finally the positive instances are fed to LMT-2 and the negative instances to LMT-3. The three trees are trained and evaluated on the Movie Review and SenTube datasets using stratified 10-fold cross-validation. LMT-1 produces a tree with 48 leaves and a size of 95, with 90.88% accuracy, while LMT-2 and LMT-3 each yield a tree with one leaf and a size of one, with 99.28% and 99.37% accuracy, respectively. The experiments show that the proposed hierarchical classification methodology outperforms other prevailing approaches.
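The hierarchical layering can be sketched as follows (Python, scikit-learn): a first classifier decides polarity, then a polarity-specific classifier assigns an intensity level. LogisticRegression stands in for the Logistic Model Trees, and the tiny corpus and labels are invented, so this only illustrates the polarity-then-intensity routing, not the paper's GI feature extraction.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    texts = ["great superb film", "good movie", "decent enough",
             "weak plot", "bad acting", "awful terrible mess"]
    polarity = [1, 1, 1, 0, 0, 0]              # 1 = positive, 0 = negative
    intensity = [3, 2, 1, 1, 2, 3]             # 1..3 within each polarity

    vec = CountVectorizer().fit(texts)
    X = vec.transform(texts)

    clf_pol = LogisticRegression().fit(X, polarity)   # layer 1 (BL), "LMT-1" role
    pos = [i for i, p in enumerate(polarity) if p == 1]
    neg = [i for i, p in enumerate(polarity) if p == 0]
    clf_pos = LogisticRegression().fit(X[pos], [intensity[i] for i in pos])  # "LMT-2"
    clf_neg = LogisticRegression().fit(X[neg], [intensity[i] for i in neg])  # "LMT-3"

    def classify(text):
        # Route through the polarity layer, then the matching intensity layer.
        x = vec.transform([text])
        if clf_pol.predict(x)[0] == 1:
            return "positive", int(clf_pos.predict(x)[0])
        return "negative", int(clf_neg.predict(x)[0])

    print(classify("superb film"))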
Abstract:
In this contribution, a system identification procedure for a two-input Wiener model, suitable for analysing the disturbance behaviour of integrated nonlinear circuits, is presented. The identified block model comprises two linear dynamic blocks and one static nonlinear block, which are determined using a parameterized approach. To characterize the linear blocks, a correlation analysis using a white-noise input is adopted, in combination with a model reduction scheme. Once the linear blocks have been characterized, a linear set of equations is set up from the output spectrum under single-tone excitation at each input; its solution gives the coefficients of the nonlinear block. With this data-based black-box approach, the distortion behaviour of a nonlinear circuit under the influence of an interfering signal at an arbitrary input port can be determined. Such an interfering signal can be, for example, an electromagnetic interference signal that conductively couples into the port under consideration.
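A single-input simplification of this two-stage identification can be sketched in Python: the linear block is estimated by cross-correlating the output with the white-noise input (by Bussgang's theorem, a static nonlinearity driven by Gaussian noise only scales this cross-correlation), and the polynomial coefficients of the nonlinearity are then obtained by linear least squares. The filter taps and polynomial below are invented examples, and the single-tone spectral step of the paper is replaced by a time-domain fit.

    import numpy as np

    rng = np.random.default_rng(0)
    N, L = 20000, 8
    h_true = np.array([1.0, 0.6, 0.3, 0.1])               # "unknown" linear block
    f = lambda v: v + 0.4 * v**2 + 0.1 * v**3             # "unknown" static NL

    u = rng.standard_normal(N)                            # white-noise excitation
    v = np.convolve(u, h_true)[:N]                        # internal signal
    y = f(v)                                              # measured output

    # Step 1: for white Gaussian noise, E[y(n) u(n-k)] is proportional to h[k].
    h_est = np.array([np.mean(y[k:] * u[:N - k]) for k in range(L)])
    h_est /= h_est[0]                                     # fix the gain ambiguity

    # Step 2: rebuild the internal signal and solve a linear system for the
    # polynomial coefficients of the static nonlinearity.
    v_est = np.convolve(u, h_est)[:N]
    A = np.vstack([v_est, v_est**2, v_est**3]).T
    c, *_ = np.linalg.lstsq(A, y, rcond=None)

    print("h_est:", np.round(h_est[:4], 3))               # ~ [1, 0.6, 0.3, 0.1]
    print("poly coefficients:", np.round(c, 3))           # ~ [1, 0.4, 0.1]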
Abstract:
This work describes preliminary results of a two-modality imaging system aimed at the early detection of breast cancer. The first technique is based on compounding conventional echographic images taken at regular angular intervals around the imaged breast. The other modality obtains tomographic images of propagation velocity using the same circular geometry. For this study, a low-cost prototype was built, based on a pair of opposed 128-element, 3.2 MHz array transducers that are mechanically moved around tissue-mimicking phantoms. Compounding images over 360 degrees provides improved resolution, clutter reduction and artifact suppression, and reinforces the visualization of internal structures. However, refraction at the skin interface must be corrected for an accurate image compounding process; this is achieved by estimating the interface geometry and then computing the internal ray paths. In addition, sound-velocity tomographic images have been obtained from time-of-flight projections. Two reconstruction methods, Filtered Back Projection (FBP) and 2D Ordered Subset Expectation Maximization (2D OSEM), were used as a first step towards tomographic reconstruction. These methods yield usable images in short computational times, which can serve as initial estimates for subsequent, more complex methods of ultrasound image reconstruction. The resulting images may be effective in differentiating malignant from benign masses and are very promising for breast cancer screening.
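The FBP stage can be sketched with scikit-image (Python): projections are simulated here as a Radon transform of a phantom and reconstructed by filtered back projection. The phantom stands in for the tissue slowness map, and the parallel-beam geometry is a simplification of the ring-array, time-of-flight setup described above.

    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, resize

    # Simulate projection data from a phantom (stand-in for time-of-flight
    # slowness projections), then reconstruct with filtered back projection.
    phantom = resize(shepp_logan_phantom(), (128, 128))
    theta = np.linspace(0.0, 180.0, 120, endpoint=False)   # view angles (deg)
    sino = radon(phantom, theta=theta)                     # projection data

    recon = iradon(sino, theta=theta, filter_name="ramp")  # FBP reconstruction
    err = np.sqrt(np.mean((recon - phantom) ** 2))
    print("RMS reconstruction error:", round(float(err), 4))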