921 results for Complex problems


Relevance: 40.00%

Publisher:

Abstract:

We extend extreme learning machine (ELM) classifiers to complex reproducing kernel Hilbert spaces (RKHS) in which the input/output variables as well as the optimization variables are complex-valued. A new family of classifiers, called complex-valued ELM (CELM) and suitable for complex-valued multiple-input multiple-output processing, is introduced. In the proposed method, the associated Lagrangian is computed using induced RKHS kernels, adopting a Wirtinger-calculus approach formulated as a constrained optimization problem, similarly to the conventional ELM classifier formulation. When training the CELM, the Karush-Kuhn-Tucker (KKT) theorem is used to solve the dual optimization problem, which simultaneously satisfies the smallest-training-error and smallest-output-weight-norm criteria. The proposed formulation also addresses aspects of quaternary classification within a Clifford-algebra context. For 2D complex-valued inputs, user-defined complex-coupled hyperplanes divide the classifier input space into four partitions. For 3D complex-valued inputs, the formulation generates three pairs of complex-coupled hyperplanes through orthogonal projections; the six hyperplanes then divide the 3D space into eight partitions. It is shown that the CELM problem formulation is equivalent to solving six real-valued ELM tasks, induced by projecting the chosen complex kernel onto the different user-defined coordinate planes. A classification example of powdered samples on the basis of their terahertz spectral signatures is used to demonstrate the advantages of the CELM classifiers over their SVM counterparts. The proposed classifiers retain the advantages of their ELM counterparts in that they can perform multiclass classification with lower computational complexity than SVM classifiers. Furthermore, because they can perform classification tasks quickly, the proposed formulations are of interest for real-time applications.
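To make the training step concrete, here is a minimal Python sketch of a complex-valued ELM in the spirit described above: a fixed, randomly drawn complex hidden layer followed by a regularised least-squares solve for the output weights. It is only an illustration, not the paper's Wirtinger-calculus/KKT dual formulation or its coupled-hyperplane class encoding; the tanh activation, the ridge parameter and all function names are assumptions made for the sketch.

import numpy as np

def train_celm(X, T, n_hidden=64, reg=1e-3, seed=0):
    """Toy complex-valued ELM: random fixed complex hidden weights and
    ridge-regularised least-squares output weights (illustrative only,
    not the paper's Wirtinger/KKT dual formulation)."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = (rng.standard_normal((n_features, n_hidden))
         + 1j * rng.standard_normal((n_features, n_hidden)))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                      # complex hidden-layer outputs
    # beta = (H^H H + reg I)^{-1} H^H T, where ^H is the conjugate transpose.
    A = H.conj().T @ H + reg * np.eye(n_hidden)
    beta = np.linalg.solve(A, H.conj().T @ T)
    return W, b, beta

def predict_celm(X, W, b, beta):
    """Complex-valued network output for new samples."""
    return np.tanh(X @ W + b) @ beta

For classification, the complex targets T could, for instance, encode class labels as constellation points such as ±1 ± 1j, with predictions decoded to the nearest point; that encoding is an assumption of this sketch rather than the paper's partitioning scheme.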

Relevance: 40.00%

Publisher:

Abstract:

This paper is concerned with an overview of upwinding schemes and further nonlinear applications of a recently introduced high-resolution upwind differencing scheme, namely the ADBQUICKEST scheme [V.G. Ferreira, F.A. Kurokawa, R.A.B. Queiroz, M.K. Kaibara, C.M. Oishi, J.A. Cuminato, A.F. Castelo, M.F. Tomé, S. McKee, Assessment of a high-order finite difference upwind scheme for the simulation of convection-diffusion problems, International Journal for Numerical Methods in Fluids 60 (2009) 1-26]. The ADBQUICKEST scheme is a new TVD version of the QUICKEST scheme [B.P. Leonard, A stable and accurate convective modeling procedure based on quadratic upstream interpolation, Computer Methods in Applied Mechanics and Engineering 19 (1979) 59-98] for solving nonlinear balance laws. The scheme is based on the normalized-variable (NV) and total-variation-diminishing (TVD) formalisms and satisfies a convective boundedness criterion. Its accuracy is compared with that of other popular convective upwinding schemes (see, for example, Roe (1985) [19], Van Leer (1974) [18] and Arora & Roe (1997) [17]) for solving nonlinear conservation laws (for example, the Buckley-Leverett, shallow-water and Euler equations). The ADBQUICKEST scheme is then used to solve six fluid flow problems of increasing complexity: 2D aerosol filtration by fibrous filters; axisymmetric flow in a tubular membrane; 2D two-phase flow in a fluidized bed; the 2D compressible Orszag-Tang MHD vortex; an axisymmetric jet impinging on a flat surface at low Reynolds number; and fully 3D incompressible flows involving moving free surfaces. The numerical simulations indicate that this convective upwinding scheme is a good generic alternative for solving complex fluid dynamics problems.
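The flux-limiting idea behind TVD upwind schemes of this kind can be illustrated with a much simpler relative: a MUSCL-type upwind update with a minmod limiter for the linear advection equation u_t + a u_x = 0 (a > 0, periodic domain). The Python sketch below is not the ADBQUICKEST limiter itself, whose convective boundedness criterion is given in the cited reference; it only shows the generic pattern of limited reconstruction followed by an upwind flux, with illustrative names throughout.

import numpy as np

def minmod(a, b):
    """Minmod slope limiter: zero at extrema, smallest slope otherwise."""
    return np.where(a * b > 0.0,
                    np.sign(a) * np.minimum(np.abs(a), np.abs(b)),
                    0.0)

def advect_tvd(u, a, dx, dt, n_steps):
    """MUSCL upwind scheme with minmod limiting for u_t + a u_x = 0,
    a > 0, periodic boundaries (illustrative stand-in, not ADBQUICKEST)."""
    for _ in range(n_steps):
        slope = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
        u_face = u + 0.5 * slope          # upwind state at interface i+1/2
        flux = a * u_face
        u = u - (dt / dx) * (flux - np.roll(flux, 1))
    return u

Keeping the CFL number a*dt/dx at or below roughly one half keeps this limited update non-oscillatory; replacing the limited slope with an unlimited one reproduces the oscillations near discontinuities that TVD limiting is designed to suppress.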

Relevance: 40.00%

Publisher:

Abstract:

In many areas of industrial manufacturing, such as the automotive industry, digital mock-ups are used so that the development of complex machines can be supported by computer systems as effectively as possible. Motion planning algorithms play an important role here in guaranteeing that these digital prototypes can be assembled without collisions. Over the past decades, sampling-based methods have proven particularly successful for this task. They generate a large number of random poses for the object to be installed or removed and use a collision detection mechanism to check each pose for validity. Collision detection therefore plays an essential role in the design of efficient motion planning algorithms. A difficulty for this class of planners is posed by so-called "narrow passages", which arise wherever the freedom of movement of the objects being planned is severely restricted. In such regions it can be difficult to find a sufficient number of collision-free samples, and more sophisticated techniques may be required to achieve good algorithmic performance.

This thesis is divided into two parts. In the first part we investigate parallel collision detection algorithms. Since we target applications in sampling-based motion planning, we consider a problem setting in which the same two objects are repeatedly tested for collision in a large number of different poses. We implement and compare several methods that use bounding volume hierarchies (BVHs) and hierarchical grids as acceleration structures. All of the described methods were parallelised across multiple CPU cores. In addition, we compare different CUDA kernels for performing BVH-based collision tests on the GPU; besides different ways of distributing the work across the parallel GPU threads, we study the effect of different memory access patterns on the performance of the resulting algorithms. We further present a set of approximate collision tests based on the described methods, which yield an additional performance improvement when a lower test accuracy is tolerable.

In the second part of the thesis we describe a parallel, sampling-based motion planner of our own design for highly complex problems with multiple "narrow passages". The method works in two phases. The basic idea is to deliberately allow small errors in the first planning phase in order to increase planning efficiency, and then to repair the resulting path in a second phase. The planner used in Phase I is based on so-called Expansive Space Trees and is additionally equipped with a push-out operation that resolves minor collisions and thereby increases efficiency in regions of restricted freedom of movement. Optionally, our implementation also allows the use of approximate collision tests, which further reduces the accuracy of the first planning phase but also yields a further performance gain.

The motion paths produced by Phase I may therefore not be completely collision-free. To repair these paths, we designed a novel planning algorithm that plans a new, collision-free motion path locally, restricted to a small neighbourhood around the existing path.

We tested the described algorithm on a class of new, difficult metal puzzles, some of which contain several "narrow passages". To our knowledge, no collection of comparably complex benchmarks is publicly available, nor did we find a description of comparably complex benchmarks in the motion planning literature.
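To make the sampling and collision-checking loop from the first part concrete, the following Python sketch tests many random poses of one moving object against a fixed obstacle in parallel on several CPU cores and keeps the collision-free samples. The sphere-based checker is a deliberately trivial stand-in for the BVH- and grid-based tests discussed above, and every name, constant and parameter here is an illustrative assumption.

import numpy as np
from concurrent.futures import ProcessPoolExecutor

OBSTACLE_CENTER = np.zeros(3)   # stand-in scene: one spherical obstacle
OBSTACLE_RADIUS = 0.3
OBJECT_RADIUS = 0.2

def sample_pose(rng):
    """Random rigid-body pose: a translation plus three Euler angles."""
    return np.concatenate([rng.uniform(-1.0, 1.0, 3),
                           rng.uniform(-np.pi, np.pi, 3)])

def is_collision_free(pose):
    """Trivial stand-in for a BVH- or grid-based collision test:
    both bodies are approximated by spheres."""
    distance = np.linalg.norm(pose[:3] - OBSTACLE_CENTER)
    return distance > OBSTACLE_RADIUS + OBJECT_RADIUS

def find_free_samples(n_samples=1000, n_workers=4, seed=0):
    """Check many poses of the same object pair in parallel and keep
    the collision-free ones, as sampling-based planners do."""
    rng = np.random.default_rng(seed)
    poses = [sample_pose(rng) for _ in range(n_samples)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        free = list(pool.map(is_collision_free, poses))
    return [p for p, ok in zip(poses, free) if ok]

if __name__ == "__main__":      # guard required when using process pools
    print(len(find_free_samples()))

In a real digital mock-up setting the orientation part of each pose would of course feed into the collision test as well; it is sampled here only to show the shape of the data a planner hands to its checker.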

Relevance: 40.00%

Publisher:

Abstract:

After the 10th Iranian presidential election on June 12, 2009, several public opinion polls taken in Iran attracted the attention of policy-makers and journalists around the world because of the political crisis that followed. In this paper I first critically review the polls conducted by WorldPublicOpinion.org (WPO) and the Program on International Policy Attitudes (PIPA) at the University of Maryland. I then review an essay by Steven Kull, which is based on the aforementioned poll results and which, in my opinion, leads to false conclusions concerning Iran's political prospects. I also discuss "An Analysis of Multiple Polls of the Iranian Public," published by WPO-PIPA on February 3, 2010. The paper arrives at the overall conclusion that it is impossible to obtain an accurate picture of political opinion in a society as complicated as Iran's by relying on a single technique of research and analysis, especially when the political and social situation concerned is in a state of constant flux.

Relevance: 40.00%

Publisher:

Abstract:

"A paper presented at the Operations Research Society of America, Nineteenth National Meeting, Chicago, Illinois, May 25-26, 1961."

Relevance: 40.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 42C05.

Relevance: 30.00%

Publisher:

Abstract:

The research presented in this thesis addresses inherent problems in signature-based intrusion detection systems (IDSs) operating in heterogeneous environments. It proposes a solution to the difficulties associated with specifying and detecting multi-step attack scenarios in such environments. The research focuses on two distinct problems: the representation of events derived from heterogeneous sources, and multi-step attack specification and detection.

The first part of the research investigates the application of an event abstraction model to event logs collected from a heterogeneous environment. The event abstraction model comprises a hierarchy of events derived from different log sources such as system audit data, application logs, captured network traffic, and intrusion detection system alerts. Unlike existing event abstraction models, where low-level information may be discarded during the abstraction process, the model presented in this work preserves all low-level information while also providing high-level information in the form of abstract events. The event abstraction model was designed independently of any particular IDS and may therefore be used by any IDS, intrusion forensic tool, or monitoring tool.

The second part of the research investigates the use of unification for multi-step attack scenario specification and detection. Multi-step attack scenarios are hard to specify and detect because they often involve the correlation of events from multiple sources, which may be affected by time uncertainty. The unification algorithm provides a simple and straightforward scenario-matching mechanism through variable instantiation, where variables represent events as defined in the event abstraction model.

The third part of the research addresses time uncertainty. Clock synchronisation is crucial for detecting multi-step attack scenarios that involve logs from multiple hosts, yet issues of time uncertainty have been largely neglected by intrusion detection research. The system presented here introduces two techniques for addressing them: clock skew compensation and clock drift modelling using linear regression.

An off-line IDS prototype for detecting multi-step attacks has been implemented. The prototype comprises two modules: an implementation of the abstract event system architecture (AESA) and the scenario detection module. The scenario detection module implements our signature language, developed on the basis of the Python programming language syntax, together with the unification-based scenario detection engine. The prototype has been evaluated using a publicly available dataset of real attack traffic and event logs and a synthetic dataset. The distinguishing feature of the public dataset is that it contains multi-step attacks involving multiple hosts with clock skew and clock drift, which allows us to demonstrate the application and advantages of the contributions of this research. All instances of multi-step attacks in the dataset were correctly identified even though significant clock skew and drift are present.

Future work identified by this research includes developing a refined unification algorithm suitable for processing streams of events, to enable on-line detection, and, with respect to time uncertainty, developing mechanisms that allow automatic clock skew and clock drift identification and correction. The immediate application of the research presented in this thesis is the framework of an off-line IDS that processes events from heterogeneous sources using abstraction and that can detect multi-step attack scenarios potentially involving time uncertainty.
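As a minimal illustration of the clock drift modelling step mentioned above, the Python sketch below fits a linear model of a host clock's error against a reference clock by least squares and uses it to map host timestamps onto the reference timeline. It assumes that pairs of corresponding host and reference timestamps are available (for example from events observed on both sides); the function names are illustrative and this is not the thesis' exact procedure.

import numpy as np

def fit_clock_error(host_times, reference_times):
    """Least-squares fit of error(t) ~ constant + rate * t, where
    error = host clock reading minus reference clock reading."""
    host_times = np.asarray(host_times, dtype=float)
    errors = host_times - np.asarray(reference_times, dtype=float)
    rate, constant = np.polyfit(host_times, errors, deg=1)
    return constant, rate

def to_reference_time(t_host, constant, rate):
    """Correct a host timestamp onto the reference timeline."""
    return t_host - (constant + rate * t_host)

Applying such a correction to every event timestamp before scenario matching is one simple way to reduce the impact of clock skew and drift when correlating logs from multiple hosts.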

Relevance: 30.00%

Publisher:

Abstract:

Engineering assets such as roads, rail, bridges and other forms of public works are vital to the effective functioning of societies (Herder, 2006). Proficient provision of this physical infrastructure is therefore one of the key activities of government (Lædre, 2006). In order to ensure engineering assets are procured and maintained on behalf of citizens, government needs to devise the appropriate policy and institutional architecture for this purpose. The changing institutional arrangements around the procurement of engineering assets are the focus of this paper. The paper describes and analyses the transition to new, more collaborative forms of procurement arrangements which are becoming increasingly prevalent in Australia and other OECD countries. Such fundamental shifts from competitive to more collaborative approaches to project governance can be viewed as a major transition in procurement system arrangements. In many ways such changes mirror the shift from New Public Management, with its emphasis on the use of market mechanisms to achieve efficiencies (Hood, 1991), towards more collaborative approaches to service delivery, such as those under network governance arrangements (Keast, 2007). However, just as traditional forms of procurement in a market context resulted in unexpected outcomes for industry, such as a fragmented industry afflicted by chronic litigation (Dubois, 2002), the change to more collaborative forms of procurement is unlikely to be a panacea to the problems of procurement, and may well also have unintended consequences. This paper argues that perspectives from complex adaptive systems (CAS) theory can contribute to the theory and practice of managing system transitions. In particular the concept of emergence provides a key theoretical construct to understand the aggregate effect that individual project governance arrangements can have upon the structure of specific industries, which in turn impact individual projects. Emergence is understood here as the macro structure that emerges out of the interaction of agents in the system (Holland, 1998; Tang, 2006).

Relevance: 30.00%

Publisher:

Abstract:

Construction projects face a challenge that must not be underestimated. These projects are increasingly competitive, more complex, and difficult to manage; they become ‘wicked problems’, which are difficult to solve using traditional approaches. Soft Systems Methodology (SSM) is a systems approach used for analysis and problem solving in such complex and messy situations. SSM uses “systems thinking” in a cycle of action research, learning and reflection to help understand the various perceptions that exist in the minds of the different people involved in the situation. This paper examines the benefits of applying SSM to wicked problems in construction project management, especially situations that are challenging to understand and difficult to act upon. It includes relevant examples of its use in dealing with confusing situations that involve human, organizational and technical aspects.

Relevance: 30.00%

Publisher:

Abstract:

Homelessness is a complex problem that manifests in all societies. This intractable and ‘wicked’ issue resists single-agency solutions, and its resolution requires a large, ongoing investment of financial and professional resources that few organisations can sustain. This paper adopts a social innovation framework to examine government and community sector responses to homelessness. While recent evaluations and policy prescriptions have suggested better-integrated and more co-ordinated service delivery models for addressing homelessness, there is little understanding of the innovation framework within which alternative service system paradigms emerge. A framework that identifies and explains different levels of innovation is put forward. The framework highlights that while government may lead strategic-level innovations, community organisations are active in developing innovation at the service and client level. Moreover, community organisations may be unaware of the innovative capacity that resides in their creative responses to resolving the social crisis and marginalisation of being without shelter.