980 results for Real Options Theory
Abstract:
The purpose of my research was to examine how community-based organizations in the Niagara region provide programs for children with Autism Spectrum Disorder (ASD) who are considered to represent “extreme” or “severe” cases. A qualitative, comparative case study was conducted of three organizations that provide summer recreation and activity programs, in order to examine the issues these organizations face when determining program structure and staff training, to understand what the threshold for physical activity is in this type of setting, and to understand how the unique needs surrounding these “severe” cases are met while the children attend the program. Purposeful sampling was employed to select a supervisor and senior staff member from each organization to discuss the training process, program development and implementation, and the resources and strategies used within their organization’s community-based program. A confirming comparative analysis of a parent survey with six mothers whose children are considered “severe” indicated that camp staffs’ expectations are unrealistic, whereas the parents and supervisors have more realistic expectations within the “real world” of camp. There is no definition of “severe” or “extreme”, and severity is therefore dependent upon the context.
Abstract:
Classical relational databases lack proper ways to manage certain real-world situations, including imprecise or uncertain data. Fuzzy databases overcome this limitation by allowing each entry in a table to be a fuzzy set, where each element of the corresponding domain is assigned a membership degree from the real interval [0…1]. But this fuzzy mechanism becomes inappropriate for modelling scenarios where data might be incomparable. Therefore, we become interested in a further generalization of the fuzzy database into the L-fuzzy database. In such a database, the characteristic function of a fuzzy set maps into an arbitrary complete Brouwerian lattice L. From the query language perspective, the fuzzy database language FSQL extends the regular Structured Query Language (SQL) by adding fuzzy-specific constructions. In addition, the L-fuzzy query language LFSQL introduces appropriate linguistic operations to define and manipulate inexact data in an L-fuzzy database. This research mainly focuses on defining the semantics of LFSQL. However, this requires an abstract algebraic theory which can be used to prove all the properties of, and operations on, L-fuzzy relations. In our study, we show that the theory of arrow categories forms a suitable framework for that. Therefore, we define the semantics of LFSQL in the abstract notion of an arrow category. In addition, we implement the operations of L-fuzzy relations in Haskell and develop a parser that translates algebraic expressions into our implementation.
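To make the membership idea concrete, the sketch below (in Python, purely illustrative; the thesis's actual implementation is in Haskell) represents an L-fuzzy relation as a map from pairs to elements of a small four-element "diamond" lattice with two incomparable degrees, and composes two such relations with the usual join-over-meets rule. The lattice, domains, and relations are hypothetical examples.

```python
# Illustrative sketch (not the thesis's Haskell code): an L-fuzzy relation
# valued in a small "diamond" lattice {bot, a, b, top}, where a and b are
# incomparable -- exactly the situation plain [0, 1] memberships cannot model.

# Join (least upper bound) and meet (greatest lower bound) tables.
JOIN = {
    ("bot", "bot"): "bot", ("bot", "a"): "a", ("bot", "b"): "b", ("bot", "top"): "top",
    ("a", "a"): "a", ("a", "b"): "top", ("a", "top"): "top",
    ("b", "b"): "b", ("b", "top"): "top",
    ("top", "top"): "top",
}
MEET = {
    ("bot", "bot"): "bot", ("bot", "a"): "bot", ("bot", "b"): "bot", ("bot", "top"): "bot",
    ("a", "a"): "a", ("a", "b"): "bot", ("a", "top"): "a",
    ("b", "b"): "b", ("b", "top"): "b",
    ("top", "top"): "top",
}

def join(x, y):
    return JOIN.get((x, y)) or JOIN[(y, x)]

def meet(x, y):
    return MEET.get((x, y)) or MEET[(y, x)]

def compose(R, S, A, B, C):
    """Relational composition (R;S)(x,z) = join over y of meet(R(x,y), S(y,z))."""
    out = {}
    for x in A:
        for z in C:
            acc = "bot"
            for y in B:
                acc = join(acc, meet(R.get((x, y), "bot"), S.get((y, z), "bot")))
            out[(x, z)] = acc
    return out

# Hypothetical relations on toy domains.
people, skills, tasks = ["ann"], ["sql", "ml"], ["report"]
knows = {("ann", "sql"): "a", ("ann", "ml"): "b"}           # incomparable degrees
needs = {("sql", "report"): "top", ("ml", "report"): "a"}
print(compose(knows, needs, people, skills, tasks))          # {('ann', 'report'): 'a'}
```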
Abstract:
Recent empirical evidence from vector autoregressions (VARs) suggests that public spending shocks increase (crowd in) private consumption. Standard general equilibrium models predict the opposite. We show that a standard real business cycle (RBC) model in which public spending is chosen optimally can rationalize the crowding-in effect documented in the VAR literature. When such a model is used as a data-generating process, a VAR estimated using the artificial data yields a positive consumption response to an increase in public spending, consistent with the empirical findings. This result holds regardless of whether private and public purchases are complements or substitutes.
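As a rough illustration of the exercise described (and only that: the data-generating process below is a hypothetical bivariate autoregression, not the paper's RBC model), one can simulate artificial spending and consumption series, fit a VAR on them with statsmodels, and read off the consumption response to a spending shock.

```python
# Sketch of the generic exercise, not the paper's model: simulate artificial
# (spending, consumption) data from a hypothetical bivariate process, fit a VAR,
# and inspect the consumption response to a spending shock.
import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
T = 500
g = np.zeros(T)   # public spending (log deviation)
c = np.zeros(T)   # private consumption (log deviation)
for t in range(1, T):
    g[t] = 0.9 * g[t - 1] + rng.normal(scale=0.01)
    # Hypothetical "crowding-in": consumption responds positively to spending.
    c[t] = 0.8 * c[t - 1] + 0.3 * g[t] + rng.normal(scale=0.005)

data = np.column_stack([g, c])
res = VAR(data).fit(maxlags=4)
irf = res.irf(10)                 # impulse responses over 10 periods
# Response of variable 1 (consumption) to a shock in variable 0 (spending):
print(irf.irfs[:, 1, 0])
```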
Abstract:
Designing is a heterogeneous, fuzzily defined, floating field of various activities and chunks of ideas and knowledge. Available theories about the foundations of designing, as presented in "the basic PARADOX" (Jonas and Meyer-Veden 2004), have evoked the impression of Babylonian confusion. We located the reasons for this "mess" in the "non-fit", that is, the problematic relation between theories and their subject field. There seems to be an interface problem in theory-building comparable to the one in designing itself. "Complexity" sounds promising, but turns out to be a problematic and not really helpful concept. I will argue instead for a more precise application of systemic and evolutionary concepts, which - in my view - are able to model the underlying generative structures and processes that produce the visible phenomenon of complexity. It does not make sense to introduce a new fashionable meta-concept and to hope for a panacea before having clarified the more basic and still equally problematic older meta-concepts. This paper takes one step away from "theories of what" towards practice and doing, and looks more closely at existing process models, or "theories of how" to design, instead. Doing this from a systemic perspective leads to an evolutionary view of the process, which finally allows us to specify more clearly the "knowledge gaps" inherent in the design process. This aspect has to be taken into account as constitutive of any attempt at theory-building in design, which can be characterized as a "practice of not-knowing". I conclude that comprehensive "unified" theories, methods, or process models run aground on the identified knowledge gaps, which allow neither reliable models of the present nor reliable projections into the future. Consolation may be found in shifting from the effort of adaptation towards strategies of exaptation, that is, the development of stocks of alternatives for coping with unpredictable situations in the future.
Abstract:
The increasing interconnection of information and communication systems leads to a further increase in complexity and thus also to a further growth in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long since ceased to provide adequate protection against intrusion attempts into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyse information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and in the development of an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the large volume of incoming network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of the normal network state (NNB), and an update model. Within OptiFilter, tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analysed and converted into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. Different approaches are proposed and discussed in this dissertation: a classification-confidence margin threshold is defined to uncover unknown malicious connections, the stability of the growing topology is increased through novel approaches for initializing the weight vectors and through the strengthening of the winner neurons, and a self-adaptive procedure is introduced to keep the model continuously up to date. Furthermore, the main task of the NNB model is the further examination of the unknown connections detected by the EGHSOM and the verification of whether they are normal. However, owing to the concept-drift phenomenon the network traffic data change constantly, which leads to the generation of non-stationary network data in real time. This phenomenon is handled by the update model. The EGHSOM model can detect new anomalies effectively, and the NNB model adapts optimally to changes in the network data. In the experimental evaluation, the framework showed promising results. In the first experiment the framework was evaluated in offline mode: OptiFilter was assessed with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy.
In the second experiment, the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely. A comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all other approaches. This can be attributed to the following key points: processing of the collected network data, achievement of the best performance (such as overall accuracy), detection of unknown connections, and development of a real-time intrusion detection model.
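A minimal sketch of the classification-confidence margin idea follows (generic numpy code assuming an already trained map with labelled neurons; this is not the dissertation's EGHSOM implementation): the winner's label is only trusted when the margin between the best and second-best matching neurons is large enough, otherwise the connection is flagged as unknown and handed over for further inspection.

```python
# Minimal sketch of a classification-confidence margin threshold, assuming a
# generic trained SOM (weight matrix W, one row per neuron, each neuron
# labelled normal/attack). Hypothetical data and threshold throughout.
import numpy as np

def classify_with_margin(x, W, labels, margin_threshold=0.1):
    """Return the winner's label, or 'unknown' if the confidence margin
    between the best and second-best matching neurons is too small."""
    d = np.linalg.norm(W - x, axis=1)            # distance to every neuron
    order = np.argsort(d)
    best, second = d[order[0]], d[order[1]]
    margin = (second - best) / (second + 1e-12)  # relative confidence margin
    if margin < margin_threshold:
        return "unknown"                         # hand over to the NNB model
    return labels[order[0]]

# Hypothetical toy usage: 3 neurons in a 4-dimensional connection-vector space.
W = np.array([[0.0, 0.0, 0.0, 0.0],
              [1.0, 1.0, 1.0, 1.0],
              [0.0, 1.0, 0.0, 1.0]])
labels = ["normal", "attack", "attack"]
print(classify_with_margin(np.array([0.1, 0.0, 0.1, 0.0]), W, labels))  # normal
print(classify_with_margin(np.array([0.5, 0.5, 0.5, 0.5]), W, labels))  # unknown
```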
Abstract:
The principal objective of this paper is to identify the relationship between the results of the Canadian policies implemented to protect female workers against the impact of globalization on the garment industry and the institutional setting in which this labour market is immersed in Winnipeg. The paper begins with a brief summary of the institutional theory approach, which sheds light on the analysis of the effects of institutions on the policy options to protect female workers of the Winnipeg garment industry. Next, it identifies the set of beliefs, formal procedures, routines, norms and conventions that characterize the institutional environment of the female workers of Winnipeg’s garment industry. Subsequently, it describes the impact of free trade policies on the garment industry of Winnipeg. Afterward, it presents an analysis of the barriers that the institutional features of the garment sector in Winnipeg can set against the successful achievement of policy options addressed to protect the female workforce of this sector. Three policy options are considered: ethical purchasing; training/retraining programs and social engagement support for garment workers; and protection of migrant workers through promoting and facilitating bonds between Canada’s trade unions and trade unions of the labour-sending countries. Finally, the paper concludes that the formation of isolated cultural groups inside factories; the belief that there is gender and race discrimination on the part of the garment industry management against workers; the powerless social conditions of immigrant women; the economic rationality of garment factories’ managers; and the lack of political will on the part of Canada and the labour-sending countries to set effective bilateral agreements to protect migrant workers are the principal barriers that divide the actors involved in the garment industry in Winnipeg. This division among the principal actors of Winnipeg’s garment industry impedes the change toward more efficient institutions and, hence, the successful achievement of policy options addressed to protect women workers.
Abstract:
Quantified real constraints (QRC) form a mathematical formalism used to model a large number of physical problems involving systems of non-linear equations over real variables, some of which may be quantified. QRCs appear in numerous contexts, such as control engineering or biology. Solving QRCs is a very active research area for which two different approaches have been proposed: symbolic quantifier elimination and approximation methods. Nevertheless, solving large-scale problems and the general case remain open problems. This thesis proposes a new approximation methodology based on Modal Interval Analysis, a mathematical theory that makes it possible to solve problems involving logical quantifiers over real variables. Finally, two applications to control engineering are presented: the first concerns the fault detection problem, and the second consists of a controller for a sailing boat.
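As a purely illustrative sketch of how interval evaluation can certify a quantified constraint (classical interval arithmetic only; the Modal Interval Analysis machinery used in the thesis, which handles mixed forall/exists quantifiers, is not reproduced here), consider checking that a hypothetical constraint f(x) = x*x - 4 is non-positive for every x in [-1.5, 1.5]:

```python
# Illustrative sketch only: classical interval arithmetic used to certify
# "for all x in X: f(x) <= 0" via an interval enclosure of f.

def imul(a, b):        # interval multiplication
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

def isub_const(a, c):  # subtract a constant from an interval
    return (a[0] - c, a[1] - c)

def forall_leq_zero(f, X):
    """Sufficient test for 'for all x in X: f(x) <= 0'; it may fail to certify
    true statements because interval evaluation over-estimates the range."""
    lo, hi = f(X)
    return hi <= 0.0

# Hypothetical constraint: for all x in [-1.5, 1.5], x*x - 4 <= 0.
f = lambda X: isub_const(imul(X, X), 4.0)
print(forall_leq_zero(f, (-1.5, 1.5)))   # True: the enclosure is [-6.25, -1.75]
```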
Abstract:
Formal and analytical models that contractors can use to assess and price project risk at the tender stage have proliferated in recent years. However, they are rarely used in practice. Introducing more models would, therefore, not necessarily help. A better understanding is needed of how contractors arrive at a bid price in practice, and how, and in what circumstances, risk apportionment actually influences pricing levels. More than 60 proposed risk models for contractors that are published in journals were examined and classified. Then exploratory interviews with five UK contractors and documentary analyses on how contractors price work generally and risk specifically were carried out to help in comparing the propositions from the literature to what contractors actually do. No comprehensive literature on the real bidding processes used in practice was found, and there is no evidence that pricing is systematic. Hence, systematic risk and pricing models for contractors may have no justifiable basis. Contractors process their bids through certain tendering gateways. They acknowledge the risk that they should price. However, the final settlement depends on a set of complex, micro-economic factors. Hence, risk accountability may be smaller than its true cost to the contractor. Risk apportionment occurs at three stages of the whole bid-pricing process. However, analytical approaches tend not to incorporate this, although they could.
Abstract:
In a previous paper (J. of Differential Equations, Vol. 249 (2010), 3081-3098) we examined a family of periodic Sturm-Liouville problems with boundary and interior singularities which are highly non-self-adjoint but have only real eigenvalues. We now establish Schatten class properties of the associated resolvent operator.
Abstract:
Another Proof of the Preceding Theory was produced as part of a residency run by Artists in Archeology in conjunction with the Stonehenge Riverside project. The film explores the relationship between science, work and ritual, imagining archaeology as a future cult. As two robed disciples stray off from the dig, they are drawn to the drone of the stones and proceed to play the henge like a gigantic theremin. Just as a theremin is played with the hand interfering in an electric circuit and producing sound without contact, so the stones respond to the choreographed bodily proximity. Finally, one of the two continues alone to the avenue at Avebury, where the magnetic pull of the stones reaches its climax. Shot on VHS, the film features a score by Zuzushi Monkey, with percussion and theremin sounds mirroring the action. The performers are mostly artists and archaeologists from the art and archaeology teams. The archaeologists were encouraged to perform their normal work in the robes, in an attempt to explore the meeting points of science and ritual and to interrogate our relationship to an ultimately unknowable prehistoric past in which activities we do not understand are relegated to the realm of religion. Stonehenge has unique acoustic properties: its large sarsen stones are finely worked on the inside and left rough on the outside, intensifying sound waves within the inner horseshoe. But since their real use, the monument having been built over centuries, remains ambiguous, the film proposes that our attempts to decode the stones may themselves become encoded in their cumulative meaning for future researchers.
Abstract:
An efficient numerical self-consistent field theory (SCFT) algorithm is developed for treating structured polymers on spherical surfaces. The method solves the diffusion equations of SCFT with a pseudospectral approach that combines a spherical-harmonics expansion for the angular coordinates with a modified real-space Crank–Nicolson method for the radial direction. The self-consistent field equations are solved with Anderson-mixing iterations using dynamical parameters and an alignment procedure to prevent angular drift of the solution. A demonstration of the algorithm is provided for thin films of diblock copolymer grafted to the surface of a spherical core, in which the sequence of equilibrium morphologies is predicted as a function of diblock composition. The study reveals an array of interesting behaviors as the block copolymer pattern is forced to adapt to the finite surface area of the sphere.
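As a rough illustration of the radial half of such a scheme (a generic sketch under simplified assumptions, not the algorithm of the paper: a single spherical-harmonic mode l, a uniform radial grid, crude reflecting boundaries, and the SCFT field term omitted), one Crank–Nicolson step can be written as follows:

```python
# Minimal sketch: one Crank-Nicolson step of the radial diffusion-type equation
# dq/ds = q'' + (2/r) q' - l(l+1)/r^2 q for a single spherical-harmonic mode l.
import numpy as np

def radial_operator(r, l):
    """Dense finite-difference matrix for q'' + (2/r) q' - l(l+1)/r^2 q."""
    n, dr = len(r), r[1] - r[0]
    L = np.zeros((n, n))
    for i in range(1, n - 1):
        L[i, i - 1] = 1.0 / dr**2 - 1.0 / (r[i] * dr)
        L[i, i]     = -2.0 / dr**2 - l * (l + 1) / r[i]**2
        L[i, i + 1] = 1.0 / dr**2 + 1.0 / (r[i] * dr)
    # Crude reflecting (zero-gradient) boundaries via ghost points.
    L[0, 0] = -2.0 / dr**2 - l * (l + 1) / r[0]**2
    L[0, 1] = 2.0 / dr**2
    L[-1, -1] = -2.0 / dr**2 - l * (l + 1) / r[-1]**2
    L[-1, -2] = 2.0 / dr**2
    return L

def crank_nicolson_step(q, L, ds):
    """Solve (I - ds/2 L) q_new = (I + ds/2 L) q_old for one step in s."""
    I = np.eye(len(q))
    return np.linalg.solve(I - 0.5 * ds * L, (I + 0.5 * ds * L) @ q)

# Hypothetical thin spherical shell outside a core of radius 1 (arbitrary units).
r = np.linspace(1.0, 1.2, 41)
q = np.ones_like(r)                                  # initial propagator q(r, s=0)
q = crank_nicolson_step(q, radial_operator(r, l=2), ds=0.01)
print(q[:5])
```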
Abstract:
This paper sets out the findings of a group of research and development projects carried out at the Department of Real Estate & Planning at the University of Reading and at Oxford Property Systems over the period 1999–2003. The projects had several aims: to identify the fundamental drivers of the pricing of different lease terms in the UK property sector; to identify current and best market practice and uncover the main variations in lease terms; to identify key issues in pricing lease terms; and to develop a model for the pricing of rent under a variety of lease variations. From the landlord’s perspective, the main factors driving the required ‘compensation’ for a lease term amendment include expected rental volatility, the expected probability of tenant vacation, and the expected costs of tenant vacation. These data are used in conjunction with simulation technology to reflect the options inherent in certain lease types and to explore the required rent adjustment. The resulting cash flows have interesting qualities which illustrate the potential importance of option pricing in a non-complex and practical way.
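The kind of simulation exercise referred to can be sketched as follows (a deliberately simplified illustration with hypothetical parameters and an ad hoc break-exercise rule, not the model developed in the projects): Monte Carlo paths of open-market rent are used to compare the landlord's cash flows with and without a tenant break option and to back out an indicative rent premium.

```python
# Hedged sketch only: compare a plain 10-year lease with a lease containing a
# hypothetical tenant break option at year 5, over simulated market-rent paths.
import numpy as np

rng = np.random.default_rng(1)
n_paths, years = 10_000, 10
rent0, vol, drift = 100.0, 0.15, 0.01          # hypothetical parameters
void_cost = 50.0                               # cost if the tenant vacates at the break

# Geometric Brownian-style paths of open-market rent.
shocks = rng.normal(drift - 0.5 * vol**2, vol, size=(n_paths, years))
market_rent = rent0 * np.exp(np.cumsum(shocks, axis=1))

# Plain lease: fixed contracted rent for 10 years.
plain = rent0 * years

# Break option at year 5: tenant vacates if market rent has fallen well below
# the contracted rent (simplified exercise rule).
exercise = market_rent[:, 4] < 0.9 * rent0
with_break = np.where(
    exercise,
    rent0 * 5 + market_rent[:, 4] * 5 - void_cost,   # relet at the lower market rent
    rent0 * years,
)
premium = (plain - with_break.mean()) / years
print(f"indicative rent premium for granting the break: {premium:.2f} per year")
```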
Abstract:
For over twenty years researchers have been recommending that investors diversify their portfolios by adding direct real estate. Based on the tenets of modern portfolio theory (MPT), investors are told that the primary reason they should include direct real estate is that they will enjoy decreased volatility (risk) through increased diversification. However, the MPT methodology hides where this reduction in risk originates. To overcome this deficiency we use a four-quadrant approach to break down the co-movement between direct real estate and equities and bonds into negative and positive periods. Then, using data for the last 25 years, we show that for about 70% of the time a holding in direct real estate would have hurt portfolio returns, i.e. when the other assets showed positive performance. In other words, for only about 30% of the time would a holding in direct real estate lead to improvements in portfolio returns; however, this increase in performance occurs when the alternative asset showed negative returns. In addition, adding direct real estate always leads to reductions in portfolio risk, especially on the downside. In other words, although adding direct real estate helps the investor to avoid large losses, it also reduces the potential for large gains. Thus, if the goal of the investor is offsetting losses, then the results show that direct real estate would have been of some benefit. So, in answer to the question of when direct real estate improves portfolio performance, the answer is: on the downside, i.e. when it is most needed.
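The four-quadrant breakdown can be illustrated with a short sketch (synthetic returns and hypothetical parameters only, not the paper's 25-year data): each period is classified by the sign of the alternative asset's return and by whether adding direct real estate would have raised or lowered the period return.

```python
# Illustrative four-quadrant breakdown with synthetic returns.
import numpy as np

rng = np.random.default_rng(2)
n = 100                                   # hypothetical quarterly periods
equities = rng.normal(0.02, 0.08, n)      # synthetic equity returns
real_estate = 0.3 * equities + rng.normal(0.015, 0.03, n)  # synthetic RE returns

alt_up = equities > 0
re_helps = real_estate > equities         # adding RE would raise the period return

quadrants = {
    "alt up, RE helps":   np.mean(alt_up & re_helps),
    "alt up, RE hurts":   np.mean(alt_up & ~re_helps),
    "alt down, RE helps": np.mean(~alt_up & re_helps),
    "alt down, RE hurts": np.mean(~alt_up & ~re_helps),
}
for name, share in quadrants.items():
    print(f"{name}: {share:.0%} of periods")
```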