923 results for Real Options Theory


Relevance: 30.00%

Abstract:

The Great Barrier Reef Marine Park, an area almost the size of Japan, has a new network of no-take areas that significantly improves the protection of biodiversity. The new marine park zoning implements, in a quantitative manner, many of the theoretical design principles discussed in the literature. For example, the new network of no-take areas has at least 20% protection per bioregion, minimum levels of protection for all known habitats and special or unique features, and minimum sizes for no-take areas of at least 10 or 20 km across at the smallest dimension. Overall, more than 33% of the Great Barrier Reef Marine Park is now in no-take areas (previously 4.5%). The steps taken leading to this outcome were to clarify to the interested public why the existing level of protection was inadequate; detail the conservation objectives of establishing new no-take areas; work with relevant and independent experts to define, and contribute to, the best scientific process to deliver on the objectives; describe the biodiversity (e.g., map bioregions); define operational principles needed to achieve the objectives; invite community input on all of the above; gather and layer the data gathered in round-table discussions; report the degree of achievement of principles for various options of no-take areas; and determine how to address negative impacts. Some of the key success factors in this case have global relevance and include focusing initial communication on the problem to be addressed; applying the precautionary principle; using independent experts; facilitating input to decision making; conducting extensive and participatory consultation; having an existing marine park that encompassed much of the ecosystem; having legislative power under federal law; developing high-level support; ensuring agency priority and ownership; and being able to address the issue of displaced fishers.

Relevance: 30.00%

Abstract:

Prior research demonstrates that understanding theory of mind (ToM) is seriously and similarly delayed in late-signing deaf children and children with autism. Are these children simply delayed in timing relative to typical children, or do they demonstrate different patterns of development? The current research addressed this question by testing 145 children (ranging from 3 to 13 years) with deafness, autism, or typical development using a ToM scale. Results indicate that all groups followed the same sequence of steps, up to a point, but that children with autism showed an importantly different sequence of understandings (in the later steps of the progression) relative to all other groups.

Relevance: 30.00%

Abstract:

Real-time control programs are often used in contexts where (conceptually) they run forever. Repetitions within such programs (or their specifications) may either (i) be guaranteed to terminate, (ii) be guaranteed to never terminate (loop forever), or (iii) may possibly terminate. In dealing with real-time programs and their specifications, we need to be able to represent these possibilities, and define suitable refinement orderings. A refinement ordering based on Dijkstra's weakest precondition only copes with the first alternative. Weakest liberal preconditions allow one to constrain behaviour provided the program terminates, which copes with the third alternative to some extent. However, neither of these handles the case when a program does not terminate. To handle this case a refinement ordering based on relational semantics can be used. In this paper we explore these issues and the definition of loops for real-time programs as well as corresponding refinement laws.
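The three repetition classes can be made concrete with a minimal sketch (the sensor and control interfaces here are illustrative, not from the paper):

```python
def terminating_loop(setpoint, readings):
    """(i) Guaranteed to terminate: iterates over a finite log of readings."""
    total_error = 0.0
    for r in readings:                      # finite input, so always terminates
        total_error += setpoint - r
    return total_error

def nonterminating_loop(control_step):
    """(ii) Guaranteed never to terminate: the idealised control loop."""
    while True:                             # loops forever by design
        control_step()

def possibly_terminating_loop(sensor, tolerance):
    """(iii) May terminate: stops only if the environment ever settles."""
    while True:
        error = sensor()
        if abs(error) < tolerance:          # may or may not ever hold
            return error
```

A weakest-precondition ordering can only constrain case (i); weakest liberal preconditions also say something about (iii) when it happens to terminate; only a relational semantics can specify the behaviour of case (ii).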

Relevance: 30.00%

Abstract:

In this thesis work we develop a new generative model of social networks belonging to the family of Time Varying Networks. Correctly modelling the mechanisms that shape the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign, or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality, time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted the switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modeling tool belonging to the family of Time-Varying Networks) to develop a general dynamical model that encodes two fundamental mechanisms shaping social networks' topology and temporal structure: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random, but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We then empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model, and solve it analytically. We check the results against numerical simulations and test our predictions with real-world datasets, finding good agreement between the two.
Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying Networks. This model is inspired by Kauffman's adjacent possible theory and is based on a generalized version of Pólya's urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
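The social capital allocation mechanism can be sketched as activity-driven dynamics with a simple memory kernel. The reinforcement probability c / (c + k) and all parameter names below are our assumptions for illustration, not the thesis's exact specification; burstiness would additionally require heavy-tailed inter-activation times, omitted here for brevity:

```python
import random

def activity_driven_memory(n_nodes, activities, steps, m=2, c=1.0, seed=0):
    """Sketch of an activity-driven network with memory: when node i becomes
    active it contacts a brand-new node with probability c / (c + k_i), where
    k_i is the number of distinct nodes it has contacted before, and
    otherwise reinforces an existing tie (social capital allocation)."""
    rng = random.Random(seed)
    contacts = {i: set() for i in range(n_nodes)}   # each node's known ties
    timeline = []                                   # edges active per step
    for _ in range(steps):
        active_edges = []
        for i in range(n_nodes):
            if rng.random() < activities[i]:        # node i fires this step
                for _ in range(m):
                    k = len(contacts[i])
                    if k == 0 or rng.random() < c / (c + k):
                        j = rng.randrange(n_nodes)          # explore a new tie
                    else:
                        j = rng.choice(sorted(contacts[i])) # reinforce old tie
                    if j != i:
                        contacts[i].add(j)
                        contacts[j].add(i)
                        active_edges.append((i, j))
        timeline.append(active_edges)
    return contacts, timeline
```

Running the model yields both the time-aggregated contact network and the per-step edge activations, the two views a time-varying representation keeps distinct.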

Relevance: 30.00%

Abstract:

Inference and optimization of real-valued edge variables in sparse graphs are studied using the Bethe approximation and replica method of statistical physics. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained in various cases. When applied to the representative problem of network resource allocation, efficient distributed algorithms are also devised. Scaling properties with respect to the network connectivity and the resource availability are found, and links to probabilistic Bayesian approximation methods are established. Different cost measures are considered and algorithmic solutions in the various cases are devised and examined numerically. Simulation results are in full agreement with the theory. © 2007 The American Physical Society.
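A toy version of distributed network resource allocation can be sketched as follows. This is our illustration, not the paper's Bethe/replica algorithm: a quadratic-cost special case, solved by dual ("price") ascent in which every node updates its price from local balance information only:

```python
def allocate_resources(edges, surplus, eta=0.1, iters=3000):
    """Minimise (1/2) * sum of y_ij**2 over edge flows y_ij (flow from i to j),
    subject to node balance: surplus_i - outflow_i + inflow_i = 0.
    surplus must sum to zero for the problem to be feasible."""
    n = len(surplus)
    mu = [0.0] * n                                  # node prices (duals)
    y, residual = {}, list(surplus)
    for _ in range(iters):
        # optimal flow given prices: driven by the price difference
        y = {(i, j): mu[i] - mu[j] for (i, j) in edges}
        residual = list(surplus)
        for (i, j), f in y.items():
            residual[i] -= f                        # flow leaving i
            residual[j] += f                        # flow entering j
        for i in range(n):
            mu[i] += eta * residual[i]              # local price update
    return y, residual
```

On a connected graph with balanced surpluses the residuals converge to zero, so the flows satisfy every node's constraint while minimising the quadratic cost.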

Relevance: 30.00%

Abstract:

It is generally assumed when using Bayesian inference methods for neural networks that the input data contains no noise. For real-world (errors-in-variables) problems this is clearly an unsafe assumption. This paper presents a Bayesian neural network framework which accounts for input noise provided that a model of the noise process exists. In the limit where the noise process is small and symmetric it is shown, using the Laplace approximation, that this method adds an extra term to the usual Bayesian error bar which depends on the variance of the input noise process. Further, by treating the true (noiseless) input as a hidden variable, and sampling this jointly with the network's weights, using a Markov chain Monte Carlo method, it is demonstrated that it is possible to infer the regression over the noiseless input. This leads to the possibility of training an accurate model of a system using less accurate, or more uncertain, data. This is demonstrated on both the synthetic noisy sine wave problem and a real problem of inferring the forward model for a satellite radar backscatter system used to predict sea surface wind vectors.
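The extra error-bar term can be illustrated with a one-dimensional sketch, in the spirit of the Laplace-approximation argument (function and parameter names are ours, not the paper's):

```python
import math

def predictive_sigma(f, x, sigma_out, sigma_x, h=1e-5):
    """Illustrative input-noise correction to a predictive error bar:
    linearising f around x, input noise of variance sigma_x**2 inflates the
    predictive variance by (df/dx)**2 * sigma_x**2 on top of the usual
    output-noise term sigma_out**2."""
    dfdx = (f(x + h) - f(x - h)) / (2 * h)  # finite-difference gradient of f
    return math.sqrt(sigma_out ** 2 + (dfdx ** 2) * (sigma_x ** 2))
```

On a sine-wave regression, for instance, `predictive_sigma(math.sin, 0.0, 0.1, 0.2)` combines the output-noise and input-noise contributions; where the regression surface is steep the input-noise term dominates.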

Relevance: 30.00%

Abstract:

Relationships with supervisors are a major source of negative emotions at work, but little is known about why this is so. The aim of the research was to use attachment theory (Bowlby, 1969, 1973, 1980) as a framework for investigating the nature and causes of employee negative emotional experiences, in the context of their supervisory relationships. The research was conducted in three stages. In Stage 1 two studies were conducted to develop a measure of employee perceptions of supervisor caregiving (SCS). Results indicated that the 20-item scale had good reliability and validity. Stage 2 required participants (N=183) to complete a questionnaire that was designed to examine the roles of supervisor caregiving and working models (specific and global) in determining cognitive and emotional responses to hypothetical supervisor behaviours. The results provided partial support for an Independent Effects Model. Supervisor caregiving predicted specific anxiety and avoidance. In turn, both dimensions of attachment predicted negative emotions, but this relationship was mediated by event interpretation only in the case of avoidance. Global models made a smaller but significant contribution to negative emotions overall. There was no support for an interaction effect between specific and global models in determining event interpretation. In Stage 3 a sub-sample of questionnaire respondents (N=24) were interviewed about 'real-life' caregiving and negative emotional experiences in their supervisory relationships. Secure individuals experienced supervisors as consistently warm, available, and responsive. They reported few negative events or emotions. Individuals with insecure specific working models experienced rejecting or inconsistent supervisor caregiving. They were sensitised to trust and closeness issues in their relationships, and reported negative events and emotions underpinned by these themes. Overall, results broadly supported attachment theory predictions.
It is concluded that an attachment theory perspective provides new insight into the nature and causes of employee negative emotions in supervisory relationships.

Relevance: 30.00%

Abstract:

Removing noise from piecewise constant (PWC) signals is a challenging signal processing problem arising in many practical contexts. For example, in exploration geosciences, noisy drill hole records need to be separated into stratigraphic zones, and in biophysics, jumps between molecular dwell states have to be extracted from noisy fluorescence microscopy signals. Many PWC denoising methods exist, including total variation regularization, mean shift clustering, stepwise jump placement, running medians, convex clustering shrinkage and bilateral filtering; conventional linear signal processing methods are fundamentally unsuited. This paper (part I, the first of two) shows that most of these methods are associated with a special case of a generalized functional, minimized to achieve PWC denoising. The minimizer can be obtained by diverse solver algorithms, including stepwise jump placement, convex programming, finite differences, iterated running medians, least angle regression, regularization path following and coordinate descent. In the second paper, part II, we introduce novel PWC denoising methods and present comparisons between these and existing methods on synthetic and real signals, showing that the new understanding of the problem gained in part I leads to new methods that have a useful role to play.
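As a concrete instance of one of the surveyed methods, an iterated running median can be sketched in a few lines (a simplified sketch of the general idea, not the paper's generalized functional):

```python
def iterated_running_median(signal, window=3, iters=20):
    """PWC denoising by iterated running medians: repeatedly replace each
    sample with the median of its neighbourhood until the signal stops
    changing (a 'root' signal), which removes impulsive noise while
    preserving step edges."""
    x = list(signal)
    half = window // 2
    for _ in range(iters):
        nxt = []
        for i in range(len(x)):
            lo, hi = max(0, i - half), min(len(x), i + half + 1)
            nxt.append(sorted(x[lo:hi])[(hi - lo) // 2])  # window median
        if nxt == x:          # fixed point reached: x is a root signal
            break
        x = nxt
    return x
```

Unlike a linear smoother, the median filter does not blur the jump between the two constant levels, which is exactly why linear methods are unsuited here.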

Relevance: 30.00%

Abstract:

This thesis makes a contribution to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architectures: pull architectures and push architectures. There is exiguous data on the performance of CDC architectures in a real-time environment. Performance data is required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution, which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially we develop efficient RESTful DAS, which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition for capture latency and how to measure it does not exist in the field. We create this definition and extend the TPC-C benchmark to make the capture latency measurement.
The results from our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as the level of user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC, and places a minimal overhead on the transaction rate, although this performance is at the expense of CPU resources.
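The push idea behind the TAAR approach can be caricatured in a few lines. The store and subscriber interfaces below are invented for illustration; the actual solution uses RESTful DAS over an OLTP database under the TPC-C workload:

```python
import time

class ChangeCapturingDAS:
    """Hypothetical sketch of a push-CDC Data Access Service: CDC logic is
    decoupled from the application, and each committed change is pushed to
    subscribers (e.g. a warehouse loader) the moment it is applied, so
    capture latency is roughly the cost of one callback."""
    def __init__(self, store):
        self.store = store            # the OLTP-side key/value store
        self.subscribers = []         # downstream change consumers

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def write(self, key, value):
        self.store[key] = value       # apply the transaction
        event = {"key": key, "value": value, "ts": time.time()}
        for cb in self.subscribers:   # push the change immediately
            cb(event)
```

A pull architecture would instead poll the store (or its log) on an interval, which is where the capture latency and, under concurrency, the throughput cost come from.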

Relevance: 30.00%

Abstract:

Purpose: Current conceptualisations of strategic flexibility and its antecedents are theory-driven, which has resulted in a lack of consensus. To summarise this domain the paper aims to develop and present an a priori conceptual model of the antecedents and outcomes of strategic flexibility. Discussion and insights into the conceptual model, and the relationships specified, are made through a novel qualitative empirical approach. The implications for further research and a framework for further theoretical development are presented. Design/methodology/approach: An exploratory qualitative research design is used applying multiple data collection techniques in a branch network of a large regional retailer in the UK. The development of strategic options and the complex relationship to strategic flexibility is investigated. Findings: The number and type of strategic options developed by managers impact on the degree of strategic flexibility and also on the ability of the firm to achieve competitive differentiation. Additionally, the type of strategic option implemented by managers is dependent on the competitive situation faced at a local level. Evidence of managers' limited perception of competition was identified based on their spatial embeddedness. Research limitations/implications: A single, in-depth case study was used. The data gathered is rich and appropriate for the exploratory approach adopted here. However, generalisability of the findings is limited. Practical implications: Strategic flexibility is rooted in the ability of front-line managers to develop and implement strategic options; this in turn facilitates competitive differentiation. Originality/value: The research presented is unique in this domain on two accounts. First, theory is developed by presenting an a priori conceptual model, and testing through in-depth qualitative data gathering.
Second, insights into strategic flexibility are presented through an examination of managerial cognition, resources and strategic option generation using cognitive mapping and laddering technique. © Emerald Group Publishing Limited.

Relevance: 30.00%

Abstract:

Recent research in literacy acquisition has generated detailed programs for teaching phonological awareness. The current paper will address three issues that follow from this research. Firstly, much of the past research has been conducted under conditions that are divorced from the classroom. As a result, it is not known whether the suggested teaching strategies will lead to an increase in children's attainments when integrated into a broad reading curriculum implemented by teachers in mainstream classrooms. Secondly, these phonological interventions have been designed either to prevent the occurrence of reading difficulties or to meet the needs of failing readers. Therefore, it is not known whether the same methods would advantage all children. Thirdly, teaching children to read takes a minimum of two to three academic years. We therefore need to develop a reading curriculum that can provide the progression and differentiation to meet a wide range of needs over several academic years. We report two studies that have addressed these issues through monitoring the impact of a reading curriculum, implemented by teachers, which integrated children's acquisition of phonological skills with broader aspects of teaching reading over three academic years. The attainments of children at all levels of ability in the experimental group were raised relative to controls, and importantly, these gains were maintained after the intervention was withdrawn. These results demonstrate that phonological awareness training can be successfully integrated into real classroom contexts and that the same methods raised the attainments of normally developing children, as well as those at risk of reading failure.

Relevance: 30.00%

Abstract:

This paper presents the design and results of a task-based user study, based on Information Foraging Theory, of a novel user interaction framework - uInteract - for content-based image retrieval (CBIR). The framework includes a four-factor user interaction model and an interactive interface. The user study involves three focused evaluations, 12 simulated real-life search tasks with different complexity levels, 12 comparative systems and 50 subjects. Information Foraging Theory is applied to the user study design and the quantitative data analysis. The systematic findings have not only shown how effective and easy to use the uInteract framework is, but also illustrated the value of Information Foraging Theory for interpreting user interaction with CBIR. © 2011 Springer-Verlag Berlin Heidelberg.