965 results for 2nd-order perturbation-theory
Abstract:
Mathematics Subject Classification: 26A33, 74B20, 74D10, 74L15
Abstract:
Binocular combination for first-order (luminance-defined) stimuli has been widely studied, but we know rather little about this binocular process for spatial modulations of contrast (second-order stimuli). We used phase-matching and amplitude-matching tasks to assess binocular combination of second-order phase and modulation depth simultaneously. With fixed modulation in one eye, we found that binocularly perceived phase was shifted, and perceived amplitude increased almost linearly, as modulation depth in the other eye increased. At larger disparities, the phase shift was larger and the amplitude change was smaller. The degree of interocular correlation of the carriers had no influence. These results can be explained by an initial extraction of the contrast envelopes before binocular combination (consistent with the lack of dependence on carrier correlation), followed by a weighted linear summation of second-order modulations in which the weights (gains) for each eye are driven by the first-order carrier contrasts, as previously found for first-order binocular combination. Perceived modulation depth fell markedly with increasing phase disparity, unlike previous findings that perceived first-order contrast was almost independent of phase disparity. We present a simple revision to a widely used interocular gain-control theory that unifies first- and second-order binocular summation with a single principle (contrast-weighted summation), and we further elaborate the model for first-order combination. Conclusion: Second-order combination is controlled by first-order contrast.
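The contrast-weighted summation principle described in this abstract can be sketched schematically as follows; the specific gain-control form, the symbols (m_L, m_R for the two eyes' modulation depths, c_L, c_R for the first-order carrier contrasts), and the exponent γ are illustrative assumptions, not taken from the article:

```latex
% Schematic only: weighted linear summation of second-order modulations,
% with each eye's weight driven by its first-order carrier contrast.
\hat{m}(x) = w_L\, m_L \cos(2\pi f x + \phi_L) + w_R\, m_R \cos(2\pi f x + \phi_R),
\qquad
w_L = \frac{c_L^{\gamma}}{c_L^{\gamma} + c_R^{\gamma}}, \quad
w_R = \frac{c_R^{\gamma}}{c_L^{\gamma} + c_R^{\gamma}}
```

Under such a form, increasing the carrier contrast in one eye raises that eye's weight and so pulls the perceived phase and amplitude toward that eye's modulation, which is the qualitative behavior the abstract reports.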
Abstract:
This review is concerned with nanoscale effects in highly transparent dielectric photonic structures fabricated from optical fibers. In contrast to those in plasmonics, these structures do not contain metal particles, wires, or films with nanoscale dimensions. Nevertheless, a nanoscale perturbation of the fiber radius can significantly alter their performance. This paper consists of three parts. The first part considers propagation of light in thin optical fibers (microfibers) with radii of the order of 100 nanometers to 1 micron. The fundamental mode propagating along a microfiber has an evanescent field that may extend far into the surrounding medium. The cross-sectional dimensions of the mode and the transmission losses are then very sensitive to small variations of the microfiber radius. Under certain conditions, a change of just a few nanometers in the microfiber radius can significantly affect its transmission characteristics and, in particular, lead to a transition from the waveguiding to the non-waveguiding regime. The second part of the review considers slow propagation of whispering gallery modes in fibers with radii of the order of 10–100 microns. The propagation of these modes along the fiber axis is so slow that it can be governed by extremely small nanoscale changes of the optical fiber radius. This phenomenon is exploited in SNAP (surface nanoscale axial photonics), a new platform for the fabrication of miniature super-low-loss photonic integrated circuits with unprecedented sub-angstrom precision. The SNAP theory and its applications are overviewed. The third part of this review describes methods for characterizing the radius variation of microfibers and regular optical fibers with sub-nanometer precision.
Abstract:
The present article assesses agency-theory problems contributing to the failure of shopping centers. The negative effects of the financial and economic downturn that started in 2008 were accentuated in emerging markets such as Romania, where several shopping centers were closed or sold through bankruptcy proceedings or forced execution. Ten of these failed shopping centers were selected in order to assess the agency-theory problems contributing to their failure; a qualitative multiple-case-study method was used. The results suggest that in all of the cases the risk-averse behavior of the External Investor (Principal) led to risk-sharing problems and subsequently to the failure of the shopping centers. In some of the cases, moral hazard (the Developer-Agent's lack of know-how and experience) as well as adverse-selection problems could be identified. The novelty of the topic for the shopping center industry and the empirical evidence confer significant academic and practical value on the present article.
Abstract:
Entrepreneurial activity is a decisive factor in job creation and economic growth, and its future level depends greatly on the attitudes of today's youth towards entrepreneurship. If the most important factors influencing students' intentions to start a business can be identified, it also becomes possible to decide where intervention is feasible and worthwhile so that as many new, viable enterprises as possible are created. Drawing on the Hungarian database of the GUESSS research project, which contains the responses of almost 6000 students, this article systematises the most important groups of factors influencing the entrepreneurial intentions of students in higher education. It first examines the determinants of start-up intention by applying Ajzen's Theory of Planned Behavior, and then incorporates further factors, such as the supportive services provided by higher education institutions, family background, and demographic characteristics, in order to describe the formation of this intention as precisely as possible.
Abstract:
Secrecy is fundamental to computer security, but real systems often cannot avoid leaking some secret information. For this reason, the past decade has seen growing interest in quantitative theories of information flow that allow us to quantify the information being leaked. Within these theories, the system is modeled as an information-theoretic channel that specifies the probability of each output, given each input. Given a prior distribution on those inputs, entropy-like measures quantify the amount of information leakage caused by the channel.

This thesis presents new results in the theory of min-entropy leakage. First, we study the perspective of secrecy as a resource that is gradually consumed by a system. We explore this intuition through various models of min-entropy consumption. Next, we consider several composition operators that allow smaller systems to be combined into larger systems, and explore the extent to which the leakage of a combined system is constrained by the leakage of its constituents. Most significantly, we prove upper bounds on the leakage of a cascade of two channels, where the output of the first channel is used as input to the second. In addition, we show how to decompose a channel into a cascade of channels.

We also establish fundamental new results about the recently proposed g-leakage family of measures. These results further highlight the significance of channel cascading. We prove that whenever channel A is composition refined by channel B, that is, whenever A is the cascade of B and R for some channel R, the leakage of A never exceeds that of B, regardless of the prior distribution or leakage measure (Shannon leakage, guessing entropy leakage, min-entropy leakage, or g-leakage). Moreover, we show that composition refinement is a partial order if we quotient away channel structure that is redundant with respect to leakage alone.

These results are strengthened by the proof that composition refinement is the only way for one channel to never leak more than another with respect to g-leakage. Therefore, composition refinement robustly answers the question of when a channel is always at least as secure as another from a leakage point of view.
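A minimal sketch (mine, not from the thesis) of the quantities the abstract describes: a channel is a row-stochastic matrix C[x][y] = P(y | x), min-entropy leakage compares prior and posterior vulnerability, and cascading feeds the first channel's output into the second. The example matrices B and R are arbitrary illustrations.

```python
# Illustrative sketch of min-entropy leakage and channel cascading.
from math import log2

def posterior_vulnerability(prior, channel):
    # V(pi, C) = sum_y max_x pi[x] * C[x][y]
    return sum(max(prior[x] * channel[x][y] for x in range(len(prior)))
               for y in range(len(channel[0])))

def min_entropy_leakage(prior, channel):
    # L(pi, C) = log2( V(pi, C) / V(pi) ), where V(pi) = max_x pi[x]
    return log2(posterior_vulnerability(prior, channel) / max(prior))

def cascade(b, r):
    # (B R)[x][z] = sum_y B[x][y] * R[y][z]  (matrix product)
    return [[sum(b[x][y] * r[y][z] for y in range(len(r)))
             for z in range(len(r[0]))]
            for x in range(len(b))]

B = [[0.9, 0.1],
     [0.2, 0.8]]
R = [[0.7, 0.3],
     [0.4, 0.6]]
A = cascade(B, R)   # A is composition refined by B

prior = [0.5, 0.5]
# The cascade bound stated above: A = B.R never leaks more than B.
assert min_entropy_leakage(prior, A) <= min_entropy_leakage(prior, B)
```

Because the bound holds for every prior and every leakage measure in the family, the final assertion would succeed for any choice of B, R, and prior here.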
Abstract:
Trials in a temporal two-interval forced-choice discrimination experiment consist of two sequential intervals presenting stimuli that differ from one another in magnitude along some continuum. The observer must report in which interval the stimulus had the larger magnitude. The standard difference model from signal detection theory posits that the order of presentation should not affect the results of the comparison, a property known as the balance condition (J.-C. Falmagne, 1985, in Elements of Psychophysical Theory). But empirical data prove otherwise and consistently reveal what Fechner (1860/1966, in Elements of Psychophysics) called time-order errors, whereby the magnitude of the stimulus presented in one of the intervals is systematically underestimated relative to the other. Here we discuss sensory factors (temporary desensitization) and procedural glitches (short interstimulus or intertrial intervals and response bias) that might explain the time-order error, and we derive a formal model indicating how these factors make observed performance vary with presentation order despite a single underlying mechanism. Experimental results are also presented that illustrate the customary failure of the balance condition and test the hypothesis that time-order errors result from contamination by the factors included in the model.
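The article's formal model is not given in the abstract; as an illustrative sketch (all functional forms and parameter values are my assumptions), a difference-model observer whose sensitivity to the second interval is attenuated by a desensitization factor g < 1 already violates the balance condition, producing a time-order error even when the two stimuli are equal:

```python
# Simulated difference-model observer with second-interval desensitization.
import random

def prop_first_judged_larger(mag1, mag2, g=0.9, noise=1.0,
                             trials=20000, seed=1):
    rng = random.Random(seed)
    count = 0
    for _ in range(trials):
        x1 = mag1 + rng.gauss(0.0, noise)       # first-interval percept
        x2 = g * mag2 + rng.gauss(0.0, noise)   # attenuated second interval
        if x1 - x2 > 0:                         # difference rule, no bias
            count += 1
    return count / trials

# With equal magnitudes the balance condition predicts 0.5, but the
# attenuated second interval biases responses toward "first larger".
p = prop_first_judged_larger(10.0, 10.0)
```

With g = 1.0 the simulation recovers the balanced prediction of 0.5; setting g < 1 shifts the proportion of "first larger" responses well above 0.5, mimicking the systematic underestimation of the second interval.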
Abstract:
This paper reports on an investigation with first-year undergraduate Product Design and Management students within a School of Engineering. At the time of this investigation, the students had studied fundamental engineering science and mathematics for one semester. They were given an open-ended, ill-formed problem which involved designing a simple bridge to cross a river. They were given a talk on problem solving and a rubric to follow, if they chose to do so. They were not given any formulae or procedures needed in order to resolve the problem. In theory, they possessed the knowledge to ask the right questions in order to make assumptions but, in practice, it turned out they were unable to link their a priori knowledge to this problem. They were able to solve simple beam problems when given closed questions. The results show they were unable to visualise a simple bridge as an augmented beam problem, ask pertinent questions, and hence formulate appropriate assumptions in order to offer resolutions.
Abstract:
Purpose: The purpose of the research described in this paper is to disentangle the rhetoric from the reality in relation to supply chain management (SCM) adoption in practice. There is significant evidence of a divergence between theory and practice in the field of SCM. Research Approach: The authors' review of the extant SCM literature highlighted a lack of replication studies in SCM, leading to the development of the concept of refined replication. The authors conducted a refined replication of the work of Sweeney et al. (2015), in which a new SCM definitional construct – the Four Fundamentals – was proposed. The work presented in this article refines the previous study but adopts the same three-phase approach: focussed interviews, a questionnaire survey, and focus groups. This article covers the second phase of the refined replication study and describes an integrated research design for a questionnaire survey to be undertaken in Britain. Findings and Originality: The article presents an integrated design for questionnaire research, with emphasis on the refined replication of the previous work of Sweeney et al. (2015), carried out in Ireland, and its adaptation to the British context. Research Impact: The authors introduce the concept of refined replication in SCM research. This allows previous research to be built upon in order to test understanding of SCM theory and its practical implementation – based on the Four Fundamentals construct – among SCM professionals in Britain. Practical Impact: The article presents an integrated research design for questionnaire research that may be used in similar studies.
Abstract:
There are many sociopolitical theories that help explain why governments and actors do what they do. Securitization Theory is a process-oriented theory in international relations that focuses on how an actor defines another actor as an "existential threat," and on the responses that can then be taken in order to address that threat. While Securitization Theory is an acceptable method for analyzing the relationships between actors in the international system, this thesis contends that the proper examination is multi-factorial, and it therefore adds Role Theory to the analysis. Role Theory, another international relations theory, explains how an actor's strategies, relationships, and perceptions by others are based on pre-conceptualized definitions of that actor's identity; its consideration is essential in order to fully explain why an actor might respond to another in a particular way. Certain roles an actor may enact produce a rival relationship with other actors in the system, and it is those rival roles that elicit securitized responses. The possibility of a securitized response lessens when a role or a relationship between roles becomes ambiguous. There are clear points of role rivalry and role ambiguity between Hizb'allah and Iran, which have directly impacted, and continue to impact, how the United States (US) responds to these actors. Because of role ambiguity, the US has still not conceptualized an effective way to deal with Hizb'allah and Iran holistically across all their various areas of operation and in their various enacted roles. It would be overly simplistic to see Hizb'allah and Iran solely through one lens depending on which hemisphere or continent one is observing. The reality is likely more nuanced. Both Role Theory and Securitization Theory can help to understand and articulate those nuances.
By examining two case studies of Hizb'allah's and Iran's enactment of various roles in both the Middle East and Latin America, it will become clear in which situations roles cause a securitized response and in which the response is less securitized due to role ambiguity. Using this augmented approach of combining both theories, along with supplementing the manner in which an actor, action, or role is analyzed, will produce better methods for policy-making that can address the more ambiguous activities of Hizb'allah and Iran in these two regions.
Abstract:
This paper presents an economic model of the effects of identity and social norms on consumption patterns. By incorporating qualitative studies in psychology and sociology, I propose a utility function that features two components – economic (functional) and identity elements. This setup is extended to analyze a market comprising a continuum of consumers, whose identity distribution along a spectrum of binary identities is described by a Beta distribution. I also introduce the notion of salience in the context of identity and consumption decisions. The key result of the model suggests that fundamental economic parameters, such as price elasticity and market demand, can be altered by identity elements. In addition, it predicts that firms in perfectly competitive markets may associate their products with certain types of identities, in order to reduce product substitutability and attain price-setting power.
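The abstract does not specify the utility function, so the following is an illustrative sketch only: all functional forms and parameters (the linear functional surplus, the identity-match term, and the salience weight) are my assumptions. Consumers with identities t ~ Beta(a, b) on [0, 1] buy one unit iff functional surplus plus a salience-weighted identity match is positive, so aggregate demand depends on the identity distribution and on salience, not just on price.

```python
# Illustrative identity-augmented demand model (assumed functional forms).
from math import gamma

def beta_pdf(t, a, b):
    # Density of Beta(a, b) on (0, 1).
    return gamma(a + b) / (gamma(a) * gamma(b)) * t**(a - 1) * (1 - t)**(b - 1)

def demand(price, value=1.0, salience=0.5, product_identity=0.8,
           a=2.0, b=2.0, grid=2000):
    # Numerically integrate the set of buyers against the Beta(a, b) density.
    total = 0.0
    for i in range(grid):
        t = (i + 0.5) / grid
        # Functional surplus plus salience-weighted identity match.
        u = (value - price) + salience * (1.0 - abs(t - product_identity))
        if u > 0:
            total += beta_pdf(t, a, b) / grid
    return total

# Raising identity salience can sustain demand at a price above the
# functional value alone, softening the usual price sensitivity.
low, high = demand(1.2, salience=0.1), demand(1.2, salience=0.6)
```

In this toy version, at a price above the purely functional value, demand is zero when salience is low but substantial when salience is high, which mirrors the abstract's claim that identity elements can alter price elasticity and market demand.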
Abstract:
The first edition of Global Value Chain Analysis: A Primer was released five years ago (May 2011) in order to provide an overview of the key concepts and methodological tools used by Duke University's Center on Globalization, Governance & Competitiveness (Duke CGGC), a university-based research center that focuses on innovative applications of the GVC framework, which was developed by Duke CGGC's founding director, Gary Gereffi. The Second Edition of Global Value Chain Analysis: A Primer (July 2016) retains a simple, expository style and the use of recent research examples in order to offer an entry point for those wishing to better understand and use the GVC framework as a tool to analyze how local actors (firms, communities, workers) are linked to and affected by major transformations in the global economy. The GVC framework focuses on structural shifts in global industries, anchored by the core concepts of governance and upgrading. This Second Edition highlights some of the refinements in these concepts, and introduces a number of new illustrations drawing from recent Duke CGGC research. The bibliography offers a sampling of the broad array of studies available on the Duke CGGC website and in related academic publications. We hope this work stimulates continued interest in and use of the GVC framework as a tool to promote more dynamic, inclusive and sustainable development outcomes for all economies and the local actors within them.
Abstract:
The pharmaceutical industry wields disproportionate power and control within the medical economy of knowledge, where the desire for profit considerably outweighs health for its own sake. Utilizing the theoretical tools of political philosophy, this project restructures the economy of medical knowledge in order to lessen the oligarchical control possessed by the pharmaceutical industry. Ultimately, this project argues that an economy of medical knowledge structured around communitarian political theory lessens the current power imbalance without taking an anti-capitalist stance. Arising from the core commitments of communitarian-liberalism, the production, distribution, and consumption of medical knowledge all become guided processes seeking to realize the common good of quality healthcare. This project also considers two other theoretical approaches: liberalism and egalitarianism. A medical knowledge economy structured around liberal political theory is ultimately rejected, as it empowers the oligarchical status quo. Egalitarian political theory is able to significantly reduce the power-imbalance problem but simultaneously renders medical knowledge inconsequential; therefore, it is also rejected.