957 results for "Fixed point theory"


Relevance:

30.00%

Publisher:

Abstract:

AMS subject classification: 90B80.

Relevance:

30.00%

Publisher:

Abstract:

2010 Mathematics Subject Classification: 53A07, 53A35, 53A10.

Relevance:

30.00%

Publisher:

Abstract:

This is a review of methodology for the algorithmic study of some useful models in point process and queueing theory, as discussed in three lectures at the Summer Institute at Sozopol, Bulgaria. We provide references to sources where the extensive details of this work are found. For future investigation, some open problems and new methodological approaches are proposed.

Relevance:

30.00%

Publisher:

Abstract:

ACM Computing Classification System (1998): G.1.2.

Relevance:

30.00%

Publisher:

Abstract:

Emergency managers are faced with critical evacuation decisions. These decisions must balance conflicting objectives as well as high levels of uncertainty. Multi-Attribute Utility Theory (MAUT) provides a framework through which objective trade-offs can be analyzed to make optimal evacuation decisions. This paper draws on data gathered during the European Commission project Evacuation Responsiveness by Government Organizations (ERGO) and outlines a preliminary decision model for the evacuation decision. The illustrative model identifies the levels of risk at which evacuation actions should be taken by emergency managers in a storm surge scenario with forecasts at 12- and 9-hour intervals. The results illustrate how differences in forecast precision affect the optimal evacuation decision. Additional uses for this decision model are also discussed, along with improvements to the model through future ERGO data gathering.
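To make the trade-off concrete, here is a minimal sketch of an additive MAUT expected-utility comparison for the evacuate/wait decision under a forecast surge probability; the attributes, weights, utility values and probabilities are purely illustrative and are not taken from the ERGO data or model.

# Illustrative additive-MAUT expected-utility comparison for an evacuate/wait
# decision. All attribute names, weights and utilities below are hypothetical.

def multi_attribute_utility(outcomes, weights):
    """Additive MAUT aggregation: weighted sum of single-attribute utilities in [0, 1]."""
    return sum(weights[a] * u for a, u in outcomes.items())

# Attribute weights (sum to 1): loss of life, economic cost, public disruption.
weights = {"life": 0.6, "cost": 0.25, "disruption": 0.15}

# Single-attribute utilities for each (action, surge outcome) pair.
utilities = {
    ("evacuate", "surge"):    {"life": 0.95, "cost": 0.40, "disruption": 0.30},
    ("evacuate", "no_surge"): {"life": 1.00, "cost": 0.40, "disruption": 0.30},
    ("wait",     "surge"):    {"life": 0.10, "cost": 0.70, "disruption": 0.90},
    ("wait",     "no_surge"): {"life": 1.00, "cost": 1.00, "disruption": 1.00},
}

def expected_utility(action, p_surge):
    eu_surge = multi_attribute_utility(utilities[(action, "surge")], weights)
    eu_clear = multi_attribute_utility(utilities[(action, "no_surge")], weights)
    return p_surge * eu_surge + (1.0 - p_surge) * eu_clear

# Scan forecast probabilities to find the threshold at which evacuation becomes optimal.
for p in [i / 100 for i in range(0, 101, 5)]:
    best = max(("evacuate", "wait"), key=lambda a: expected_utility(a, p))
    print(f"P(surge)={p:.2f}  evacuate={expected_utility('evacuate', p):.3f}  "
          f"wait={expected_utility('wait', p):.3f}  ->  {best}")

The risk level at which the recommended action flips from "wait" to "evacuate" plays the role of the evacuation threshold described in the abstract; sharper forecasts shift where that threshold is crossed in practice.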

Relevance:

30.00%

Publisher:

Abstract:

We present a review of the latest developments in one-dimensional (1D) optical wave turbulence (OWT). Based on an original experimental setup that allows for the implementation of 1D OWT, we are able to show that an inverse cascade occurs through the spontaneous evolution of the nonlinear field up to the point when modulational instability leads to soliton formation. After solitons are formed, further interaction of the solitons among themselves and with incoherent waves leads to a final condensate state dominated by a single strong soliton. Motivated by the observations, we develop a theoretical description, showing that the inverse cascade develops through six-wave interaction, and that this is the basic mechanism of nonlinear wave coupling for 1D OWT. We describe theory, numerics and experimental observations while trying to incorporate all the different aspects into a consistent context. The experimental system is described by two coupled nonlinear equations, which we explore within two wave limits allowing for the expression of the evolution of the complex amplitude in a single dynamical equation. The long-wave limit corresponds to waves with wave numbers smaller than the electrical coherence length of the liquid crystal, and the opposite limit, when wave numbers are larger. We show that both of these systems are of a dual cascade type, analogous to two-dimensional (2D) turbulence, which can be described by wave turbulence (WT) theory, and conclude that the cascades are induced by a six-wave resonant interaction process. WT theory predicts several stationary solutions (non-equilibrium and thermodynamic) to both the long- and short-wave systems, and we investigate the necessary conditions required for their realization. Interestingly, the long-wave system is close to the integrable 1D nonlinear Schrödinger equation (NLSE) (which contains exact nonlinear soliton solutions), and as a result during the inverse cascade, nonlinearity of the system at low wave numbers becomes strong. Subsequently, due to the focusing nature of the nonlinearity, this leads to modulational instability (MI) of the condensate and the formation of solitons. Finally, with the aid of the probability density function (PDF) description of WT theory, we explain the coexistence and mutual interactions between solitons and the weakly nonlinear random wave background in the form of a wave turbulence life cycle (WTLC).
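For reference, the integrable focusing 1D NLSE referred to above can be written, in one standard normalization (a textbook form, not necessarily the exact scaling used in the experiment), as

i\,\partial_t \psi + \partial_{xx}\psi + 2|\psi|^2\psi = 0,

which admits the stationary bright-soliton solution \psi(x,t) = \eta\,\mathrm{sech}(\eta x)\,e^{i\eta^2 t}. The modulational instability of a near-constant condensate under this focusing nonlinearity is the mechanism, described in the abstract, by which solitons emerge at the end of the inverse cascade.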

Relevance:

30.00%

Publisher:

Abstract:

The “Nash program” initiated by Nash (Econometrica 21:128–140, 1953) is a research agenda aiming at representing every axiomatically determined cooperative solution to a game as a Nash outcome of a reasonable noncooperative bargaining game. The L-Nash solution, first defined by Forgó (Interactive Decisions. Lecture Notes in Economics and Mathematical Systems, vol 229. Springer, Berlin, pp 1–15, 1983), is obtained as the limiting point of the Nash bargaining solution when the disagreement point goes to negative infinity in a fixed direction. In Forgó and Szidarovszky (Eur J Oper Res 147:108–116, 2003), the L-Nash solution was related to the solution of multicriteria decision making, and two different axiomatizations of the L-Nash solution were also given in this context. In this paper, finite bounds are established for the penalty of disagreement in certain special two-person bargaining problems, making it possible to apply all the implementation models designed for Nash bargaining problems with a finite disagreement point to obtain the L-Nash solution as well. For another set of problems where this method does not work, a version of Rubinstein’s alternating offers game (Econometrica 50:97–109, 1982) is shown to asymptotically implement the L-Nash solution. If the penalty is internalized as a decision variable of one of the players, then a modification of Howard’s game (J Econ Theory 56:142–159, 1992) also implements the L-Nash solution.
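For orientation, the classical two-person Nash bargaining solution over a feasible set S with disagreement point d = (d_1, d_2) is the maximizer of the Nash product,

(u_1^*, u_2^*) = \arg\max_{(u_1, u_2) \in S,\; u_i \ge d_i} (u_1 - d_1)(u_2 - d_2),

and, per the description above, the L-Nash solution is the limit of these maximizers as the disagreement point recedes to negative infinity along a fixed direction, e.g. d(t) = d_0 - t\,\delta with \delta fixed and t \to \infty (the notation d_0, \delta here is illustrative; see Forgó (1983) for the precise definition).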

Relevance:

30.00%

Publisher:

Abstract:

This dissertation examines one category of international capital flows, private portfolio investments (private refers to the source of capital). There is an overall lack of a coherent and consistent definition of foreign portfolio investment. We clarify these definitional issues.

Two main questions that pertain to private foreign portfolio investment (FPI) are explored. The first is the phenomenon of home preference, often referred to as home bias. Related to this are the observed cross-investment flows between countries that seem to contradict the textbook rendition of private FPI. The theories purporting to resolve the home preference puzzle (and the cross-investment one) are summarized and evaluated. Most of this literature considers investors from major developed countries. I consider, as well, whether investors in less developed countries exhibit home preference.

The dissertation shows that home preference is indeed pervasive and profound across countries, in both developed and emerging markets. For the U.S., I examine home bias in bond as well as equity holdings, and find that home bias is greater when equity and bond holdings are considered together than when equity holdings are considered alone.

In this dissertation a model is developed to explain home bias. This model is original and fills a gap in the literature, as there have been no satisfactory models that handle both home preference and cross-border holdings at the same time in the context of information asymmetries. The model reflects what we see in the data and permits us to reach certain results through comparative statics. It suggests, counter-intuitively, that as a country's rate of return relative to the world rate of return increases, home preference decreases. In the context of our relatively simple model we ascribe this result to the higher variance of the now higher return on home assets. We also find, this time as intended, that as risk aversion increases, investors diversify further, so that home preference decreases.

The second question the dissertation deals with is the volatility of private foreign portfolio investment. Recipient countries have been wary of such flows because of their perceived volatility, often in contrast to the perceived absence of volatility in foreign direct investment flows. I analyze the validity of these concerns using first net flow data and then gross flow data. The results show that FPI is not, in relative terms, more volatile than other flows in our sample of eight countries (half developed countries, the rest emerging markets).

The implication is therefore that restricting FPI flows may be harmful, in the sense that private capital may not be allocated efficiently worldwide, to the detriment of capital-poor economies. More to the point, any such restrictions would in fact be misguided.

Relevance:

30.00%

Publisher:

Abstract:

The National Council Licensure Examination for Registered Nurses (NCLEX-RN) is the examination that all graduates of nursing education programs must pass to attain the title of registered nurse. Currently the NCLEX-RN passing rate is at an all-time low (81%) for first-time test takers (NCSBN, 2004), amidst a nationwide shortage of registered nurses (Glabman, 2001). Because of the critical need to supply greater numbers of professional nurses, and the potential accreditation ramifications that low NCLEX-RN passing rates can have on schools of nursing and graduates, this research study tests the effectiveness of a predictor model. The model is based upon the theoretical framework of McClusky's (1959) theory of margin (ToM), with the hope that students found to be at risk for NCLEX-RN failure can be identified and remediated prior to taking the actual licensure examination. To date no theory-based predictor model has been identified that predicts success on the NCLEX-RN.

The model was tested using prerequisite course grades, nursing course grades and scores on standardized examinations for the 2003 associate degree nursing graduates at an urban community college (N = 235). Success was determined through the reporting of a pass on the NCLEX-RN examination by the Florida Board of Nursing. Point-biserial correlations tested model assumptions regarding variable relationships, while logistic regression was used to test the model's predictive power.

Correlations among variables were significant and the model accounted for 66% of the variance in graduates' success on the NCLEX-RN, with 98% prediction accuracy. Although certain prerequisite course grades and nursing course grades were found to be significant to NCLEX-RN success, the overall model was most predictive at the conclusion of the academic program of study. The RN Assessment Examination, taken during the final semester of course work, was the most significant predictor of NCLEX-RN success. Success on the NCLEX-RN allows graduates to work as registered nurses, reflects positively on a school's academic performance record, and supports the appropriateness of the educational program's goals and objectives. The study's findings support other potential uses of McClusky's theory of margin as a predictor of program outcomes in other venues of adult education.
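The statistical core of such a predictor model can be sketched as follows; the variable names, synthetic data and coefficients are hypothetical stand-ins for the study's actual grade and assessment variables, not a reproduction of its results.

# Point-biserial screening of continuous predictors against a binary pass/fail
# outcome, followed by a logistic regression predictor (illustrative sketch only).
import numpy as np
from scipy.stats import pointbiserialr
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 235  # cohort size reported in the abstract

# Synthetic predictors: prerequisite GPA, nursing-course GPA, standardized exam score.
X = np.column_stack([
    rng.normal(3.0, 0.4, n),   # prerequisite course grades
    rng.normal(3.1, 0.4, n),   # nursing course grades
    rng.normal(70, 10, n),     # standardized assessment score
])
# Synthetic pass/fail outcome loosely driven by the predictors (illustration only).
logit = -14 + 1.2 * X[:, 0] + 1.5 * X[:, 1] + 0.09 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Screen each predictor against the binary outcome with a point-biserial correlation.
for name, col in zip(["prereq_gpa", "nursing_gpa", "exam_score"], X.T):
    r, p = pointbiserialr(y, col)
    print(f"{name}: r_pb={r:.2f}, p={p:.3f}")

# Fit and evaluate the logistic regression predictor on a held-out split.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))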

Relevance:

30.00%

Publisher:

Abstract:

Nursing shortages still exist in the U.S., so it is important to determine factors that influence decisions to pursue nursing as a career. This comparative, correlational research study revealed factors that may contribute to, or deter students from, choosing nursing as a career. The purpose of this study was to determine factors that contribute to a career choice for nursing, based on the social cognitive career theory (SCCT) concepts of self-efficacy, outcome expectations, and personal goals, among senior high school students, final-year nursing students, and first-year nursing students. Based on the results, strategies may be developed to recruit a younger pool of students to the nursing profession and to boost retention efforts among those who have already made a career choice in nursing. Data were collected using a three-part questionnaire developed by the researcher to obtain demographic information and data about the respondents' self-efficacy, outcome expectations, and personal goals with regard to nursing as a career. Point-biserial correlations were used to determine relationships between the variables, and ANOVAs and ANCOVAs were computed to determine differences in self-efficacy and outcome expectations among the three groups. Additional descriptive data identified reasons for and against a choice for nursing as a career. Self-efficacy and outcome expectations were significantly correlated with career choice among all three groups. The nursing students had higher self-efficacy perceptions than the high school students, and there were no significant differences in outcome expectations between the three groups. The main categories identified as reasons for choosing nursing as a career were (a) caring, (b) career and educational advancement, (c) personal accomplishment, and (d) proficiency and love of the medical field. Common categories identified for not choosing nursing as a career were (a) responsibility, (b) liability, (c) lack of respect, and (d) low salary. Other categories regarding not choosing nursing as a career included (a) the nursing program, (b) professional, (c) alternate career choice options, and (d) fear of sickness and death. Findings from this study support the tenets of SCCT and may be used to recruit and retain nurses and to develop curricula.
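A minimal sketch of the kind of group comparison described above, using a one-way ANOVA on self-efficacy scores for the three groups; the scores and group sizes are synthetic placeholders, not the study's data.

# One-way ANOVA comparing self-efficacy scores across three respondent groups.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
high_school = rng.normal(3.4, 0.6, 80)   # senior high school students
first_year  = rng.normal(3.8, 0.5, 60)   # first-year nursing students
final_year  = rng.normal(3.9, 0.5, 60)   # final-year nursing students

f_stat, p_value = f_oneway(high_school, first_year, final_year)
print(f"F={f_stat:.2f}, p={p_value:.4f}")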

Relevance:

30.00%

Publisher:

Abstract:

This research involves the design, development, and theoretical demonstration of models resulting in integrated misbehavior resolution protocols for ad hoc networked devices. Game theory was used to analyze strategic interaction among independent devices with conflicting interests. Packet forwarding at the routing layer of autonomous ad hoc networks was investigated. Unlike existing reputation-based or payment schemes, this model is based on repeated interactions. To enforce cooperation, a community enforcement mechanism was used, whereby selfish nodes that drop packets were punished not only by the victim, but also by all nodes in the network. Then, a stochastic packet forwarding game strategy was introduced. Our solution relaxed the uniform traffic demand that was pervasive in other works. To address the concerns of imperfect private monitoring in resource-aware ad hoc networks, a belief-free equilibrium scheme was developed that reduces the impact of noise on cooperation. This scheme also eliminated the need to infer the private history of other nodes and simplified the computation of an optimal strategy. The belief-free approach reduced node overhead and was easily tractable, making the system operation feasible. Motivated by the versatile nature of evolutionary game theory, the assumption of a rational node is relaxed, leading to the development of a framework for mitigating routing selfishness and misbehavior in multi-hop networks. This is accomplished by setting nodes to play a fixed strategy rather than independently choosing a rational strategy. A range of simulations was carried out that showed improved cooperation between selfish nodes when compared to earlier results. In the absence of a central trusted entity, many security mechanisms and privacy protections require cooperation among ad hoc nodes to protect a network from malicious attacks. Therefore, using game theory and evolutionary game theory, a mathematical framework has been developed that explores trust mechanisms to achieve security in the network. This framework is one of the first steps towards the synthesis of an integrated solution demonstrating that security solely depends on the initial trust level that nodes have for each other.
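As an illustration of the community-enforcement idea described above, here is a toy repeated packet-forwarding game; it is a sketch of the mechanism, not the dissertation's actual protocol, payoffs or parameters.

# Toy repeated packet-forwarding game with community enforcement. Forwarding a
# packet costs C; having one's own packet delivered is worth B > C. A node caught
# dropping is punished by ALL nodes (its packets are refused) for PUNISH rounds.
import random

B, C = 1.0, 0.2
PUNISH = 10        # rounds during which a deviator's packets are dropped by everyone
ROUNDS = 1000

class Node:
    def __init__(self, name, selfish=False):
        self.name, self.selfish = name, selfish
        self.payoff = 0.0
        self.punished_until = -1   # round until which the community punishes this node

def play(nodes):
    for t in range(ROUNDS):
        sender, relay = random.sample(nodes, 2)
        # Community enforcement: nobody forwards for a node under punishment.
        if t < sender.punished_until:
            continue
        if relay.selfish:
            relay.punished_until = t + PUNISH   # deviation observed, punished by all
        else:
            relay.payoff -= C                   # cooperative relay pays the forwarding cost
            sender.payoff += B                  # sender's packet is delivered

random.seed(0)
nodes = [Node("coop%d" % i) for i in range(9)] + [Node("selfish", selfish=True)]
play(nodes)
for n in nodes:
    print(n.name, round(n.payoff, 1))

Because every node refuses to serve a detected deviator, dropping packets saves the small forwarding cost but forfeits the larger delivery benefit during the punishment window; this repeated-game logic is what sustains cooperation without payments or reputation scores.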

Relevance:

30.00%

Publisher:

Abstract:

The primary goal of this dissertation is to develop point-based rigid and non-rigid image registration methods that have better accuracy than existing methods. We first present point-based PoIRe, which provides the framework for point-based global rigid registrations. It allows a choice of different search strategies, including (a) branch-and-bound, (b) probabilistic hill-climbing, and (c) a novel hybrid method that takes advantage of the best characteristics of the other two. We use a robust similarity measure that is insensitive to noise, which is often introduced during feature extraction. We show the robustness of PoIRe by using it to register images obtained with an electronic portal imaging device (EPID), which have large amounts of scatter and low contrast. To evaluate PoIRe we used (a) simulated images and (b) images with fiducial markers; PoIRe was extensively tested with 2D EPID images and with images generated by 3D Computed Tomography (CT) and Magnetic Resonance (MR) imaging. PoIRe was also evaluated using benchmark data sets from the blind Retrospective Image Registration Evaluation (RIRE) project. We show that PoIRe is better than existing methods such as Iterative Closest Point (ICP) and methods based on mutual information. We also present a novel point-based local non-rigid shape registration algorithm. We extend the robust similarity measure used in PoIRe to non-rigid registrations, adapting it to a free-form deformation (FFD) model and making it robust to local minima, a drawback common to existing non-rigid point-based methods. For non-rigid registrations we show that it performs better than existing methods and that it is less sensitive to starting conditions. We test our non-rigid registration method using available benchmark data sets for shape registration. Finally, we also explore the extraction of features invariant to changes in perspective and illumination, and how they can help improve the accuracy of multi-modal registration. For multi-modal registration of EPID-DRR images we present a method based on a local descriptor defined by a vector of complex responses to a circular Gabor filter.
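For context, the closed-form rigid alignment step that point-based schemes such as ICP iterate between correspondence updates can be sketched as follows; this is the standard Kabsch/Procrustes least-squares solution with known correspondences, not PoIRe's search strategy or robust similarity measure.

# Closed-form least-squares rigid alignment (rotation R, translation t) between two
# point sets with known one-to-one correspondences, via SVD of the cross-covariance.
import numpy as np

def rigid_align(src, dst):
    """Return R, t minimizing sum ||R @ src_i + t - dst_i||^2 (src, dst: N x D arrays)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)             # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.eye(H.shape[0])
    D[-1, -1] = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Quick self-check with a known 2D rotation and translation.
rng = np.random.default_rng(0)
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
t_true = np.array([5.0, -2.0])
pts = rng.random((50, 2))
moved = pts @ R_true.T + t_true
R_est, t_est = rigid_align(pts, moved)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))

ICP-style methods alternate this alignment step with nearest-neighbor correspondence updates, which is where sensitivity to noise and starting conditions enters and where the robust similarity measures discussed above differ.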

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to establish a dialogue between queer theory and the thought of the French philosopher Maurice Merleau-Ponty around the categories of body and sexuality. From this dialogue, further goals were derived, namely: to identify possible implications of the experience of queer bodies and sexualities, understood from Merleau-Ponty's perspective, for the field of Physical Education, and to reflect on this field of knowledge using the notions of a queer epistemology and esthesia. Methodologically, the study adopted the phenomenological attitude proposed by Merleau-Ponty and used reduction as its research technique. To link these strands of thought we used the cinema of the Spanish director Pedro Almodóvar as a perceptive strategy, an exercise of the gaze as a possibility of reading the world and of new ways of perceiving the human being. We examined three films, namely All About My Mother (1999), The Skin I Live In (2011) and Bad Education (2004), which put us in contact with queer bodies and sexualities, with the body of esthesia, of ecstasy, sensation and lived experience, a kind of art whose contours are not fixed or determinable, as Merleau-Ponty postulates. The philosopher provides a rich conceptual view of the body and its sexual experience, and extends and opens horizons of thought and reflection on queer experience, an indeterminate and contingent experience understood as a singular way of inhabiting the world. These horizons opened by the philosopher, added to the queer perspective, help to call into question the modes of knowledge production and the knowledge about body and sexuality in Physical Education. Finally, we point out that this theoretical conversation gives us clues for reflecting on the reverberations of a queer epistemology for Physical Education, guided by a type of knowledge grounded in esthesia and sensitivity as marks of another scientific rationality.

Relevance:

30.00%

Publisher:

Abstract:

As the world population continues to grow past seven billion people and global challenges persist, including resource availability, biodiversity loss, climate change and human well-being, a new science is required that can address the integrated nature of these challenges and the multiple scales on which they are manifest. Sustainability science has emerged to fill this role. In the fifteen years since it was first called for in the pages of Science, it has rapidly matured; however, its place in the history of science and the way it is practiced today must be continually evaluated. In Part I, two chapters address this theoretical and practical grounding. Part II transitions to the applied practice of sustainability science in addressing the urban heat island (UHI) challenge, wherein the climate of urban areas is warmer than that of their surrounding rural environs. The UHI has become increasingly important within the study of earth sciences, given the increased focus on climate change and the fact that the majority of humans now live in urban areas.

Chapter 2 argues for a novel contribution to the historical context of sustainability. Sustainability as a concept characterizing the relationship between humans and nature emerged in the mid to late 20th century as a response to findings also used to characterize the Anthropocene. Evidence is provided suggesting that sustainability, emerging from the human-nature relationships that came before it, was enabled by technology and a reorientation of world-view, and that it is unique in its global boundary, systematic approach, and ambition for both well-being and the continued availability of resources and Earth-system function. Sustainability is, further, an ambition with wide appeal, making it one of the first normative concepts of the Anthropocene.

Despite its widespread emergence and adoption, sustainability science continues to suffer from definitional ambiguity within the academy. In Chapter 3, a review of efforts to provide direction and structure to the science reveals a continuum of approaches anchored at either end by differing visions of how the science interfaces with practice (solutions). At one end, basic science of societally defined problems informs decisions about possible solutions and their application. At the other end, applied research directly affects the options available to decision makers. While this dichotomy is clear in the literature, survey data suggests that it is less apparent in the minds of practitioners.

In Chapter 4, the UHI is first addressed at the synoptic, mesoscale. Urban climate is the most immediate manifestation of the warming global climate for the majority of people on earth. Nearly half of those people live in small to medium sized cities, an understudied scale in urban climate research. Widespread characterization would be useful to decision makers in planning and design. Using a multi-method approach, the mesoscale UHI in the study region is characterized and the secular trend over the last sixty years evaluated. Under isolated ideal conditions the findings indicate a UHI of 5.3 ± 0.97 °C to be present in the study area, the magnitude of which is growing over time.

Although urban heat islands (UHI) are well studied, there remain no panaceas for local-scale mitigation and adaptation, so continued attention to characterization of the phenomenon in urban centers of different scales around the globe is required. In Chapter 5, a local-scale analysis of the canopy-layer and surface UHI in a medium-sized city in North Carolina, USA is conducted using multiple methods, including stationary urban sensors, mobile transects and remote sensing. Focusing on the ideal conditions for UHI development during an anticyclonic summer heat event, the study observes a range of UHI intensity depending on the method of observation: 8.7 °C from the stationary urban sensors; 6.9 °C from mobile transects; and 2.2 °C from remote sensing. Additional attention is paid to the diurnal dynamics of the UHI and its correlation with vegetation indices, dewpoint and albedo. Evapotranspiration is shown to drive the dynamics in the study region.
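To make the reported intensities concrete, a canopy-layer UHI intensity from fixed sensors is essentially a paired urban-minus-rural air temperature difference at matching timestamps; a minimal sketch follows, with hypothetical station readings rather than the study's data.

# Canopy-layer UHI intensity from paired urban/rural station records (illustrative data).
from datetime import datetime

urban = {  # air temperature (deg C) at an urban station, by timestamp
    datetime(2012, 7, 1, 22): 29.4,
    datetime(2012, 7, 1, 23): 28.9,
    datetime(2012, 7, 2, 0): 28.1,
}
rural = {  # air temperature (deg C) at a rural reference station
    datetime(2012, 7, 1, 22): 24.0,
    datetime(2012, 7, 1, 23): 23.1,
    datetime(2012, 7, 2, 0): 22.4,
}

# UHI intensity per timestamp, and the event maximum often reported in UHI studies.
uhi = {t: urban[t] - rural[t] for t in urban if t in rural}
for t, dT in sorted(uhi.items()):
    print(t.isoformat(), f"UHI = {dT:+.1f} C")
print("max UHI intensity during the event:", f"{max(uhi.values()):.1f} C")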

Finally, recognizing that a bridge must be established between the physical science community studying the urban heat island (UHI) effect and the planning community and decision makers implementing urban form and development policies, Chapter 6 evaluates multiple urban form characterization methods. Methods evaluated include local climate zones (LCZ), National Land Cover Database (NLCD) classes and urban cluster analysis (UCA), to determine their utility in describing the distribution of the UHI based on three standard observation types: 1) fixed urban temperature sensors, 2) mobile transects and 3) remote sensing. Bivariate, regression and ANOVA tests are used to conduct the analyses. Findings indicate that the NLCD classes are best correlated to the UHI intensity and distribution in the study area. Further, while the UCA method is not useful directly, the variables included in the method are predictive based on regression analysis, so the potential for better model design exists. Land cover variables including albedo, impervious surface fraction and pervious surface fraction are found to dominate the distribution of the UHI in the study area regardless of observation method.

Chapter 7 provides a summary of findings and offers a brief analysis of their implications for both the scientific discourse generally and the study area specifically. In general, the work undertaken does not achieve the full ambition of sustainability science; additional work is required to translate findings to practice and to more fully evaluate adoption. The implications for planning and development in the local region are addressed in the context of a major light-rail infrastructure project, including several systems-level considerations such as human health and development. Finally, several avenues for future work are outlined. Within the theoretical development of sustainability science, these pathways include more robust evaluations of both theoretical and actual practice. Within the UHI context, they include development of an integrated urban form characterization model, application of the study methodology in other geographic areas and at different scales, and use of novel experimental methods including distributed sensor networks and citizen science.

Relevance:

30.00%

Publisher:

Abstract:

Default invariance is the idea that default does not change at any scale of law and finance. Default is a conserved quantity in a universe where fundamental principles of law and finance operate. It exists at the micro-level as part of the fundamental structure of every financial transaction, and at the macro-level as a fixed critical point within the relatively stable phases of the law and finance cycle. A key point is that at the micro-level default is equivalent to maximizing uncertainty, while at the macro-level it is equivalent to the phase transition where unbearable fluctuations occur in all forms of risk transformation, including maturity, liquidity and credit. As such, default invariance is the glue that links the micro and macro structures of law and finance. In this essay, we apply naïve category theory (NCT), a type of mapping logic, to these types of phenomena. The purpose of using NCT is to introduce a rigorous (but simple) mathematical methodology to law and finance discourse and to show that these types of structural considerations are of prime practical importance and significance to law and finance practitioners. These mappings imply a number of novel areas of investigation. From the micro-structure, three macro-approximations are implied. These approximations form the core analytical framework which we use to examine the phenomena and to hypothesize rules governing law and finance. Our observations from these approximations are grouped into five findings. While the entirety of the five findings can be encapsulated by the three approximations, since the intended audience of this paper is the non-specialist in law, finance and category theory, for ease of access we illustrate the use of the mappings with relatively common concepts drawn from law and finance, focusing especially on financial contracts, derivatives, Shadow Banking, credit rating agencies and credit crises.
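For readers unfamiliar with the "mapping logic" invoked above, the basic device of naïve category theory used in this way is the functor: a structure-preserving map F: \mathcal{C} \to \mathcal{D} that sends each object X of \mathcal{C} to an object F(X) of \mathcal{D} and each morphism f: X \to Y to a morphism F(f): F(X) \to F(Y), subject to

F(\mathrm{id}_X) = \mathrm{id}_{F(X)} \quad\text{and}\quad F(g \circ f) = F(g) \circ F(f).

Read illustratively (this pairing is a gloss, not the essay's precise construction), \mathcal{C} can be taken as a category whose objects are micro-level financial transactions and whose morphisms are risk transformations, and \mathcal{D} as a category of macro-level phases of the law and finance cycle; the two functor laws are what make micro-to-macro approximations of this kind internally consistent.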