783 results for Fundamentals of computing theory


Relevance: 100.00%

Publisher:

Abstract:

We propose a criterion for the validity of semiclassical gravity (SCG) which is based on the stability of the solutions of SCG with respect to quantum metric fluctuations. We pay special attention to the two-point quantum correlation functions for the metric perturbations, which contain both intrinsic and induced fluctuations. These fluctuations can be described by the Einstein-Langevin equation obtained in the framework of stochastic gravity. Specifically, the Einstein-Langevin equation yields stochastic correlation functions for the metric perturbations which agree, to leading order in the large N limit, with the quantum correlation functions of the theory of gravity interacting with N matter fields. The homogeneous solutions of the Einstein-Langevin equation are equivalent to the solutions of the perturbed semiclassical equation, which describe the evolution of the expectation value of the quantum metric perturbations. The information on the intrinsic fluctuations, which are connected to the initial fluctuations of the metric perturbations, can also be retrieved entirely from the homogeneous solutions. However, the induced metric fluctuations proportional to the noise kernel can only be obtained from the Einstein-Langevin equation (the inhomogeneous term). These equations exhibit runaway solutions with exponential instabilities; a detailed discussion of different methods for dealing with these instabilities is given. We illustrate our criterion by showing explicitly that flat space is stable and that a description based on SCG is a valid approximation in that case.
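The Einstein-Langevin equation at the center of this abstract can be displayed schematically. The form below follows standard stochastic-gravity notation (a background metric g with perturbation h, the expectation value of the matter stress tensor, and a Gaussian stochastic source); the concrete noise kernel N_abcd depends on the matter fields and is not reproduced from this paper:

```latex
% Schematic Einstein-Langevin equation: the perturbed semiclassical equation
% plus a stochastic source \xi_{ab} driven by matter stress-tensor fluctuations
G_{ab}[g+h] = 8\pi G \left( \langle \hat{T}_{ab}[g+h] \rangle + \xi_{ab} \right),
\qquad
\langle \xi_{ab}(x) \rangle_s = 0 , \qquad
\langle \xi_{ab}(x)\, \xi_{cd}(y) \rangle_s = N_{abcd}(x,y) .
```

Setting the source to zero recovers the perturbed semiclassical equation, whose homogeneous solutions carry the intrinsic fluctuations; the inhomogeneous (noise) term is what sources the induced fluctuations discussed above.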

Relevance: 100.00%

Publisher:

Abstract:

Photon migration in a turbid medium has been modeled in many different ways. The motivation for such modeling is based on technology that can be used to probe potentially diagnostic optical properties of biological tissue. Surprisingly, one of the more effective models is also one of the simplest. It is based on statistical properties of a nearest-neighbor lattice random walk. Here we develop a theory allowing one to calculate the number of visits by a photon to a given depth, if it is eventually detected at an absorbing surface. This mimics cw measurements made on biological tissue and is directed towards characterizing the depth reached by photons injected at the surface. Our development of the theory uses formalism based on the theory of a continuous-time random walk (CTRW). Formally exact results are given in the Fourier-Laplace domain, which, in turn, are used to generate approximations for parameters of physical interest.
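The nearest-neighbor lattice picture underlying the model can be sketched with a minimal Monte Carlo simulation. This is an illustrative reduction to the depth coordinate only, not the authors' CTRW formalism, and the depth and sample sizes are invented: photons are injected to depth 1, perform a symmetric nearest-neighbor walk in depth, are detected on returning to the surface, and we tally visits to a chosen depth among detected photons.

```python
import random

def visits_to_depth(d, max_steps=5_000, rng=random.Random(42)):
    """One photon walk in the depth coordinate: injected to depth 1,
    absorbed (detected) on returning to the surface at depth 0.
    Returns the number of visits to depth d, or None if the walk is
    truncated before detection."""
    z, visits = 1, 0
    for _ in range(max_steps):
        if z == d:
            visits += 1
        z += rng.choice((-1, 1))  # nearest-neighbor step in depth
        if z == 0:                # back at the surface: photon detected
            return visits
    return None

# Monte Carlo estimate of the mean number of visits to depth 3
# by photons that are eventually detected at the surface
samples = [v for v in (visits_to_depth(3) for _ in range(5_000)) if v is not None]
mean_visits = sum(samples) / len(samples)
```

As a sanity check on this toy version: gambler's-ruin arithmetic gives an expected visit count of 2 for any depth d >= 1 (the walk reaches depth d with probability 1/d and then makes returns with mean count 2d), so the Monte Carlo estimate should sit near 2, slightly biased down by the step cap.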

Relevance: 100.00%

Publisher:

Abstract:

The process of free reserves in a non-life insurance portfolio, as defined in the classical model of risk theory, is modified by the introduction of dividend policies that set maximum levels for the accumulation of reserves. The first part of the work formulates the quantification of the dividend payments via the expected present value of the dividends under different hypotheses. The second part presents a solution based on a system of linear equations for discrete dividend payments in the case of a constant dividend barrier, illustrated by solving a specific case.
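The linear-system approach for a constant barrier can be illustrated in a discrete toy setting. The model below is a hypothetical minimal version, not the paper's formulation: each period reserves grow by one unit of premium, a claim of size 2 occurs with fixed probability, anything pushed above the barrier b is paid out as a dividend, ruin ends all payments, and the resulting linear equations for the expected discounted dividends V(u) are solved by fixed-point iteration (the discount factor makes the map a contraction).

```python
def dividend_values(b=5, v=0.95, p_no_claim=0.6, claim=2, tol=1e-12):
    """Expected present value V[u] of future dividends for reserves u = 0..b,
    found by fixed-point iteration of the linear equations
    V(u) = v * E[next dividend + V(next reserve)]."""
    p_claim = 1.0 - p_no_claim
    V = [0.0] * (b + 1)
    while True:
        new = []
        for u in range(b + 1):
            # No claim: reserves rise by the unit premium to u + 1;
            # any excess over the barrier b is paid out as a dividend.
            up = u + 1
            no_claim = max(up - b, 0) + V[min(up, b)]
            # A claim: reserves fall to u + 1 - claim; ruin if negative.
            down = u + 1 - claim
            with_claim = V[down] if down >= 0 else 0.0
            new.append(v * (p_no_claim * no_claim + p_claim * with_claim))
        if max(abs(a - c) for a, c in zip(new, V)) < tol:
            return new
        V = new

V = dividend_values()
```

The values increase with the initial reserve level, as one would expect: starting closer to the barrier means dividends begin sooner and ruin is less likely.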

Relevance: 100.00%

Publisher:

Abstract:

Improving safety at nighttime work zones is important because of the extra visibility concerns. The deployment of sequential lights is an innovative method for improving driver recognition of lane closures and work zone tapers. Sequential lights are wireless warning lights that flash in a sequence to clearly delineate the taper at work zones. The effectiveness of sequential lights was investigated using controlled field studies. Traffic parameters were collected at the same field site with and without the deployment of sequential lights. Three surrogate performance measures were used to determine the impact of sequential lights on safety: the speeds of approaching vehicles, the number of late taper merges, and the locations where vehicles merged into the open lane from the closed lane. In addition, an economic analysis was conducted to monetize the benefits and costs of deploying sequential lights at nighttime work zones. The results of this study indicate that sequential warning lights had a net positive effect in reducing the speeds of approaching vehicles, enhancing driver compliance, and reducing late taper merges by passenger cars and trucks at rural work zones. Statistically significant decreases of 2.21 mph in the mean speed and 1 mph in the 85th-percentile speed resulted with sequential lights. The shift of the cumulative speed distributions to the left (i.e., a speed decrease) was also found to be statistically significant using the Mann-Whitney and Kolmogorov-Smirnov tests. However, a statistically significant increase of 0.91 mph in the speed standard deviation also resulted with sequential lights. With sequential lights, the percentage of vehicles that merged earlier increased from 53.49% to 65.36%. A benefit-cost ratio of around 5 or 10 resulted from this analysis of Missouri nighttime work zones and historical crash data; the two different benefit-cost ratios reflect two different ways of computing labor costs.
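The distribution-shift test mentioned above can be reproduced with the two-sample Kolmogorov-Smirnov statistic, which is just the largest gap between the two empirical CDFs. The sketch below uses synthetic speed samples (the study's field data are not available here), with the "after" sample shifted roughly 2 mph lower to mimic the reported effect.

```python
import bisect
import random

def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the two empirical cumulative distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # Fraction of observations <= x, via binary search.
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    # The supremum is attained at one of the observed data points.
    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in a + b)

# Synthetic speeds: 'after' shifted roughly 2 mph below 'before'
rng = random.Random(0)
before = [rng.gauss(55.0, 5.0) for _ in range(500)]
after = [rng.gauss(52.8, 5.0) for _ in range(500)]
d = ks_statistic(before, after)
```

In practice one would compare d against the KS critical value for the two sample sizes (or use a library routine such as SciPy's ks_2samp) to decide significance, as the study does.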

Relevance: 100.00%

Publisher:

Abstract:

Intracardiac organization indices such as the atrial fibrillation (AF) cycle length (AFCL) have been used to track the efficiency of stepwise catheter ablation (step-CA) of long-standing persistent AF, but with limited success. The morphology of AF activation waves reflects the underlying activation patterns. Its temporal evolution is a local organization indicator that could potentially be used for tracking the efficiency of step-CA. We report a new method for characterizing the structure of the temporal evolution of activation wave morphology. Using recurrence plots, novel organization indices are proposed. By computing their relative evolution during the first step of ablation versus baseline, we found that these new parameters are superior to the AFCL for tracking the effect of step-CA "en route" to AF termination.
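A recurrence plot of the kind used here compares every pair of observations in a series and marks the pairs whose distance falls below a threshold; simple organization indices such as the recurrence rate can then be read off the binary matrix. The sketch below is a generic, minimal construction: the scalar feature series and the threshold are illustrative stand-ins, not the paper's wave-morphology descriptors or indices.

```python
import math

def recurrence_matrix(series, eps):
    """Binary recurrence plot: R[i][j] = 1 when |x_i - x_j| < eps."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Fraction of recurrent point pairs, excluding the trivial diagonal."""
    n = len(R)
    off_diag = sum(R[i][j] for i in range(n) for j in range(n) if i != j)
    return off_diag / (n * (n - 1))

# Illustrative quasi-periodic stand-in for a wave-morphology feature series
x = [math.sin(2 * math.pi * k / 12) for k in range(120)]
R = recurrence_matrix(x, eps=0.2)
rr = recurrence_rate(R)
```

A more organized (more periodic) signal yields denser diagonal structure in R, which is the intuition behind tracking such indices during ablation.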

Relevance: 100.00%

Publisher:

Abstract:

The Phase I research, Iowa Department of Transportation (IDOT) Project HR-214, "Feasibility Study of Strengthening Existing Single Span Steel Beam Concrete Deck Bridges," verified that post-tensioning can be used to provide strengthening of the composite bridges under investigation. Phase II research, reported here, involved the strengthening of two full-scale prototype bridges - one a prototype of the model bridge tested during Phase I and the other larger and skewed. In addition to the field work, Phase II also involved a considerable amount of laboratory work. A literature search revealed that only minimal data existed on the angle-plus-bar shear connectors. Thus, several specimens utilizing angle-plus-bar, as well as channels, studs and high strength bolts as shear connectors were fabricated and tested. To obtain additional shear connector information, the bridge model of Phase I was sawed into four composite concrete slab and steel beam specimens. Two of the resulting specimens were tested with the original shear connection, while the other two specimens had additional shear connectors added before testing. Although orthotropic plate theory was shown in Phase I to predict vertical load distribution in bridge decks and to predict approximate distribution of post-tensioning for right-angle bridges, it was questioned whether the theory could also be used on skewed bridges. Thus, a small plexiglas model was constructed and used in vertical load distribution tests and post-tensioning force distribution tests for verification of the theory. Conclusions of this research are as follows: (1) The capacity of existing shear connectors must be checked as part of a bridge strengthening program. Determination of the concrete deck strength in advance of bridge strengthening is also recommended. (2) The ultimate capacity of angle-plus-bar shear connectors can be computed on the basis of a modified AASHTO channel connector formula and an angle-to-beam weld capacity check. 
(3) Existing shear connector capacity can be augmented by means of double-nut high strength bolt connectors. (4) Post-tensioning did not significantly affect truck load distribution for right angle or skewed bridges. (5) Approximate post-tensioning and truck load distribution for actual bridges can be predicted by orthotropic plate theory for vertical load; however, the agreement between actual distribution and theoretical distribution is not as close as that measured for the laboratory model in Phase I. (6) The right angle bridge exhibited considerable end restraint at what would be assumed to be simple support. The construction details at bridge abutments seem to be the reason for the restraint. (7) The skewed bridge exhibited more end restraint than the right angle bridge. Both skew effects and construction details at the abutments accounted for the restraint. (8) End restraint in the right angle and skewed bridges reduced tension strains in the steel bridge beams due to truck loading, but also reduced the compression strains caused by post-tensioning.

Relevance: 100.00%

Publisher:

Abstract:

Plants such as Arabidopsis thaliana respond to foliar shade and to neighbors that may become competitors for light resources by elongation growth, securing access to unfiltered sunlight. The challenges faced during this shade avoidance response (SAR) differ under a light-absorbing canopy and during neighbor detection, where light remains abundant. In both situations, elongation growth depends on auxin and on transcription factors of the phytochrome interacting factor (PIF) class. Using a computational modeling approach to study the SAR regulatory network, we identify and experimentally validate a previously unidentified role for long hypocotyl in far-red 1 (HFR1), a negative regulator of the PIFs. Moreover, we find that during neighbor detection, growth is promoted primarily by the production of auxin. In contrast, in true shade, the system operates with less auxin but with an increased sensitivity to the hormonal signal. Our data suggest that this latter signal is less robust, which may reflect a cost-to-robustness tradeoff, a system trait long recognized by engineers and forming the basis of information theory.

Relevance: 100.00%

Publisher:

Abstract:

This article studies the alterations in values, attitudes, and behaviors that emerged among U.S. citizens as a consequence of, and as a response to, the attacks of September 11, 2001. The study briefly examines the immediate reaction to the attack before focusing on the collective reactions that characterized the behavior of the majority of the population between the events of 9/11 and the response to them in the form of the intervention in Afghanistan. In studying this period, an eight-phase sequential model (Botcharova, 2001) is used, in which the initial phases center on the nation as the ingroup and the latter phases focus on the enemy who carried out the attack as the outgroup. The study is conducted from a psychosocial perspective and uses social identity theory (Tajfel & Turner, 1979, 1986) as the basic framework for interpreting and accounting for the collective reactions recorded. The main purpose of this paper is to show that the interpretation of these collective reactions is consistent with the postulates of social identity theory. The application of this theory provides a different and specific analysis of events. The study is based on data obtained from a variety of rigorous academic studies and opinion polls conducted in relation to the events of 9/11. In line with social identity theory, 9/11 had a marked impact on the importance attached by the majority of U.S. citizens to their identity as members of a nation. This in turn accentuated group differentiation and activated ingroup favoritism and outgroup discrimination (Tajfel & Turner, 1979, 1986). Ingroup favoritism strengthened group cohesion, feelings of solidarity, and identification with the most emblematic values of the U.S. nation, while outgroup discrimination induced U.S. citizens to conceive of the enemy (al-Qaeda and its protectors) as the incarnation of evil, depersonalizing the group and venting their anger on it, and to give their backing to a military response, the eventual intervention in Afghanistan. Finally, and also in line with the postulates of social identity theory, as an alternative to the virtual bipolarization of the conflict (U.S. versus al-Qaeda), the activation of a higher level of identity in the ingroup is proposed: a group that includes the United States and the largest possible number of countries, including Islamic states, in the search for a common, more legitimate, and effective solution.


Relevance: 100.00%

Publisher:

Abstract:

In the administration, planning, design, and maintenance of road systems, transportation professionals often need to choose between alternatives, justify decisions, evaluate tradeoffs, determine how much to spend, set priorities, assess how well the network meets traveler needs, and communicate the basis for their actions to others. A variety of technical guidelines, tools, and methods have been developed to help with these activities. Such work aids include design criteria guidelines, design exception analysis methods, needs studies, revenue allocation schemes, regional planning guides, designation of minimum standards, sufficiency ratings, management systems, point-based systems to determine eligibility for paving, functional classification, and bridge ratings. While such tools play valuable roles, they also manifest a number of deficiencies and are poorly integrated. Design guides tell what solutions MAY be used, but they are not oriented toward helping find which one SHOULD be used. Design exception methods help justify deviation from design guide requirements but omit consideration of important factors. Resource distribution is too often based on dividing up what is available rather than on determining how much should be spent. Point systems serve well as procedural tools but are employed primarily to justify decisions that have already been made. In addition, the tools are not very scalable: a system-level method of analysis seldom works at the project level and vice versa. In conjunction with the issues cited above, the operation and financing of the road and highway system is often the subject of criticisms that raise fundamental questions: What is the best way to determine how much money should be spent on a city's or a county's road network? Is the size and quality of the rural road system appropriate? Is too much or too little money spent on road work? What parts of the system should be upgraded and in what sequence?
Do truckers receive a hidden subsidy from other motorists? Do transportation professionals evaluate road situations from too narrow a perspective? In considering these issues and questions, the author concluded that it would be of value to identify and develop a new method that would overcome the shortcomings of existing methods, be scalable, be capable of being understood by the general public, and utilize a broad viewpoint. After trying out a number of concepts, it appeared that a good approach would be to view the road network as a sub-component of a much larger system that also includes vehicles, people, goods-in-transit, and all the ancillary items needed to make the system function. Highway investment decisions could then be made on the basis of how they affect the total cost of operating the total system. A concept, named the "Total Cost of Transportation" method, was then developed and tested. The concept rests on four key principles: 1) that roads are but one sub-system of a much larger 'Road Based Transportation System', 2) that the size and activity level of the overall system are determined by market forces, 3) that the sum of everything expended, consumed, given up, or permanently reserved in building the system and generating the activity that results from the market forces represents the total cost of transportation, and 4) that the economic purpose of making road improvements is to minimize that total cost. To test the practical value of the theory, a special database and spreadsheet model of Iowa's county road network was developed. This involved creating a physical model to represent the size, characteristics, activity levels, and the rates at which the activities take place, developing a companion economic cost model, and then using the two in tandem to explore a variety of issues. Ultimately, the theory and model proved capable of being used in full-system, partial-system, single-segment, project, and general design guide levels of analysis.
The method appeared capable of remedying many of the defects of existing work methods and of answering society's transportation questions from a new perspective.
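The fourth principle (pick the road improvement that minimizes the total system cost) can be sketched as a toy evaluation: each alternative's agency cost is added to the road-user costs it induces over the analysis period, and the alternative with the lowest total wins. Every number below is a hypothetical placeholder for illustration, not a value from the Iowa database or model, and discounting is omitted for brevity.

```python
def total_cost(agency_cost, daily_vehicles, user_cost_per_trip, years=20):
    """Total system cost = agency spending plus cumulative road-user cost
    over the analysis period (no discounting, for simplicity)."""
    user_cost = daily_vehicles * 365 * years * user_cost_per_trip
    return agency_cost + user_cost

# Hypothetical alternatives for one low-volume county road segment:
# up-front agency cost, and a per-trip user cost reflecting road condition.
alternatives = {
    "do nothing": total_cost(0, 200, 0.90),
    "resurface": total_cost(250_000, 200, 0.60),
    "full rebuild": total_cost(900_000, 200, 0.55),
}
best = min(alternatives, key=alternatives.get)
```

With these invented numbers, resurfacing wins: the rebuild's extra agency cost is not recovered through the marginally lower user cost, while doing nothing leaves users paying more than the resurfacing would cost. That trade-off is exactly what the total-cost viewpoint is meant to expose.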

Relevance: 100.00%

Publisher:

Abstract:

The article is concerned with the formal definition of a largely unnoticed factor in narrative structure. It rests on three assumptions: (1) the semantics of a written text depend, among other factors, directly on its visual arrangement in space; (2) the formal structure of a text has to match that of its spatial presentation; and (3) these assumptions also hold for narrative texts (which, however, in modern times typically conceal their spatial dimensions behind a low-key linear layout). On this basis it is argued that the expected material shape of a given narrative, however low-key, shapes how its author configures the plot. The 'implied book' thus denotes an author's historically plausible, not necessarily conscious, idea of how his text, which is still in the process of creation, will be materially presented and, under those circumstances, visually absorbed. Assuming that an author's anticipation of this later (potentially) realized material form influences composition, the implied book is to be understood as a structuring moment of the text, determined in the text's genesis. Reconstructed historically, it thus serves the methodical analysis of the structural characteristics of a completed text.

Relevance: 100.00%

Publisher:

Abstract:

Unlike the evaluation of single items of scientific evidence, the formal study and analysis of the joint evaluation of several distinct items of forensic evidence has to date received some punctual, rather than systematic, attention. Questions about (i) the relationships among a set of (usually unobservable) propositions and a set of (observable) items of scientific evidence, (ii) the joint probative value of a collection of distinct items of evidence, as well as (iii) the contribution of each individual item within a given group of pieces of evidence, still represent fundamental areas of research. To some degree, this is remarkable, since both forensic science theory and practice, as well as many daily inference tasks, require the consideration of multiple items if not masses of evidence. A recurrent and particular complication that arises in such settings is that the application of probability theory, i.e. the reference method for reasoning under uncertainty, becomes increasingly demanding. The present paper takes this as a starting point and discusses graphical probability models, i.e. Bayesian networks, as a framework within which the joint evaluation of scientific evidence can be approached in some viable way. Based on a review of the existing main contributions in this area, the article aims at presenting instances of real case studies from the author's institution in order to point out the usefulness and capacities of Bayesian networks for the probabilistic assessment of the probative value of multiple and interrelated items of evidence. A main emphasis is placed on underlying general patterns of inference, their representation, and their graphical probabilistic analysis. Attention is also drawn to inferential interactions, such as redundancy, synergy, and directional change. These distinguish the joint evaluation of evidence from assessments of isolated items of evidence. Together, these topics present aspects of interest to both domain experts and recipients of expert information, because they have a bearing on how multiple items of evidence are meaningfully and appropriately set into context.
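The simplest joint-evaluation structure can be made concrete with a two-node-of-evidence sketch: one hypothesis H and two items E1, E2 that are conditionally independent given H, so their likelihood ratios multiply in the odds form of Bayes' theorem. The numbers are invented for illustration; interacting items (the redundancy and synergy cases discussed above) would break this factorization and require the full joint tables a Bayesian network encodes.

```python
def posterior(prior_h, *likelihood_ratios):
    """Posterior probability of hypothesis H after combining items of
    evidence that are conditionally independent given H, via the
    odds form of Bayes' theorem (likelihood ratios multiply)."""
    odds = prior_h / (1.0 - prior_h)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1.0 + odds)

# Two illustrative items, each favoring H with a likelihood ratio of 10
p_one = posterior(0.01, 10.0)         # one item considered alone
p_both = posterior(0.01, 10.0, 10.0)  # joint evaluation of both items
```

Against a prior of 1%, one such item lifts the posterior to roughly 9%, while the two together lift it past 50%: each item's individual contribution is visible, which is point (iii) above in its simplest form.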

Relevance: 100.00%

Publisher:

Abstract:

Child sexual abuse is associated with problems in children's emotional development, particularly increased insecurity of attachment. However, few studies have examined its effect on the organization of attachment representations in preschoolers, and the findings of those that have been conducted have not been entirely consistent. Therefore, this study aims to analyze the effect of child sexual abuse on attachment representation quality in a sample of children 3 to 7 years old in Chile. The results indicate child sexual abuse does affect children's attachment representation quality. The attachment narratives of child sexual abuse victims scored significantly higher than nonvictims on the hyperactivity and disorganization dimensions of attachment. These results are discussed in terms of attachment theory, clinical findings on child sexual abuse, and clinical implications.

Relevance: 100.00%

Publisher:

Abstract:

We experimentally question the assertion of Prospect Theory that people display risk attraction in choices involving high-probability losses. Indeed, our experimental participants tend to avoid fair risks for large (up to €90), high-probability (80%) losses. Our research hinges on a novel experimental method designed to alleviate the house-money bias that pervades experiments with real (not hypothetical) losses. Our results vindicate Daniel Bernoulli's view that risk aversion is the dominant attitude. But, contrary to the Bernoulli-inspired canonical expected utility theory, we do find frequent risk attraction for small amounts of money at stake. In any event, we attempt neither to test expected utility versus nonexpected utility theories, nor to contribute to the important literature that estimates value and weighting functions. The question we ask is more basic, namely: do people display risk aversion when facing large losses, or large gains? And, at the risk of oversimplifying, our answer is yes.
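The choice being studied can be written down directly: a sure loss versus a fair gamble with the same expected loss. Under a concave utility of final wealth (the Bernoulli view), the sure loss is preferred; prospect-theoretic risk attraction over losses would predict the opposite. The sketch below checks the expected-utility side of that comparison, using an invented wealth level and a square-root utility purely as the concave example, not as anything estimated in the paper.

```python
import math

def eu_sure_loss(wealth, loss, u=math.sqrt):
    """Expected utility of accepting the loss for sure."""
    return u(wealth - loss)

def eu_fair_gamble(wealth, loss, p, u=math.sqrt):
    """Expected utility of a fair gamble: lose loss/p with probability p,
    lose nothing otherwise, so the expected loss equals the sure loss."""
    return p * u(wealth - loss / p) + (1.0 - p) * u(wealth)

# High-probability loss: p = 0.8, expected loss of 90 from a wealth of 200
sure = eu_sure_loss(200.0, 90.0)
gamble = eu_fair_gamble(200.0, 90.0, 0.8)
```

With any strictly concave utility, Jensen's inequality makes the sure loss the better option (sure > gamble here), which is the behavior the participants displayed for large, high-probability losses.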