990 results for Limit State Functions
Abstract:
This work is supported by Bulgarian NFSI, grant No. MM–704/97
Abstract:
The first motivation for this note is to obtain a general version of the following result: let E be a Banach space and f : E → R be a differentiable function, bounded below and satisfying the Palais-Smale condition; then, f is coercive, i.e., f(x) goes to infinity as ||x|| goes to infinity. In recent years, many variants and extensions of this result have appeared; see [3], [5], [6], [9], [14], [18], [19] and the references therein. A general result of this type was given in [3, Theorem 5.1] for a lower semicontinuous function defined on a Banach space, through an approach based on an abstract notion of subdifferential operator and taking into account the “smoothness” of the Banach space. Here, we give (Theorem 1) an extension in a metric setting, based on the notion of slope from [11], with coercivity considered in a generalized sense, inspired by [9]; our result allows us to recover, for example, the coercivity result of [19], where a weakened version of the Palais-Smale condition is used. Our main tool (Proposition 1) is a consequence of Ekeland’s variational principle extending [12, Corollary 3.4], and deals with a function f which is, in some sense, the “uniform” Γ-limit of a sequence of functions.
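The classical statement being generalized can be written out explicitly; a LaTeX sketch, with notation exactly as in the abstract:

```latex
% Classical coercivity result (the starting point of the note)
\begin{theorem}
Let $E$ be a Banach space and let $f \colon E \to \mathbb{R}$ be a
differentiable function that is bounded below and satisfies the
Palais--Smale condition. Then $f$ is coercive, i.e.
\[
  f(x) \to +\infty \qquad \text{as } \|x\| \to \infty .
\]
\end{theorem}
```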
Abstract:
2000 Mathematics Subject Classification: Primary 47A48, Secondary 60G12
Abstract:
Maroussia N. Slavtchova-Bojkova - In the present work a limit theorem for a subcritical multidimensional age-dependent branching process with two types of immigration is generalized. The aim is to generalize the analogous result in the one-dimensional case by applying the coupling method, renewal theory, and regenerative processes.
Abstract:
AMS Subject Classification 2010: 41A25, 41A27, 41A35, 41A36, 41A40, 42A16, 42A85.
Abstract:
Governmental accountability is the requirement of government entities to be accountable to the citizenry in order to justify the raising and expenditure of public resources. The concept of service efforts and accomplishments measurement for government programs was introduced by the Governmental Accounting Standards Board (GASB) in Service Efforts and Accomplishments Reporting: Its Time Has Come (1990). This research tested the feasibility of implementing the concept for the Federal-aid highway construction program and identified factors affecting implementation with a case study of the District of Columbia. Changes in condition and performance ratings for specific highway segments in 15 projects, before and after construction expenditures, were evaluated using data provided by the Federal Highway Administration. The results of the evaluation indicated difficulty in drawing conclusions on the state program performance as a whole. The state program reflects problems within the Federally administered program that severely limit implementation of outcome-oriented performance measurement. Major problems identified with data acquisition are data reliability, availability, compatibility, and consistency among states. Other significant factors affecting implementation are institutional and political barriers. Institutional issues in the Federal Highway Administration include the lack of integration of the project-specific fiscal database with the Highway Performance Monitoring System database. The Federal Highway Administration has the ability to resolve both of these data problems; however, interviews with key Federal informants indicate this will not occur without external directives and changes to the Federal “stewardship” approach to program administration.
The findings indicate many issues must be resolved for successful implementation of outcome-oriented performance measures in the Federal-aid construction program. The issues are organizational and political in nature; however, in the current environment resolution is possible. Additional research is desirable and would be useful in overcoming the obstacles to successful implementation.
Abstract:
Crash reduction factors (CRFs) are used to estimate the number of traffic crashes expected to be prevented by investment in safety improvement projects. The method used to develop CRFs in Florida has been based on the commonly used before-and-after approach, which suffers from a widely recognized problem known as regression-to-the-mean (RTM). The Empirical Bayes (EB) method has been introduced as a means of addressing the RTM problem. This method requires information from both the treatment and reference sites in order to predict the expected number of crashes had the safety improvement projects at the treatment sites not been implemented. The information from the reference sites is estimated from a safety performance function (SPF), a mathematical relationship that links crashes to traffic exposure. The objective of this dissertation was to develop SPFs for the different functional classes of the Florida State Highway System. Crash data from years 2001 through 2003, along with traffic and geometric data, were used in the SPF model development. SPFs for both rural and urban roadway categories were developed. The modeling data were based on one-mile segments with homogeneous traffic and geometric conditions within each segment; segments involving intersections were excluded. Scatter plots of the data show that the relationships between crashes and traffic exposure are nonlinear: crashes increase with traffic exposure at an increasing rate. Four regression models, namely Poisson (PRM), Negative Binomial (NBRM), zero-inflated Poisson (ZIP), and zero-inflated Negative Binomial (ZINB), were fitted to the one-mile segment records for individual roadway categories. The best model was selected for each category based on a combination of the likelihood ratio test, the Vuong statistical test, and Akaike's Information Criterion (AIC).
The NBRM was found to be appropriate for only one category, and the ZINB model was found to be more appropriate for six other categories. The overall results show that the Negative Binomial model generally provides a better fit to the data than the Poisson model. In addition, the ZINB model was found to give the best fit when the count data exhibit excess zeros and over-dispersion, as was the case for most of the roadway categories. While model validation shows that most data points fall within the 95% prediction intervals of the models developed, the Pearson goodness-of-fit measure does not show statistical significance. This is expected, as traffic volume is only one of the many factors contributing to the overall crash experience, and the SPFs are to be applied in conjunction with Accident Modification Factors (AMFs) to further account for the safety impacts of major geometric features before arriving at the final crash prediction. However, with improved traffic and crash data quality, the crash prediction power of SPF models may be further improved.
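The SPF's core ingredient, a nonlinear crash-exposure relationship, can be sketched in a few lines. The dissertation fits Poisson and negative-binomial count models by maximum likelihood; as a simplified stand-in (assuming the common power-function SPF form E[crashes] = exp(b0) · AADT^b1 and strictly positive crash counts so that logs are defined), a log-linear least-squares fit looks like:

```python
import numpy as np

def fit_spf(aadt, crashes):
    """Fit the common SPF form  E[crashes] = exp(b0) * AADT**b1
    by ordinary least squares on the log-log scale.

    Simplified sketch only: the dissertation fits PRM/NBRM/ZIP/ZINB
    count models by maximum likelihood, which handle zero counts and
    over-dispersion properly.
    """
    X = np.column_stack([np.ones_like(aadt), np.log(aadt)])
    y = np.log(crashes)
    b0, b1 = np.linalg.lstsq(X, y, rcond=None)[0]
    return b0, b1

def predict_spf(aadt, b0, b1):
    """Predicted crash frequency for a segment with the given AADT."""
    return np.exp(b0) * aadt ** b1
```

An exponent b1 greater than one reproduces the "increasing at an increasing rate" shape seen in the scatter plots; on real segment data a proper count model (NBRM or ZINB, as in the dissertation) should be used instead.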
Abstract:
In response to a crime epidemic afflicting Latin America since the early 1990s, several countries in the region have resorted to using heavy-force police or military units to physically retake territories de facto controlled by non-State criminal or insurgent groups. After a period of territory control, the heavy forces hand law enforcement functions in the retaken territories over to regular police officers, with the hope that the territories and their populations will remain under the control of the state. To varying degrees of intensity and consistency, Brazil, Colombia, Mexico, and Jamaica have adopted such policies since the mid-1990s. During such operations, governments need to pursue two interrelated objectives: to better establish the state’s physical presence, and to realign the allegiance of the population in those areas toward the state and away from the non-State criminal entities. From the perspective of law enforcement, such operations entail several critical decisions and junctures, such as whether or not to announce the force insertion in advance. The decision trades off the element of surprise and the ability to capture key leaders of the criminal organizations against the ability to minimize civilian casualties and force levels; announcing in advance, however, may allow criminals to go to ground and escape capture. Governments thus must decide whether they merely seek to displace criminal groups to other areas or to maximize their decapitation capacity. Intelligence flows rarely come from the population. Often, rival criminal groups are the best source of intelligence. However, cooperation between the State and such groups that goes beyond using vetted intelligence provided by the groups, such as State tolerance for militias, compromises the rule-of-law integrity of the State and ultimately can eviscerate even public safety gains. Sustaining security after initial clearing operations is at times even more challenging than conducting the initial operations.
Although, unlike the heavy forces, traditional police forces, especially if designed as community police, have the capacity to develop the trust of the community and ultimately focus on crime prevention, developing such trust often takes a long time. To develop the community’s trust, regular police forces need to conduct frequent on-foot patrols with intensive nonthreatening interactions with the population and minimize the use of force. Moreover, sufficiently robust patrol units need to be placed in designated beats for a substantial amount of time, often at least a year. Establishing oversight mechanisms, including joint police-citizens’ boards, further facilitates building trust in the police among the community. After disruption of the established criminal order, street crime often rises significantly, and both the heavy-force and community-police units often struggle to contain it. The increase in street crime alienates the population of the retaken territory from the State; thus, developing a capacity to address street crime is critical. Moreover, the community police units tend to be vulnerable, especially initially, to efforts by displaced criminals to reoccupy the cleared territories. Losing a cleared territory back to criminal groups is extremely costly in terms of losing any established trust and being able to recover it. Rather than operating on an a priori determined handover schedule, a careful assessment of the relative strength of regular police and criminal groups after clearing operations is likely to be a better guide for timing the handover from heavy forces to regular police units. Cleared territories often experience not only a peace dividend, but also a peace deficit: a rise in new serious crime (in addition to street crime). Newly valuable land and other previously inaccessible resources can lead to land speculation and forced displacement, and various other forms of new crime can also significantly rise.
Community police forces often struggle to cope with such crime, especially as it is frequently linked to legal businesses. Such new crime often receives little to no attention in the design of operations to retake territories from criminal groups. But without developing an effective response to such new crime, the public safety gains of the clearing operations can be altogether lost.
Abstract:
In 2010, the American Association of State Highway and Transportation Officials (AASHTO) released a safety analysis software system known as SafetyAnalyst. SafetyAnalyst implements the empirical Bayes (EB) method, which requires the use of Safety Performance Functions (SPFs). The system is equipped with a set of national default SPFs, and the software calibrates the default SPFs to represent the agency's safety performance. However, it is recommended that agencies generate agency-specific SPFs whenever possible. Many investigators support the view that the agency-specific SPFs represent the agency data better than the national default SPFs calibrated to agency data. Furthermore, it is believed that the crash trends in Florida are different from the states whose data were used to develop the national default SPFs. In this dissertation, Florida-specific SPFs were developed using the 2008 Roadway Characteristics Inventory (RCI) data and crash and traffic data from 2007-2010 for both total and fatal and injury (FI) crashes. The data were randomly divided into two sets, one for calibration (70% of the data) and another for validation (30% of the data). The negative binomial (NB) model was used to develop the Florida-specific SPFs for each of the subtypes of roadway segments, intersections and ramps, using the calibration data. Statistical goodness-of-fit tests were performed on the calibrated models, which were then validated using the validation data set. The results were compared in order to assess the transferability of the Florida-specific SPF models. The default SafetyAnalyst SPFs were calibrated to Florida data by adjusting the national default SPFs with local calibration factors. The performance of the Florida-specific SPFs and SafetyAnalyst default SPFs calibrated to Florida data were then compared using a number of methods, including visual plots and statistical goodness-of-fit tests. 
The plots of SPFs against the observed crash data were used to compare the prediction performance of the two models. Three goodness-of-fit tests, represented by the mean absolute deviance (MAD), the mean square prediction error (MSPE), and Freeman-Tukey R2 (R2FT), were also used for comparison in order to identify the better-fitting model. The results showed that Florida-specific SPFs yielded better prediction performance than the national default SPFs calibrated to Florida data. The performance of Florida-specific SPFs was further compared with that of the full SPFs, which include both traffic and geometric variables, in two major applications of SPFs, i.e., crash prediction and identification of high crash locations. The results showed that both SPF models yielded very similar performance in both applications. These empirical results support the use of the flow-only SPF models adopted in SafetyAnalyst, which require much less effort to develop compared to full SPFs.
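The three comparison measures named above are straightforward to compute. A minimal sketch; the Freeman-Tukey R² here follows the standard variance-stabilized residual definition, which is an assumption about the exact variant used in the dissertation:

```python
import numpy as np

def mad(obs, pred):
    """Mean absolute deviance: average |observed - predicted|."""
    return np.mean(np.abs(obs - pred))

def mspe(obs, pred):
    """Mean square prediction error."""
    return np.mean((obs - pred) ** 2)

def r2_ft(obs, pred):
    """Freeman-Tukey R^2, built from variance-stabilized residuals
    f_i = sqrt(y_i) + sqrt(y_i + 1) and e_i = f_i - sqrt(4*yhat_i + 1)."""
    f = np.sqrt(obs) + np.sqrt(obs + 1)
    e = f - np.sqrt(4 * pred + 1)
    return 1 - np.sum(e ** 2) / np.sum((f - f.mean()) ** 2)
```

Lower MAD and MSPE, and higher R²_FT, indicate the better-fitting SPF, which is how the Florida-specific and calibrated national default models would be ranked.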
Abstract:
It was recently shown [Phys. Rev. Lett. 110, 227201 (2013)] that the critical behavior of the random-field Ising model in three dimensions is ruled by a single universality class. This conclusion was reached only after a proper taming of the large scaling corrections of the model by applying a combined approach of various techniques coming from the zero- and positive-temperature toolboxes of statistical physics. In the present contribution we provide a detailed description of this combined scheme, explaining in detail the zero-temperature numerical scheme and developing the generalized fluctuation-dissipation formula that allowed us to compute connected and disconnected correlation functions of the model. We discuss the error evolution of our method and we illustrate the infinite-size extrapolation of several observables within phenomenological renormalization. We present an extension of the quotients method that allows us to obtain estimates of the critical exponent α of the specific heat of the model via the scaling of the bond energy, and we discuss the self-averaging properties of the system and the algorithmic aspects of the maximum-flow algorithm used.
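The zero-temperature part of the scheme amounts to computing exact ground states of the random-field Ising Hamiltonian E(s) = -J Σ_&lt;ij&gt; s_i s_j - Σ_i h_i s_i. The study does this with a polynomial-time maximum-flow algorithm; as a conceptual sketch of what that computation returns (brute force, so feasible only for a handful of spins):

```python
import itertools

def rfim_ground_state(bonds, fields, J=1.0):
    """Exhaustively find the ground state of a random-field Ising model
    E(s) = -J * sum_<ij> s_i s_j - sum_i h_i s_i,  s_i in {-1, +1}.

    `bonds` is a list of (i, j) site pairs and `fields` the local random
    fields h_i.  Brute force over all 2^N spin configurations; the actual
    study uses an equivalent max-flow/min-cut formulation that scales to
    very large lattices.
    """
    n = len(fields)
    best_e, best_s = float("inf"), None
    for s in itertools.product((-1, 1), repeat=n):
        e = -J * sum(s[i] * s[j] for i, j in bonds)
        e -= sum(h * si for h, si in zip(fields, s))
        if e < best_e:
            best_e, best_s = e, s
    return best_e, best_s
```

Because the ground-state problem maps onto a min-cut, the max-flow approach returns the same configuration as this exhaustive search, but in polynomial time.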
Abstract:
Limit-periodic (LP) structures exhibit a type of nonperiodic order yet to be found in a natural material. A recent result in tiling theory, however, has shown that LP order can spontaneously emerge in a two-dimensional (2D) lattice model with nearest- and next-nearest-neighbor interactions. In this dissertation, we explore the question of what types of interactions can lead to a LP state and address the issue of whether the formation of a LP structure in experiments is possible. We study the emergence of LP order in three-dimensional (3D) tiling models and bring the subject into the physical realm by investigating systems with realistic Hamiltonians and low-energy LP states. Finally, we present studies of the vibrational modes of a simple LP ball-and-spring model whose results indicate that LP materials would exhibit novel physical properties.
A 2D lattice model defined on a triangular lattice with nearest- and next-nearest-neighbor interactions based on the Taylor-Socolar (TS) monotile is known to have a LP ground state. The system reaches that state during a slow quench through an infinite sequence of phase transitions. Surprisingly, even when the strength of the next-nearest-neighbor interactions is zero, in which case there is a large degenerate class of both crystalline and LP ground states, a slow quench yields the LP state. The first study in this dissertation introduces 3D models closely related to the 2D models that exhibit LP phases. The particular 3D models were designed such that next-nearest-neighbor interactions of the TS type are implemented using only nearest-neighbor interactions. For one of the 3D models, we show that the phase transitions are first order, with equilibrium structures that can be more complex than in the 2D case.
In the second study, we investigate systems with physical Hamiltonians based on one of the 2D tiling models with the goal of stimulating attempts to create a LP structure in experiments. We explore physically realizable particle designs while being mindful of particular features that may make the assembly of a LP structure in an experimental system difficult. Through Monte Carlo (MC) simulations, we have found that one particle design in particular is a promising template for a physical particle; a 2D system of identical disks with embedded dipoles is observed to undergo the series of phase transitions which leads to the LP state.
LP structures are well ordered but nonperiodic, and hence have nontrivial vibrational modes. In the third section of this dissertation, we study a ball and spring model with a LP pattern of spring stiffnesses and identify a set of extended modes with arbitrarily low participation ratios, a situation that appears to be unique to LP systems. The balls that oscillate with large amplitude in these modes live on periodic nets with arbitrarily large lattice constants. By studying periodic approximants to the LP structure, we present numerical evidence for the existence of such modes, and we give a heuristic explanation of their structure.
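The ball-and-spring calculation described above can be illustrated in one dimension: build the dynamical matrix of a chain whose spring stiffnesses follow a limit-periodic sequence, diagonalize it, and measure each mode's participation ratio. The dissertation's own model and LP pattern differ; the period-doubling substitution sequence used here is an assumed stand-in for a 1D LP stiffness pattern:

```python
import numpy as np

def period_doubling(n):
    """First n terms of the period-doubling sequence, a limit-periodic
    0/1 sequence generated by the substitution 0 -> 01, 1 -> 00."""
    s = [0]
    while len(s) < n:
        s = [x for b in s for x in ((0, 1) if b == 0 else (0, 0))]
    return s[:n]

def chain_modes(stiffnesses):
    """Eigenmodes of a free chain of unit masses: D[i,i] = sum of the
    adjacent spring constants, D[i,i+1] = -k_i.  Returns (omega^2, modes)
    with eigenvalues in ascending order."""
    n = len(stiffnesses) + 1
    D = np.zeros((n, n))
    for i, k in enumerate(stiffnesses):
        D[i, i] += k
        D[i + 1, i + 1] += k
        D[i, i + 1] = D[i + 1, i] = -k
    return np.linalg.eigh(D)

def participation_ratio(mode):
    """p = 1 / (N * sum u_i^4) for a normalized mode: ~1 for extended
    modes, ~1/N for modes localized on a single ball."""
    u = mode / np.linalg.norm(mode)
    return 1.0 / (len(u) * np.sum(u ** 4))
```

Mapping sequence entries 0/1 to two stiffness values k_a, k_b gives an LP chain; scanning participation_ratio over the modes of successively larger periodic approximants is the kind of numerical evidence described in the text.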
Abstract:
Ocean acidification (OA) is predicted to play a major role in shaping species biogeography and marine biodiversity over the next century. We tested the effect of medium-term exposure to OA (pH 8.00, 7.30, and 6.70 for 30 d) on acid-base balance in the decapod crab Necora puber, a species known to possess good extracellular buffering ability during short-term exposure to hypercapnic conditions. To determine whether crabs undergo physiological trade-offs in order to buffer their haemolymph, we characterised a number of fundamental physiological functions, i.e. metabolic rate, tolerance to heat, carapace and chelae [Ca2+] and [Mg2+], haemolymph [Ca2+] and [Mg2+], and immune response in terms of lipid peroxidation. Necora puber was able to buffer changes to extracellular pH over 30 d of exposure to hypercapnic water, with no evidence of net shell dissolution, thus demonstrating that HCO3- is actively taken up from the surrounding water. In addition, tolerance to heat, carapace mineralization, and aspects of immune response were not affected by hypercapnic conditions. In contrast, whole-animal O2 uptake significantly decreased with hypercapnia, while significant increases in haemolymph [Ca2+] and [Mg2+] and chelae [Mg2+] were observed with hypercapnia. Our results confirm that most physiological functions in N. puber are resistant to low pH/hypercapnia over a longer period than previously investigated, although such resistance comes at the expense of metabolic rate, haemolymph chemistry, and chelae mineralization.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The Iowa State Planning Board is a fact-finding group; these reports on the functions of the various organizations have been gathered and are presented with the purpose of showing the extent of their activities in the state of Iowa. This is a summary of the activities, programs, policies, and progress of the federal agencies interested in housing, taken as a brief from reports included in "Housing Officials' Yearbook - 1935", published by the National Association of Housing.
Abstract:
This thesis examines the role of market power in the banking market. The emphasis is on risk-taking, economies of scale, the economic efficiency of the market, and the transmission of shocks. The first chapter presents a dynamic stochastic general equilibrium model of an open economy with a monopolistically competitive banking market. Following Krugman's (1979, 1980) hypothesis on the relationship between economies of scale and exports, banks must pay a transaction cost to trade abroad that decreases as the volume of their domestic activities grows. This gives banks an incentive to reduce their domestic margin in order to benefit more from the foreign market. The model is solved and simulated for various degrees of concentration in the banking market. The results indicate that two opposing forces, economies of scale and market power, confront each other as the market becomes concentrated. Concentration also allows banks to expand their foreign activities, which in turn makes them more vulnerable to external shocks. The second chapter develops a similar framework in which banks face credit risk. That risk is partially insured by collateral posted by entrepreneurs and can be limited through financial effort. The model is solved and simulated for various degrees of concentration in the banking market. The results show that greater market power reduces the size of the financial market and of steady-state output, but induces banks to take fewer risks. Moreover, economies with a highly concentrated banking market are less sensitive to certain shocks, since higher margins initially give banks room to maneuver in the event of negative shocks.
This moderating effect disappears when banks can freely enter and exit the market. Another extension, with economies of scale, shows that under certain conditions a moderately concentrated market is optimal for the economy. The third chapter uses a mean-variance portfolio model to represent a bank with market power. The return on deposits and assets can vary with the quantity traded, which modifies the bank's portfolio choice. The bank tends to choose a portfolio with lower variance when it is able to obtain a higher return on an asset. Market power over deposits yields a similar result for moderate market power, but the variance eventually increases once a certain level is reached. The results are robust to different demand functions.
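The third chapter's mean-variance setup can be caricatured with a single risky asset whose expected return depends on the quantity the bank trades, a stylized stand-in for market power (the function names and the linear quantity-impact form are assumptions, not the thesis's actual model):

```python
def optimal_position(r0, price_impact, gamma, sigma2):
    """Mean-variance optimal holding q of a risky asset whose expected
    return falls with the quantity traded, r(q) = r0 - price_impact * q.
    The bank maximizes  r(q)*q - (gamma/2)*sigma2*q**2,  whose first-order
    condition gives  q* = r0 / (2*price_impact + gamma*sigma2).
    """
    return r0 / (2 * price_impact + gamma * sigma2)

def portfolio_variance(q, sigma2):
    """Variance of the resulting one-asset portfolio."""
    return sigma2 * q ** 2
```

In this sketch a larger quantity impact shrinks the optimal position and hence the portfolio's variance, illustrating qualitatively how market power can alter the variance of the bank's chosen portfolio.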