936 results for Natural Catastrophe, Property Insurance, Loss Distribution, Truncated Data, Ruin Probability
Abstract:
In previous Statnotes, many of the statistical tests described rely on the assumption that the data are a random sample from a normal or Gaussian distribution. These include most of the tests in common usage, such as the ‘t’ test, the various types of analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). In microbiology research, however, not all variables can be assumed to follow a normal distribution. Yeast populations, for example, are a notable feature of freshwater habitats, representatives of over 100 genera having been recorded. Most common are the ‘red yeasts’, such as Rhodotorula, Rhodosporidium, and Sporobolomyces, and ‘black yeasts’, such as Aureobasidium pullulans, together with species of Candida. Despite the abundance of genera and species, the overall density of an individual species in freshwater is likely to be low, and hence samples taken from such a population will contain very low numbers of cells. A rare organism living in an aquatic environment may be distributed more or less at random in a volume of water, and therefore counts from samples of such an environment are more likely to follow the Poisson distribution than the normal distribution. The Poisson distribution was named after the French mathematician Siméon Poisson (1781-1840) and has many applications in biology, especially in describing rare or randomly distributed events, e.g., the number of mutations in a given sequence of DNA after exposure to a fixed amount of radiation, or the number of cells infected by a virus at a fixed level of exposure. This Statnote describes how to fit the Poisson distribution to counts of yeast cells in samples taken from a freshwater lake.
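A minimal sketch of the fitting procedure the Statnote describes, using invented per-sample counts (the maximum-likelihood estimate of the Poisson mean is simply the sample mean):

```python
import math
from collections import Counter

def poisson_pmf(k: int, lam: float) -> float:
    """P(X = k) for a Poisson distribution with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

# Hypothetical counts of yeast cells per sample volume (illustrative only).
counts = [0, 1, 0, 2, 1, 0, 0, 1, 3, 0, 1, 0, 2, 0, 1, 0, 0, 1, 0, 2]

# MLE of the Poisson mean is the sample mean.
lam_hat = sum(counts) / len(counts)

# Compare observed frequencies against those expected under the fitted model.
observed = Counter(counts)
n = len(counts)
for k in range(max(counts) + 1):
    expected = n * poisson_pmf(k, lam_hat)
    print(f"k={k}: observed {observed.get(k, 0):2d}, expected {expected:5.2f}")
```

A formal goodness-of-fit test (e.g. chi-square on the observed vs. expected frequencies) would follow the same tabulation.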
Abstract:
In recent years, large-scale natural disasters (e.g. the 2004 tsunami, the 2005 earthquake in South Asia, the 2010 earthquake in Haiti, the 2010 floods in Pakistan, and the 2011 earthquake in Japan) have captured international attention and advanced research on disaster management. Coping with such high-impact disasters depends on how quickly and efficiently relief organisations are able to respond. After a disaster strikes, relief aid must reach the affected people through the prompt action of relief organisations, so this supply chain process has to be very fast and efficient. The purpose of this paper is to define last mile relief distribution in the humanitarian supply chain and to develop a logistical framework by identifying the factors that affect this process. Seventeen interviews were conducted with field officers, and the data were analysed to identify the critical factors for last mile relief distribution in disaster relief operations. A framework is presented that classifies these factors according to the ability to implement them in an optimisation model of humanitarian logistics.
Abstract:
The first demonstration of a hollow core photonic bandgap fiber (HC-PBGF) suitable for high-rate data transmission in the 2 μm waveband is presented. The fiber has a record low loss for this wavelength region (4.5 dB/km at 1980 nm) and a >150 nm wide surface-mode-free transmission window at the center of the bandgap. Detailed analysis of the optical modes and their propagation along the fiber, carried out using a time-of-flight technique in conjunction with spatially and spectrally resolved (S²) imaging, provides clear evidence that the HC-PBGF can be operated as quasi-single mode even though it supports up to four mode groups. Through the use of a custom-built thulium-doped fiber amplifier with gain bandwidth closely matched to the fiber's low loss window, error-free 8 Gbit/s transmission in an optically amplified data channel at 2008 nm over 290 m of 19 cell HC-PBGF is reported. © 2013 Optical Society of America.
Abstract:
Worldwide, floods have become one of the costliest weather-related hazards, causing large-scale human, economic, and environmental damage in the recent past. Recent years have seen a large number of such flood events around the globe, with Europe and the United Kingdom being no exception. Currently, about one in six properties in England is at risk of flooding (EA, 2009), and the risk is expected to increase further in the future (Evans et al., 2004). Although public spending on community-level flood protection has increased and some properties are protected by such schemes, many properties at risk of flooding may still be left without adequate protection. As far as businesses are concerned, this has led to an increased need for implementing strategies for property-level flood protection and business continuity, in order to improve their capacity to survive a flood hazard. Small and medium-sized enterprises (SMEs) constitute a significant portion of the UK business community. In the United Kingdom, more than 99% of private sector enterprises fall within the category of SMEs (BERR, 2008). They account for more than half of employment creation (59%) and turnover generation (52%) (BERR, 2008), and are thus considered the backbone of the UK economy. However, they are often affected disproportionately by natural hazards when compared with their larger counterparts (Tierney and Dahlhamer, 1996; Webb, Tierney, and Dahlhamer, 2000; Alesch et al., 2001) due to their increased vulnerability. Previous research reveals that small businesses are not adequately prepared to cope with the risk of natural hazards and to recover following such events (Tierney and Dahlhamer, 1996; Alesch et al., 2001; Yoshida and Deyle, 2005; Crichton, 2006; Dlugolecki, 2008). For instance, 90% of small businesses do not have adequate insurance coverage for their property (AXA Insurance UK, 2008) and only about 30% have a business continuity plan (Woodman, 2008).
Inadequate protection by community-level flood defences, as well as by property- and business-level protection measures, threatens the survival of SMEs, especially those located in flood risk areas. This chapter discusses the potential effects of flood hazards on SMEs and the coping strategies that SMEs can undertake to ensure the continuity of their business activities amid flood events. It contextualizes this discussion within a survey conducted under the Engineering and Physical Sciences Research Council (EPSRC) funded research project entitled “Community Resilience to Extreme Weather (CREW)”.
Abstract:
The longitudinal distribution of the Stokes-component power in a Raman fibre laser with random distributed feedback and unidirectional pumping is measured. The fibre parameters (linear loss and Rayleigh backscattering coefficient) are calculated from the distributions obtained. A numerical model is developed to describe the lasing power distribution, and the simulation results are in good agreement with the experimental data. © 2012 Kvantovaya Elektronika and Turpion Ltd.
Abstract:
Alzheimer's disease (AD) is an important neurodegenerative disorder causing visual problems in the elderly population. The pathology of AD includes the deposition in the brain of abnormal aggregates of β-amyloid (Aβ) in the form of senile plaques (SP) and abnormally phosphorylated tau in the form of neurofibrillary tangles (NFT). A variety of visual problems have been reported in patients with AD, including loss of visual acuity (VA), colour vision and visual fields; changes in pupillary responses to mydriatics; defects in fixation and in smooth and saccadic eye movements; changes in contrast sensitivity and in visual evoked potentials (VEP); and disturbances in complex visual tasks such as reading, visuospatial function, and the naming and identification of objects. In addition, pathological changes have been observed to affect the eye, visual pathway, and visual cortex in AD. To better understand degeneration of the visual cortex in AD, the laminar distribution of the SP and NFT was studied in visual areas V1 and V2 in 18 cases of AD that varied in disease onset and duration. In area V1, the mean density of SP and NFT reached a maximum in lamina III and in laminae II and III, respectively. In V2, mean SP density was maximal in laminae III and IV, and NFT density in laminae II and III. The densities of SP in lamina I of V1 and NFT in lamina IV of V2 were negatively correlated with patient age. No significant correlations were observed in any cortical lamina between the density of NFT and disease onset or duration. However, in area V2, the densities of SP in lamina II and lamina V were negatively correlated with disease duration and disease onset, respectively. In addition, there were several positive correlations between the densities of SP and NFT in V1 and those in area V2.
The data suggest: (1) NFT pathology is greater in area V2 than V1, (2) laminae II/III of V1 and V2 are most affected by the pathology, (3) the formation of SP and NFT in V1 and V2 are interconnected, and (4) the pathology may spread between visual areas via the feed-forward short cortico-cortical connections. © 2012 by Nova Science Publishers, Inc. All rights reserved.
Abstract:
Significant numbers of homes within the UK are at risk of flooding. Although community-level flood protection schemes are the first line of defence for mitigating flood risk, not all properties can be protected by them. Property-Level Flood Protection (PLFP) offers otherwise unprotected homeowners an approach for protecting their homes from flooding. This study sought to establish why property-level flood protection is needed and to assess the extent of its take-up, using Worcester as the study area. An exploratory questionnaire survey was conducted to achieve these objectives. A review of the available literature established that PLFP provides numerous benefits, including limiting the health and psychological effects of flooding, direct financial benefits, and a possible positive influence on obtaining flood insurance. Despite these benefits, and the recognition given to PLFP by the government, the overall take-up of the measures was found to be low, a finding backed up by the data collected in Worcester, where only 23% of the sample had introduced PLFP measures. Reasons for the low take-up typically included unawareness of the measures, perceived low risk of a flood event, installation costs, and inability to install measures due to tenancy. Age was a significant factor in the study area: none of the respondents under 25 reported having “a good amount of knowledge of PLFP measures”, even when they claimed their properties were at risk of flooding. Guidance and support are especially recommended for those who are unable to manage their own flood risk, e.g. social housing and rental tenants.
Abstract:
Nowadays, financial institutions, due to regulation and internal motivations, attend more intensively to their risks. Besides the previously dominant market and credit risks, a new trend is to handle operational risk systematically. Operational risk is the risk of loss resulting from inadequate or failed internal processes, people and systems, or from external events. First we present the basic features of operational risk and its modelling and regulatory approaches; we then analyse operational risk within a simulation model framework of our own development. Our approach is based on the analysis of a latent risk process instead of the manifest risk process that is widely popular in the risk literature. In our model the latent risk process is a stochastic mean-reverting process, the so-called Ornstein-Uhlenbeck process. Within the model framework we define a catastrophe as a breach of a critical barrier by the process. We analyse the distributions of catastrophe frequency, severity, and first time to hit, not only for a single process but for a dual process as well. Based on our first results, we could not falsify the Poisson character of the frequency or the long-tailed character of the severity; the distribution of the first time to hit requires more sophisticated analysis. At the end of the paper we examine the advantages of simulation-based forecasting, and we conclude with possible directions for further research.
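The latent-process catastrophe mechanism described above can be sketched as a discretised Ornstein-Uhlenbeck simulation; all parameter values below (mean-reversion speed, barrier level, step size) are hypothetical, not the paper's calibration:

```python
import math
import random

def simulate_ou_breaches(theta=1.0, mu=0.0, sigma=0.5, barrier=1.0,
                         x0=0.0, dt=0.01, n_steps=10_000, seed=42):
    """Simulate a latent Ornstein-Uhlenbeck risk process with the
    Euler-Maruyama scheme; a 'catastrophe' is an upward crossing of the
    critical barrier, and the first crossing time is also recorded."""
    rng = random.Random(seed)
    x = x0
    breaches, first_hit = 0, None
    for step in range(1, n_steps + 1):
        # dX = theta * (mu - X) dt + sigma dW  (mean-reverting dynamics)
        x_new = x + theta * (mu - x) * dt \
                  + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        if x < barrier <= x_new:   # upward crossing of the barrier
            breaches += 1
            if first_hit is None:
                first_hit = step * dt
        x = x_new
    return breaches, first_hit

breaches, first_hit = simulate_ou_breaches()
```

Repeating the simulation over many seeds would yield the empirical frequency and first-hitting-time distributions the paper analyses.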
Abstract:
The development of a new set of frost property measurement techniques to be used in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam element load sensor was used to obtain the weight of a deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operational conditions. An improvement of existing frost growth mathematical models was also investigated. The early stage of frost nucleation was commonly not considered in these models; instead, initial values of layer thickness and porosity were regularly assumed. A nucleation model was developed to obtain the droplet diameter and surface porosity at the end of the early frosting period. The drop-wise early condensation on a cold flat plate under natural convection, exposed to hot (room-temperature) humid air, was modeled. A nucleation rate was found, and the relation of heat to mass transfer (the Lewis number) was obtained. The Lewis number was found to be much smaller than unity, the standard value usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full growth stages of the frosting process. The combination of frost top temperature and weight variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
Abstract:
Forest disturbances are major sources of carbon dioxide to the atmosphere, and therefore impact global climate. Biogeophysical attributes, such as surface albedo (reflectivity), further control the climate-regulating properties of forests. Using both tower-based and remotely sensed data sets, we show that natural disturbances from wildfire, beetle outbreaks, and hurricane wind throw can significantly alter surface albedo, and the associated radiative forcing either offsets or enhances the CO2 forcing caused by reducing ecosystem carbon sequestration over multiple years. In the examined cases, the radiative forcing from albedo change is on the same order of magnitude as the CO2 forcing. The net radiative forcing resulting from these two factors leads to a local heating effect in a hurricane-damaged mangrove forest in the subtropics, and a cooling effect following wildfire and mountain pine beetle attack in boreal forests with winter snow. Although natural forest disturbances currently represent less than half of gross forest cover loss, that area will probably increase in the future under climate change, making it imperative to represent these processes accurately in global climate models.
Abstract:
In the mid-19th century, Horace Mann insisted that a broad provision of public schooling should take precedence over the liberal education of an elite group. In that regard, his generation constructed a state-sponsored common schooling enterprise to educate the masses. More than 100 years later, the institution of public schooling fails to maintain an image fully representative of the ideals of equity and inclusion. Critical theory in educational thought associates the dominant practice of functional schooling with maintenance of the status quo: an unequal distribution of financial, political, and social resources. This study examined the empirical basis for the association of public schooling with the status quo using the most recent comparable cross-country income inequality data. Multiple regression analysis evaluated the possible relationship between national income inequality change over the period 1985-2005 and variables representative of national measures of education supply in the prior decade. The estimated model of income inequality development attempted to quantify the relationship between education supply factors and subsequent income inequality developments by controlling for economic, demographic, and exogenous factors. The sample included all nations with comparable income inequality data over the measurement period, N = 56. Does public school supply affect national income distribution? The estimated model suggested that an increase in the average years of schooling among the population age 15 years or older, measured over the period 1975-1985, provided a mechanism that resulted in a more equal distribution of income over the period 1985-2005 among low and lower-middle income nations. The model also suggested that income inequality increased less or decreased more in smaller economies and when the percentage of the population age < 15 years grew more slowly over the period 1985-2000.
In contrast, this study identified no significant relationship between school supply changes measured over prior periods and income inequality development over the period 1985-2005 among upper-middle and high income nations.
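The multiple-regression machinery behind such a model can be illustrated with ordinary least squares solved via the normal equations; the data below are synthetic and noiseless (generated from y = 1 + 2·x1 + 3·x2), so the fit recovers the generating coefficients exactly, and the variable names bear no relation to the study's actual dataset:

```python
def ols(X, y):
    """Ordinary least squares: solve (X'X) b = X'y with Gaussian
    elimination. Rows of X are observations; an intercept column of
    ones is prepended automatically."""
    rows = [[1.0] + list(r) for r in X]
    p = len(rows[0])
    xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for c in range(p):                      # forward elimination, partial pivoting
        piv = max(range(c, p), key=lambda r: abs(xtx[r][c]))
        xtx[c], xtx[piv] = xtx[piv], xtx[c]
        xty[c], xty[piv] = xty[piv], xty[c]
        for r in range(c + 1, p):
            f = xtx[r][c] / xtx[c][c]
            for k in range(c, p):
                xtx[r][k] -= f * xtx[c][k]
            xty[r] -= f * xty[c]
    beta = [0.0] * p                        # back substitution
    for r in range(p - 1, -1, -1):
        beta[r] = (xty[r] - sum(xtx[r][k] * beta[k]
                                for k in range(r + 1, p))) / xtx[r][r]
    return beta

# Two hypothetical predictors (e.g. schooling change, economy size).
X = [(1, 2), (2, 1), (3, 4), (4, 3), (5, 6)]
y = [9, 8, 19, 18, 29]                      # exactly 1 + 2*x1 + 3*x2
beta = ols(X, y)                            # -> approximately [1.0, 2.0, 3.0]
```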
Abstract:
The lognormal distribution has abundant applications in various fields. In the literature, most inferences on the two parameters of the lognormal distribution are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when the observation is below or above the detection limits, and only the numbers of measurements falling into predetermined intervals can be recorded instead; such data are called grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and grouped data. The proof is first established for the normal distribution and then extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.
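A sketch of maximum-likelihood fitting under Type-I right-censoring, assuming a detection limit above which only the number of observations is known; the optimiser here is a crude pattern search for illustration (the paper's contribution is proving the MLE exists and is unique, not this algorithm), and the data are invented:

```python
import math

def norm_logpdf(z):
    return -0.5 * z * z - 0.5 * math.log(2 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def loglik(mu, sigma, observed, n_cens, limit):
    """Log-likelihood of lognormal(mu, sigma): exact observations plus
    n_cens values known only to exceed `limit` (Type-I right-censoring)."""
    ll = 0.0
    for x in observed:
        z = (math.log(x) - mu) / sigma
        ll += norm_logpdf(z) - math.log(sigma) - math.log(x)
    if n_cens:
        zc = (math.log(limit) - mu) / sigma
        ll += n_cens * math.log(max(1.0 - norm_cdf(zc), 1e-300))
    return ll

def fit(observed, n_cens, limit):
    """Maximise the likelihood by shrinking-step pattern search,
    starting from the uncensored estimates."""
    logs = [math.log(x) for x in observed]
    mu = sum(logs) / len(logs)
    sigma = max(1e-3, (sum((l - mu) ** 2 for l in logs) / len(logs)) ** 0.5)
    step = 1.0
    for _ in range(40):
        best = (loglik(mu, sigma, observed, n_cens, limit), mu, sigma)
        for dm in (-step, 0.0, step):
            for ds in (-step, 0.0, step):
                s = sigma + ds
                if s <= 0:
                    continue
                ll = loglik(mu + dm, s, observed, n_cens, limit)
                if ll > best[0]:
                    best = (ll, mu + dm, s)
        _, mu, sigma = best
        step *= 0.7
    return mu, sigma

# Hypothetical sample: exact values plus 3 readings above detection limit 20.
data = [2.1, 3.5, 1.2, 7.8, 5.0, 0.9, 12.4, 4.4]
mu_hat, sigma_hat = fit(data, n_cens=3, limit=20.0)
```

The censored tail mass pulls the fitted mean of log-values above the naive sample mean of the exact observations, as expected.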
Abstract:
Age-related macular degeneration (AMD) is the leading cause of blindness in America. The fact that AMD wreaks most of its damage in the center of the retina raises the question of whether light, integrated over long periods, is more concentrated in the macula. A method based on eye-tracking was developed to measure the distribution of light on the retina under natural viewing conditions. The hypothesis was that, integrated over time, retinal illumination peaks in the macula. Additionally, a possible relationship between age and retinal illumination was investigated. The eye tracker superimposed the subject's gaze position on a video recorded by a scene camera. Five informed subjects were employed in feasibility tests, and 58 naïve subjects participated in 5 phases. In phase 1, the subjects viewed a gray-scale image; in phase 2, they observed a sequence of photographic images; in phase 3, they viewed a video. In phase 4, they worked on a computer, and in phase 5, the subjects walked around freely. The informed subjects were instructed to gaze at bright objects in the field of view and then at dark objects. Naïve subjects were allowed to gaze freely in all phases. Using the subject's gaze coordinates and the video provided by the scene camera, the cumulative light distribution on the retina was calculated for ∼15° around the fovea. As expected for control subjects, cumulative retinal light distributions peaked and dipped in the fovea when they gazed at bright or dark objects, respectively. The light distribution maps obtained from the naïve subjects showed a tendency to peak in the macula in phases 1, 2, and 3, a consistent tendency in phase 4, and a variable tendency in phase 5. The feasibility of using an eye-tracker system to measure the distribution of light on the retina was demonstrated, helping to clarify the role played by light exposure in the etiology of AMD.
Results showed that a tendency for light to peak in the macula is a characteristic of some individuals and of certain tasks. In these situations, risk of AMD could be increased. No significant difference was observed based on age.
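The cumulative light-map computation described above can be sketched as gaze-centred accumulation of scene luminance; the frame contents, gaze coordinates, and patch radius below are toy values, not the study's data:

```python
def accumulate_retinal_light(frames, gazes, radius):
    """Build a cumulative retina-centred light map: for each scene frame,
    the luminance patch around the gaze point is added into a map whose
    centre corresponds to the fovea."""
    size = 2 * radius + 1
    retina = [[0.0] * size for _ in range(size)]
    for frame, (gx, gy) in zip(frames, gazes):
        h, w = len(frame), len(frame[0])
        for dy in range(-radius, radius + 1):
            for dx in range(-radius, radius + 1):
                x, y = gx + dx, gy + dy
                if 0 <= y < h and 0 <= x < w:     # patch may leave the frame
                    retina[dy + radius][dx + radius] += frame[y][x]
    return retina

# Tiny synthetic example: a 5x5 scene with one bright spot the gaze follows.
frame = [[0.0] * 5 for _ in range(5)]
frame[2][2] = 1.0                                  # bright object at (2, 2)
retina = accumulate_retinal_light([frame, frame], [(2, 2), (2, 2)], radius=1)
# With gaze locked on the bright spot, light accumulates at the map centre,
# mirroring the foveal peak reported for subjects gazing at bright objects.
```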
Abstract:
This dissertation deals with the constitutional limits on the exercise of patent rights and their effects on the oil, natural gas and biofuels industry. Conducted with the support of ANP/PETROBRAS, it seeks to show how the law limits the exercise of industrial property rights, based on a reinterpretation of private law from the perspective of constitutional development. Petrobras, a Brazilian joint venture, today holds state-of-the-art technology in various sectors of the oil industry and makes some of the highest investments in the development of new technologies. The overall objective of this thesis is to establish the relationship between the public interest of the oil, natural gas and biofuels industry (IPGN) and the constitutional limits on the free exercise of patent rights, and then to confirm or refute our hypothesis that Article 71 of the Industrial Property Law is contrary to the objectives set out in Article 3 of the Constitution of the Federative Republic of Brazil. The research examines the relevant aspects of the legal nature constitutionally attributed to the IPGN, confronting them with the constitutional limits on the free exercise of patent rights, in order to outline the limits of state action in regulating the economy, in particular the feasibility of limitations on property rights in favour of the national interest in this strategic energy industry. The aim is to weigh the fundamental rights to property and to economic development against the public interest, which limits the former. As to its objectives, the research is theoretical and descriptive and, within the field of industrial property, addresses the possible impact of regulatory standards limiting property rights in the oil industry.
To establish how the state may mitigate intellectual property rights, we first discuss a definition of public interest drawn from the general theory of the state and its sovereign character, in order to establish new concepts of national interest and popular interest, which in turn ground our concept of public interest. In the second phase, we address industrial property rights and their free exercise at the constitutional and infra-constitutional levels, illustrating the use of industrial property rights with examples from the market and from the IPGN. After situating industrial property rights within the Constitution and national legislation and establishing their relationship with national and regional development, the chapter addresses patent law in particular, as the most usual form of intellectual property protection in the IPGN. A study of the number of patents in the industry under analysis is used to demonstrate, with hard data, the importance of the sector for industrial development. The relationship between the social function of intellectual property and the constitutional objective of development is characterised in order to demonstrate the strategic nature of oil for Brazil on the national and international scene, and to test the research hypothesis that, even with large investments, the lack of legal certainty in the sector prevents it from attracting as much investment as it otherwise could.
Abstract:
The purpose of this research is to analyze different daylighting systems in schools in the city of Natal/RN. Although daylight is abundantly available locally, architectural recommendations relating sky conditions, dimensions of daylighting systems, shading, fraction of sky visibility, required illuminance, glare, period of occupation, and depth of the daylit area are scarce and diffuse. This research examines selected aperture systems to assess the potential of natural light offered by each. The method is divided into three phases. The first phase is modeling, which involves the construction of a three-dimensional model of a classroom in SketchUp 2014, following recommendations presented in the literature for obtaining good environmental comfort in school settings. The second phase is dynamic computer simulation of daylighting performance with the Daysim software. The input data are the 2009 climate file for the city of Natal/RN, the classroom volume in 3ds format with the optical properties assigned to each surface, the sensor mapping file, and the user occupancy file. The simulation results are organized in a spreadsheet prepared by Carvalho (2014) to determine the occurrence of useful daylight illuminance (UDI) in the range of 300 to 3000 lux, and to build illuminance curves and UDI contour plots that identify the uniformity of the light distribution, compliance with the minimum illuminance level, and the occurrence of glare.
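The UDI tallying step can be sketched as a simple banding of simulated sensor illuminances; the 300-3000 lux band follows the abstract, while the hourly values below are invented:

```python
def udi_fractions(illuminances, low=300.0, high=3000.0):
    """Classify hourly illuminance values (lux) at a sensor into the three
    Useful Daylight Illuminance bands and return occupied-time fractions:
    too dark (< low), useful (low..high), and glare-prone (> high)."""
    n = len(illuminances)
    below = sum(1 for e in illuminances if e < low) / n
    useful = sum(1 for e in illuminances if low <= e <= high) / n
    above = sum(1 for e in illuminances if e > high) / n
    return below, useful, above

# Hypothetical hourly illuminances at one sensor during occupied hours.
hours = [150, 420, 900, 2500, 3600, 5000, 800, 310, 120, 2900]
below, useful, above = udi_fractions(hours)
# below=0.2, useful=0.6, above=0.2 for this toy series
```

Mapping `useful` over a grid of sensor points gives the UDI contour plots the abstract mentions.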