907 results for Bivariate Normal Distribution


Relevance: 80.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 60J80.

2010 Mathematics Subject Classification: 94A17, 62B10, 62F03.

2000 Mathematics Subject Classification: 62H15, 62H12.

The normal distribution is a useful tool for the statistician, but not everyone knows how to wield it. In an extract from his new book, Chancing It, Robert Matthews explains what can happen when things are far from normal.

Following our earlier paper on the subject, we present a general closed-form formula for valuing the interest savings due to a multi-firm cash-pool system. Assuming a normal distribution of the accounts, the total savings can be expressed as the product of three independent factors representing the interest spread, the number and correlation of the firms, and the time-dependent distribution of the cash accounts. We derive analytic results for two special processes, one characterizing the initial build-up period and the other describing the mature period. The value gained in the stationary system can be thought of as interest, paid at the net interest spread rate, on the standard deviation of the account. We show that pooling already has substantial value in the transient period. To increase the practical relevance of our analysis, we discuss possible extensions of our model and show how real option pricing techniques can be applied here.
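The stationary intuition (savings as spread times standard deviation, shrinking as accounts become correlated) can be sketched numerically. This is not the paper's closed formula; it assumes hypothetical zero-mean normal accounts with a common pairwise correlation, and uses E|X| = sigma*sqrt(2/pi) for the expected overdraft size:

```python
import math

def pooled_savings_rate(sigmas, rho, spread):
    """Illustrative stationary cash-pool savings per unit time.

    Without pooling, each firm's expected overdraft cost scales with the
    standard deviation of its own (assumed zero-mean normal) account;
    with pooling, only the standard deviation of the summed account
    matters. `spread` is the net interest spread, `rho` a common
    pairwise correlation between the n firms' accounts (assumptions,
    not the paper's model).
    """
    n = len(sigmas)
    # Variance of the pooled account under equicorrelation rho.
    var_pool = sum(s * s for s in sigmas)
    for i in range(n):
        for j in range(n):
            if i != j:
                var_pool += rho * sigmas[i] * sigmas[j]
    sigma_pool = math.sqrt(max(var_pool, 0.0))
    # E|X| = sigma * sqrt(2/pi) for X ~ N(0, sigma^2).
    c = math.sqrt(2.0 / math.pi)
    unpooled_cost = spread * c * sum(sigmas)
    pooled_cost = spread * c * sigma_pool
    return unpooled_cost - pooled_cost

# Ten identical firms: savings are positive for independent accounts
# and vanish as the correlation approaches 1.
print(round(pooled_savings_rate([1.0] * 10, 0.0, 0.02), 4))
print(round(pooled_savings_rate([1.0] * 10, 1.0, 0.02), 4))
```

The three abstract factors appear here as `spread`, the correlation/firm-count term inside `sigma_pool`, and the account scale `sigmas`.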

Prices of U.S. Treasury securities vary over time and across maturities. When the market in Treasurys is sufficiently complete and frictionless, these prices may be modeled by a function of time and maturity. A cross-section of this function for time held fixed is called the yield curve; the aggregate of these sections is the evolution of the yield curve. This dissertation studies aspects of this evolution.

There are two complementary approaches to the study of yield curve evolution here: the first is principal components analysis; the second is wavelet analysis. In both approaches the time and maturity variables are discretized. In principal components analysis the vectors of yield curve shifts are viewed as observations of a multivariate normal distribution. The resulting covariance matrix is diagonalized, and the resulting eigenvalues and eigenvectors (the principal components) are used to draw inferences about the yield curve evolution.

In wavelet analysis, the vectors of shifts are resolved into hierarchies of localized fundamental shifts (wavelets) that leave specified global properties invariant (average change and duration change). The hierarchies relate to the degree of localization, with movements restricted to a single maturity at the base and general movements at the apex. Second-generation wavelet techniques allow better adaptation of the model to economic observables. Statistically, the wavelet approach is inherently nonparametric, while the wavelets themselves are better adapted to describing a complete market.

Principal components analysis provides information on the dimension of the yield curve process. While there is no clear demarcation between operative factors and noise, the top six principal components pick up 99% of total interest rate variation 95% of the time. An economically justified basis for this process is hard to find; for example, a simple linear model will not suffice for the first principal component, and the shape of this component is nonstationary.

Wavelet analysis works more directly with yield curve observations than principal components analysis. In fact, the complete process from bond data to multiresolution is presented, including the dedicated Perl programs and the details of the portfolio metrics and the specially adapted wavelet construction. The result is more robust statistics, which balance the more fragile principal components analysis.
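The principal-components step described above can be sketched in a few lines. The data here are synthetic shifts built from the classic level/slope/curvature factors plus noise; the maturities and factor shapes are illustrative assumptions, not estimates from bond data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily yield-curve shifts (T days x M maturities), built from
# three stylized factors -- level, slope, curvature -- plus noise.
maturities = np.array([0.25, 1.0, 2.0, 5.0, 10.0, 30.0])
level = np.ones_like(maturities)
slope = (maturities - maturities.mean()) / maturities.std()
curve = slope**2 - (slope**2).mean()
factors = np.stack([level, slope, curve])                 # 3 x M
loadings = rng.normal(size=(1000, 3)) * [5.0, 2.0, 1.0]   # factor moves, bp
shifts = loadings @ factors + rng.normal(scale=0.5, size=(1000, 6))

# Principal components: diagonalize the covariance of the shift vectors.
cov = np.cov(shifts, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]                         # descending
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
print("variance explained by top 3 PCs:", round(explained[:3].sum(), 3))
```

The eigenvectors in `eigvecs[:, :3]` recover (rotations of) the planted level, slope, and curvature shapes; on real data the same calculation yields the operative-factor ranking the dissertation discusses.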

Road pricing has emerged as an effective means of managing road traffic demand while simultaneously raising additional revenue for transportation agencies. Research on the factors that govern travel decisions has shown that user preferences may be a function of the demographic characteristics of the individuals and the perceived trip attributes. However, it is not clear which trip attributes are actually considered in the travel decision-making process, how these attributes are perceived by travelers, or how the set of trip attributes changes as a function of the time of day or from day to day. In this study, operational Intelligent Transportation Systems (ITS) archives are mined and the aggregated preferences for a priced system are extracted at a fine time-aggregation level for an extended number of days. The resulting information is related to corresponding time-varying trip attributes such as travel time, travel time reliability, charged toll, and other parameters. The time-varying user preferences and trip attributes are linked together by means of a binary choice (logit) model with a linear utility function of the trip attributes. The trip-attribute weights in the utility function are then dynamically estimated for each time of day by means of an adaptive, limited-memory discrete Kalman filter (ALMF). The relationship between traveler choices and travel time is assessed using different rules to capture the logic that best represents traveler perception and the effect of real-time information on the observed preferences. The impact of travel time reliability on traveler choices is investigated considering its multiple definitions. Based on the results, it can be concluded that the ALMF algorithm allows a robust estimation of time-varying weights in the utility function at fine time-aggregation levels. The high correlations among the trip attributes severely constrain the simultaneous estimation of their weights in the utility function. Despite the data limitations, it is found that the ALMF algorithm can provide stable estimates of the choice parameters for some periods of the day. Finally, it is found that the daily variation of the user sensitivities for different periods of the day resembles a well-defined normal distribution.
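The idea of filtering time-varying logit weights can be illustrated with a plain extended Kalman filter: a random-walk state for the utility weights and a Bernoulli observation through a logistic link. This is a simplified stand-in for the paper's adaptive limited-memory filter, with made-up attribute names and noise levels:

```python
import numpy as np

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def ekf_logit_step(beta, P, x, y, Q, R_floor=1e-3):
    """One extended-Kalman-filter update of time-varying logit weights.

    State: utility weights beta following a random walk with covariance Q.
    Observation: binary choice y with P(y=1) = logistic(x . beta).
    A simplified stand-in for the ALMF described in the abstract.
    """
    P = P + Q                            # predict: random walk inflates cov
    p = logistic(x @ beta)
    H = p * (1 - p) * x                  # Jacobian of the measurement
    S = H @ P @ H + max(p * (1 - p), R_floor)
    K = P @ H / S                        # Kalman gain
    beta = beta + K * (y - p)            # innovation = observed - predicted
    P = P - np.outer(K, H @ P)
    return beta, P

rng = np.random.default_rng(1)
true_beta = np.array([-0.8, 1.5])        # e.g. toll cost and time saving
beta, P = np.zeros(2), np.eye(2)
Q = 1e-4 * np.eye(2)
for _ in range(5000):
    x = rng.normal(size=2)               # standardized trip attributes
    y = float(rng.random() < logistic(x @ true_beta))
    beta, P = ekf_logit_step(beta, P, x, y, Q)
print("estimated weights:", np.round(beta, 2))
```

Because the choices are binary and noisy, the filter tracks the weights only up to a steady-state uncertainty set by the trade-off between Q and the information in each observation; highly correlated attributes would inflate that uncertainty, as the abstract notes.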

This dissertation examines the quality of hazard mitigation elements in a coastal, hazard-prone state. I answer two questions. First, in a state with a strong mandate for hazard mitigation elements in comprehensive plans, does plan quality differ among county governments? Second, if such variation exists, what drives it? My research focuses primarily on Florida's 35 coastal counties, all of which are at risk of hurricane and flood hazards and fall under Florida's mandate to have a comprehensive plan that includes a hazard mitigation element. Research methods included document review to rate the hazard mitigation elements of all 35 coastal county plans, followed by analysis against demographic and hazard-history factors. I then conducted an electronic, nationwide survey of planning professionals and academics, informed by interviews of planning leaders in Florida counties. I found that hazard mitigation element quality varied widely among the 35 Florida coastal counties but was approximately normally distributed. No plans were of exceptionally high quality. Overall, historical hazard effects did not correlate with hazard mitigation element quality, but some demographic variables associated with urban populations did. The variance in hazard mitigation element quality indicates that while state law may mandate, and even prescribe, hazard mitigation in local comprehensive plans, not all plans will result in equal, or even adequate, protection for people. Furthermore, the mixed correlations with demographic variables representing social and disaster vulnerability show that, at least at the county level, vulnerability to hazards does not have a strong effect on hazard mitigation element quality. From a theory perspective, my research is significant because it compares assumptions about vulnerability based on hazard history and demographics to plan quality. The only vulnerability-related variables that appeared to correlate with hazard mitigation element quality, and then only mildly, were those typically representing more urban areas. In terms of the theory of Neo-Institutionalism and theories related to learning organizations, my research shows that planning departments appear to have set norms and rules of operating that preclude both significant public involvement and learning from prior hazard events.

The gestation process, in general, is a very important event in a woman's life, bringing physical, physiological and emotional changes; it is in itself an experience full of intense feelings. By late-age pregnancy we mean one occurring at age 35 or later. The occurrence of this type of pregnancy is rising in Brazil and throughout the world; factors such as better access to birth control resources and the search for financial stability explain the pregnancy delay. Important processes like resilience and social support can help late-age pregnant women adapt beneficially to the gestation process. Resilience is the capacity of an individual or group of individuals to go through an adverse situation, overcome it and become strengthened, transforming it into motivation for their biopsychosocial development. Social support is a complex and dynamic process that involves transactions between individuals and their social networks, meeting social needs and promoting and complementing the personal resources they have to face new demands. This research intended to gather information on late-age pregnant women in the municipality of Natal-RN; the main objective was to evaluate the resilience indicators and the social support of late-age pregnant women in Natal-RN. A cross-sectional, correlational and descriptive study was conducted with 150 late-age pregnant women. The instruments used were a form with sociodemographic and gestational information, the resilience scale and the social support scale. Spreadsheet and statistical software (Excel and SPSS 21.0) were used to analyze the data according to the variables and the objective of this work. For the nominal variables, relative frequencies were used; for continuous variables, the Pearson correlation and determination coefficients were used, given that the sample had a normal distribution.

The project fulfilled the ethical requirements prescribed by Resolution 466/12 of the National Health Council, with a favorable decision (356.436/2013) of the UFRN Research Ethics Committee. Most of the pregnant women had a low income and education level and were born in the state of Rio Grande do Norte; they had an average age of 37.49 (±2.577), were Catholic, married housewives, had more than one child and were in their third trimester of pregnancy; they also had a low past abortion rate, had not planned their pregnancy, had an average of 4.22 (±2.506) prenatal appointments, resided with an average of 3.673 (±1.397) people, had used some sort of birth control method and had high indicators of resilience and social support. The correlations between resilience, social support and some of the sociodemographic and gestational variables were considered low. These data point out that most of these women were in a stable relationship, had no history of abortion, were involved with some kind of religion, were not first-time mothers, were of an age at which they are not considered inexperienced mothers, and scored high on the social support scale; these may all be the factors contributing most to the development and building of resilience in these mothers aged 35 or more. We expect that the data and information from this research may add knowledge, actions and improvements regarding late-age pregnant women and the pregnancy phenomenon in general.

Cephalometric analysis is the measurement of linear and angular quantities through demarcation points, distances and lines on teleradiographs, and is considered of fundamental importance for diagnosis and orthodontic planning. The objective of this research was therefore to compare cephalometric measurements obtained by dentists and by radiologists from the analysis of the same radiograph in a computerized cephalometric analysis program. All research participants marked 18 cephalometric points on a 14-inch notebook computer, as directed by the program itself (Radiocef 2®). From there, they generated 14 cephalometric parameters including skeletal, dental-skeletal, dental and soft-tissue measures. In order to verify intra-examiner agreement, 10 professionals from each group repeated the marking of the points with a minimum interval of eight days between the two markings. The intra-group variability was calculated from the coefficients of variation (CV). The comparison between groups was performed using the Student t-test for normally distributed variables and the Mann-Whitney test for those with a non-normal distribution. In the group of orthodontists, the measurements Pog and 1-NB, SL, S-Ls Line, S-Li Line and 1.NB showed high internal variability. In the group of radiologists, the same occurred with the values of Pog and 1-NB, S-Ls Line, S-Li Line and 1.NA. In the comparison between groups, all the analyzed linear values and two angular values showed statistically significant differences between radiologists and dentists (p < 0.05). According to the results, inter-examiner error in cephalometric analysis requires more attention, but it does not come from a specific class of specialists, whether dentists or radiologists.
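The nonparametric alternative used for non-normal measurements can be sketched from first principles. This is a minimal Mann-Whitney U with a normal approximation for the two-sided p-value (no tie correction), run on synthetic data rather than the study's cephalometric measurements:

```python
import math
import numpy as np

def mann_whitney_u(a, b):
    """Mann-Whitney U test with a normal-approximation two-sided p-value.

    Minimal sketch (no tie correction), of the kind applied in the study
    to measurements that fail a normality check.
    """
    a, b = np.asarray(a, float), np.asarray(b, float)
    n1, n2 = len(a), len(b)
    combined = np.concatenate([a, b])
    ranks = np.empty(len(combined))
    ranks[np.argsort(combined)] = np.arange(1, len(combined) + 1)
    r1 = ranks[:n1].sum()                     # rank sum of the first sample
    u1 = r1 - n1 * (n1 + 1) / 2
    mu = n1 * n2 / 2
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u1 - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return u1, p

rng = np.random.default_rng(2)
same = mann_whitney_u(rng.normal(size=40), rng.normal(size=40))
shifted = mann_whitney_u(rng.normal(size=40), rng.normal(loc=2.0, size=40))
print("p (same distribution):", round(same[1], 3))
print("p (shifted by 2 SD):  ", round(shifted[1], 8))
```

In practice one would first test each variable for normality and dispatch to the t-test or this rank test accordingly, which is the decision rule the abstract describes.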

Diabetes Mellitus (DM) is a complex disease that requires continuous medical care for the reduction of risk factors in addition to glycemic control. The hyperglycemia typical of this disease produces glycosylation of proteins, with the consequent accumulation of glycosylation end products in various human tissues, among them the tendon. Aerobic exercise (AE) and low-level laser therapy (LLLT) have been used to treat tendinopathies in individuals with or without DM. Objective: The aim of this study was to observe the effect of LLLT and AE, in association, on tissue repair after partial tenotomy of the Achilles tendon (AT) of diabetic rats. Methods: 91 animals were used, divided into the following groups: control group (GC), injured control group (GCL), diabetic group (GD), diabetic laser group (GD-TLBI), diabetic trained group (GD-EX) and diabetic trained laser group (GD-EX+TLBI). The animals were submitted to intervention with AE, using a protocol with a progressive increase of duration (12 to 60 min) and speed (4 to 9 m/min), and to LLLT (660 nm laser, 10 mW, 4 J/cm², single point for 16 seconds, three times per week). Morphological, biomechanical and molecular characteristics were analyzed. For data showing a normal distribution, a one-way ANOVA with Tukey post hoc test was used; for data without a normal distribution, the Mann-Whitney test with Dunn's post hoc test was used. p < 0.05 was accepted for statistical significance. Results: The biomechanical tests indicated greater improvement in the GC and GD-EX+TLBI groups when compared with the diabetic groups in the following variables: maximum load, strain, absorbed energy, stress, cross-sectional area, elastic modulus and energy density (p < 0.05). The molecular biology analysis indicated that the association of aerobic exercise and LLLT generated an increase in collagen I gene expression and modulated the expression of MMP2 and MMP9 (p < 0.05). No major improvement was observed in the morphological variables studied. Conclusion: LLLT associated with aerobic exercise promotes an increase in the mechanical properties and modulates collagen I gene expression and the expression of MMP2 and MMP9 in diabetic rats.

The coexistence of gingival recession (GR) with root-coverage indication and non-carious cervical lesions (NCCL) generates the need for a protocol that respects and promotes the health of dental and periodontal tissues and allows treatment predictability. The main objectives of this thesis were: (1) to verify, through clinical evaluations, the connective tissue graft for root coverage over direct composite resin and indirect ceramic restorations; (2) to analyze the influence of the battery level of the LED curing unit on the characteristics of the composite resin; (3) to assess the influence of the restorative materials, composite resin and ceramic, on the viability of gingival fibroblasts from primary culture. Nine patients with good oral hygiene and occlusal stability, diagnosed with NCCLs in the anterior teeth or premolars associated with gingival recession (Miller class I and II) or with gingival recession alone, were selected. After the initial clinical examination, occlusal adjustment was performed and the patients' teeth were randomly allocated to direct composite resin restoration of the NCCL, polishing, and GR treatment with connective tissue graft and coronally advanced flap, group CR (n = 15); or to indirect ceramic restoration of the NCCL and GR treatment (CTG+CAF), group C (n = 15). Teeth presenting GR with no clinically formed NCCL cavity were treated with CTG+CAF and formed the control group (n = 15). Sorption and solubility tests, analysis of the degree of conversion, and diametral tensile strength tests were performed on composite resin samples (n = 10) photoactivated by the LED unit at 100, 50 and 10% battery charge. The viability of fibroblasts on composite resin, ceramic and dentin disks (n = 3) was examined. Clinical follow-up was performed for three months. The data obtained at the different stages were tabulated and subjected to analysis for detection of normal distribution and homogeneity. The results showed that the LED unit at 10% battery affects the characteristics of the composite resin; that the restorative materials are biocompatible with gingival fibroblasts; and that the association of surgical and restorative treatment of teeth affected by NCCL and GR presents successful results at the 3-month follow-up.

The cleft palate presented by transforming growth factor-β3 (Tgf-β3) null mutant mice is caused by altered palatal shelf adhesion, cell proliferation, epithelial-to-mesenchymal transformation and cell death. The expression of epidermal growth factor (EGF), transforming growth factor-β1 (Tgf-β1) and muscle segment homeobox-1 (Msx-1) is modified in the palates of these knockout mice, and the cell proliferation defect is caused by the change in EGF expression. In this study, we aimed to determine whether this change in EGF expression has any effect on the other mechanisms altered in Tgf-β3 knockout mouse palates. We tested the effect of inhibiting EGF activity in vitro in the knockout palates via the addition of Tyrphostin AG 1478. We also investigated possible interactions between EGF, Tgf-β1 and Msx-1 in Tgf-β3 null mouse palate cultures. The results show that the inhibition of EGF activity in Tgf-β3 null mouse palate cultures improves palatal shelf adhesion and fusion, with a particular effect on cell death, and restores the normal distribution pattern of Msx-1 in the palatal mesenchyme. Inhibition of Tgf-β1 does not affect either EGF or Msx-1 expression.

This work explores the use of statistical methods in describing and estimating camera poses, as well as the information feedback loop between camera pose and object detection. Surging development in robotics and computer vision has pushed the need for algorithms that infer, understand, and utilize information about the position and orientation of the sensor platforms when observing and/or interacting with their environment.

The first contribution of this thesis is the development of a set of statistical tools for representing and estimating the uncertainty in object poses. A distribution for representing the joint uncertainty over multiple object positions and orientations is described, called the mirrored normal-Bingham distribution. This distribution generalizes both the normal distribution in Euclidean space, and the Bingham distribution on the unit hypersphere. It is shown to inherit many of the convenient properties of these special cases: it is the maximum-entropy distribution with fixed second moment, and there is a generalized Laplace approximation whose result is the mirrored normal-Bingham distribution. This distribution and approximation method are demonstrated by deriving the analytical approximation to the wrapped-normal distribution. Further, it is shown how these tools can be used to represent the uncertainty in the result of a bundle adjustment problem.

Another application of these methods is illustrated as part of a novel camera pose estimation algorithm based on object detections. The autocalibration task is formulated as a bundle adjustment problem using prior distributions over the 3D points to enforce the objects' structure and their relationship with the scene geometry. This framework is very flexible and enables the use of off-the-shelf computational tools to solve specialized autocalibration problems. Its performance is evaluated using a pedestrian detector to provide head and foot location observations, and it proves much faster and potentially more accurate than existing methods.

Finally, the information feedback loop between object detection and camera pose estimation is closed by utilizing camera pose information to improve object detection in scenarios with significant perspective warping. Methods are presented that allow the inverse perspective mapping traditionally applied to images to be applied instead to features computed from those images. For the special case of HOG-like features, which are used by many modern object detection systems, these methods are shown to provide substantial performance benefits over unadapted detectors while achieving real-time frame rates, orders of magnitude faster than comparable image warping methods.
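The core operation behind inverse perspective mapping can be illustrated with a homography warp in plain NumPy. This is a toy nearest-neighbor warp, not the thesis code; the same per-cell sampling applies to feature maps (e.g. HOG cells) instead of raw pixels, which is the idea the paragraph describes:

```python
import numpy as np

def warp_homography(img, H, out_shape):
    """Inverse-warp a 2-D array by a 3x3 homography H (nearest neighbor).

    For each output pixel, map back through H^-1 to find its source
    location; out-of-bounds samples are filled with zero.  Illustrative
    sketch of inverse perspective mapping on a dense grid.
    """
    h_out, w_out = out_shape
    ys, xs = np.mgrid[0:h_out, 0:w_out]
    pts = np.stack([xs.ravel(), ys.ravel(),
                    np.ones(h_out * w_out)])      # 3 x N homogeneous (x, y, 1)
    src = np.linalg.inv(H) @ pts                  # output pixel -> source point
    sx = np.rint(src[0] / src[2]).astype(int)
    sy = np.rint(src[1] / src[2]).astype(int)
    valid = (0 <= sx) & (sx < img.shape[1]) & (0 <= sy) & (sy < img.shape[0])
    out = np.zeros(h_out * w_out, dtype=img.dtype)
    out[valid] = img[sy[valid], sx[valid]]
    return out.reshape(out_shape)

img = np.arange(16.0).reshape(4, 4)
# Identity homography leaves the array unchanged.
assert np.array_equal(warp_homography(img, np.eye(3), (4, 4)), img)
# A pure translation by one pixel in x shifts columns right.
T = np.array([[1.0, 0.0, 1.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
print(warp_homography(img, T, (4, 4)))
```

Applying this sampling to feature maps rather than images is what avoids re-computing features on the warped image, which is where the speedup over image-warping methods comes from.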

The statistical tools and algorithms presented here are especially promising for mobile cameras, providing the ability to autocalibrate and adapt to the camera pose in real time. In addition, these methods have wide-ranging potential applications in diverse areas of computer vision, robotics, and imaging.