Abstract:
Type Ia supernovae, triggered by exploding white dwarfs with masses close to the Chandrasekhar limit, play a key role in measuring the expansion rate of the Universe. However, recent observations of several peculiar type Ia supernovae argue for progenitor masses significantly above the Chandrasekhar limit. We show that strongly magnetized white dwarfs not only can violate the Chandrasekhar mass limit significantly, but also exhibit a different mass limit. We establish from a foundational level that the generic mass limit of strongly magnetized white dwarfs is 2.58 solar masses. This explains the origin of overluminous peculiar type Ia supernovae. Our finding further argues for a possible second standard candle, with many far-reaching implications, including a possible reconsideration of the expansion history of the Universe. DOI: 10.1103/PhysRevLett.110.071102
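For reference against the 2.58 solar mass figure quoted above, the canonical non-magnetized Chandrasekhar mass can be evaluated from the textbook n = 3 polytrope formula. A minimal sketch, assuming a carbon-oxygen composition (mean molecular weight per electron mu_e = 2); the Lane-Emden coefficient omega3 is the standard value:

```python
import math

# Canonical Chandrasekhar mass for a non-magnetized white dwarf:
#   M_Ch = (omega3 * sqrt(3*pi) / 2) * (hbar*c/G)**1.5 / (mu_e * m_H)**2
# Shown only to contrast the familiar ~1.4 M_sun limit with the
# 2.58 M_sun limit claimed for strongly magnetized white dwarfs.
hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.6735575e-27      # hydrogen atom mass, kg
M_sun = 1.98892e30       # solar mass, kg
mu_e = 2.0               # mean molecular weight per electron (C/O matter)
omega3 = 2.018236        # Lane-Emden constant for the n = 3 polytrope

M_ch = (omega3 * math.sqrt(3 * math.pi) / 2) \
       * (hbar * c / G) ** 1.5 / (mu_e * m_H) ** 2
ratio = M_ch / M_sun     # close to the familiar ~1.4 solar masses
```

The magnetized limit of the paper modifies the equation of state and so does not follow from this formula; the sketch fixes only the non-magnetized baseline.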
Abstract:
Subsurface lithology and seismic site classification of the Lucknow urban center, located in the central part of the Indo-Gangetic Basin (IGB), are presented based on detailed shallow subsurface investigations and borehole analysis. These were carried out through 47 seismic surface wave tests using multichannel analysis of surface waves (MASW) and 23 boreholes drilled to 30 m with standard penetration test (SPT) N values. Subsurface lithology profiles drawn from the drilled boreholes show low- to medium-compressibility clay and silty to poorly graded sand down to a depth of 30 m. In addition, deeper borehole records (depth >150 m) were collected from the Lucknow Jal Nigam (Water Corporation), Government of Uttar Pradesh, to understand the deeper subsoil stratification. These records show the presence of clay mixed with sand and Kankar at some locations to a depth of 150 m, followed by layers of sand, clay, and Kankar up to 400 m. Based on the available details, shallow and deep cross-sections through Lucknow are presented. Shear wave velocity (SWV) and N-SPT values were measured for the study area using MASW and SPT testing, and values measured at the same locations were found to be comparable. These values were used to estimate 30 m average values of N-SPT (N30) and SWV (Vs30) for seismic site classification of the study area as per the National Earthquake Hazards Reduction Program (NEHRP) soil classification system. Based on the NEHRP classification, the study area falls into site classes C and D based on Vs30, and site classes D and E based on N30. The possibility of larger amplification during future seismic events is highlighted for the major part of the study area that falls under site classes D and E. Moreover, the mismatch between the site classes based on N30 and Vs30 raises the question of the suitability of the NEHRP classification system for the study region.
Further, 17 pairs of SPT and SWV data are used to develop a correlation between N-SPT and SWV. This represents the first attempt at seismic site classification, and at correlating N-SPT with SWV, in the Indo-Gangetic Basin.
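The 30 m average used for NEHRP classification is the travel-time-weighted mean velocity over the top 30 m, Vs30 = 30 / sum(h_i / Vs_i). A minimal sketch; the layered profile below is a hypothetical illustration, not measured data from the study:

```python
# Vs30: travel-time-weighted average shear wave velocity over the top 30 m,
# the quantity used for NEHRP site classification.
def vs30(layers):
    """layers: list of (thickness_m, vs_m_per_s), ordered from the surface."""
    depth, travel_time = 0.0, 0.0
    for thickness, vs in layers:
        h = min(thickness, 30.0 - depth)  # clip the last layer at 30 m
        if h <= 0:
            break
        travel_time += h / vs             # time to cross this layer
        depth += h
    if depth < 30.0:
        raise ValueError("profile shallower than 30 m")
    return 30.0 / travel_time

# Hypothetical three-layer profile (soft over stiffer soil)
profile = [(5.0, 180.0), (10.0, 240.0), (15.0, 350.0)]
v = vs30(profile)  # ~267 m/s, i.e. NEHRP site class D (180-360 m/s)
```

Note that the slowest shallow layers dominate the harmonic average, which is why soft surface deposits pull a site toward classes D and E.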
Abstract:
Future space-based gravitational wave (GW) experiments such as the Big Bang Observer (BBO), with their excellent projected one-sigma angular resolution, will measure the luminosity distance to a large number of GW sources to high precision, and the redshifts of single galaxies within the narrow solid angles towards the sources will provide the redshifts of the GW sources. One-sigma BBO beams contain the actual source in only 68% of cases; a beam that does not contain the source may contain a spurious single galaxy, leading to misidentification. To increase the probability of the source falling within the beam, larger beams have to be considered, which decreases the chances of finding single galaxies in the beams. Saini et al. [T.D. Saini, S.K. Sethi, and V. Sahni, Phys. Rev. D 81, 103009 (2010)] argued, largely analytically, that identifying even a small number of GW source galaxies furnishes a rough distance-redshift relation, which can be used to further resolve sources that have multiple objects in the angular beam. In this work we develop this idea further by introducing a self-calibrating iterative scheme that works in conjunction with Monte Carlo simulations to determine the luminosity distance to GW sources with progressively greater accuracy. This iterative scheme allows one to determine the equation of state of dark energy to within an accuracy of a few percent for a GW experiment with a beam width an order of magnitude larger than BBO's (and therefore a far poorer angular resolution). This is achieved with no prior information about the nature of dark energy from other data sets such as type Ia supernovae, baryon acoustic oscillations, or the cosmic microwave background. DOI: 10.1103/PhysRevD.87.083001
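The distance-redshift relation that such a scheme calibrates can be sketched for a flat w-CDM cosmology, where the dark energy equation of state w shifts the luminosity distance at fixed redshift. The parameter values below are illustrative assumptions, not the paper's results:

```python
import math

# Luminosity distance in a flat w-CDM cosmology:
#   d_L(z) = (1+z) * (c/H0) * integral_0^z dz' / E(z'),
#   E(z) = sqrt(Om*(1+z)^3 + (1-Om)*(1+z)^(3*(1+w)))
# A small change in w measurably shifts d_L, which is how a calibrated
# distance-redshift relation constrains dark energy.
C_KM_S = 299792.458  # speed of light, km/s

def luminosity_distance(z, H0=70.0, om=0.3, w=-1.0, steps=1000):
    """d_L in Mpc via trapezoidal integration of 1/E(z)."""
    def E(zz):
        return math.sqrt(om * (1 + zz) ** 3
                         + (1 - om) * (1 + zz) ** (3 * (1 + w)))
    dz = z / steps
    integral = 0.5 * (1 / E(0.0) + 1 / E(z))
    for i in range(1, steps):
        integral += 1 / E(i * dz)
    integral *= dz
    return (1 + z) * (C_KM_S / H0) * integral

d_lcdm = luminosity_distance(1.0)          # w = -1 (LCDM), ~6600 Mpc at z = 1
d_wcdm = luminosity_distance(1.0, w=-0.9)  # less negative w -> shorter d_L
```

Comparing many (d_L, z) pairs against this prediction is, in essence, what constrains w to the few-percent level quoted above.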
Abstract:
It is increasingly being recognized that resting state brain connectivity derived from functional magnetic resonance imaging (fMRI) data is an important marker of brain function in both healthy and clinical populations. Though linear correlation has been extensively used to characterize brain connectivity, it is limited to detecting first order dependencies. In this study, we propose a framework wherein phase synchronization (PS) between brain regions is characterized using a new metric, "correlation between probabilities of recurrence" (CPR), followed by graph-theoretic analysis of the ensuing networks. We applied this method to resting state fMRI data obtained from human subjects with and without administration of the anesthetic propofol. Our results showed decreased PS during anesthesia and a biologically more plausible community structure with CPR than with linear correlation. We conclude that CPR provides an attractive nonparametric alternative to standard correlation for modeling interactions in brain networks and for obtaining physiologically meaningful insights about brain function.
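A minimal sketch of a CPR-style estimate for two scalar signals, following the general recipe named above: estimate each signal's probability of recurrence as a function of time lag, then take the Pearson correlation of the two recurrence-probability curves. The recurrence thresholds and toy signals are illustrative assumptions; real use on fMRI would operate on (possibly embedded) regional time series:

```python
import math

def recurrence_probability(x, tau, eps):
    """Fraction of points that recur (return within eps) after lag tau."""
    n = len(x) - tau
    return sum(abs(x[i] - x[i + tau]) < eps for i in range(n)) / n

def cpr(x, y, max_tau, eps_x, eps_y):
    """Pearson correlation between the two recurrence-probability curves."""
    px = [recurrence_probability(x, t, eps_x) for t in range(1, max_tau)]
    py = [recurrence_probability(y, t, eps_y) for t in range(1, max_tau)]
    mx, my = sum(px) / len(px), sum(py) / len(py)
    cov = sum((a - mx) * (b - my) for a, b in zip(px, py))
    sx = math.sqrt(sum((a - mx) ** 2 for a in px))
    sy = math.sqrt(sum((b - my) ** 2 for b in py))
    return cov / (sx * sy)

# Two phase-synchronized signals (same frequency, fixed phase lag) have the
# same recurrence structure, so CPR should be close to 1.
x = [math.sin(0.2 * i) for i in range(500)]
y = [math.sin(0.2 * i + 1.0) for i in range(500)]
score = cpr(x, y, max_tau=100, eps_x=0.1, eps_y=0.1)
```

Because the metric compares recurrence structure rather than amplitudes, it is insensitive to a constant phase lag, which is exactly what makes it a phase synchronization index.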
Abstract:
The problem addressed in this paper concerns an important issue faced by any green-aware global company: keeping its emissions within a prescribed cap. The specific problem is to allocate carbon reductions among its different divisions and supply chain partners so as to achieve a required target of reductions in its carbon reduction program. The problem is challenging because the divisions and supply chain partners, being autonomous, may exhibit strategic behavior. We use a standard mechanism design approach to solve this problem. While designing a mechanism for the emission reduction allocation problem, the key properties that need to be satisfied are dominant strategy incentive compatibility (DSIC, also called strategy-proofness), strict budget balance (SBB), and allocative efficiency (AE). Mechanism design theory has shown that it is impossible to achieve these three properties simultaneously. A mechanism that satisfies DSIC and AE while keeping the budget imbalance minimal has recently been proposed in this context. Motivated by the observation that SBB is an important requirement, we propose a mechanism that satisfies DSIC and SBB with a slight compromise in allocative efficiency. Experimentation with a stylized case study shows that the proposed mechanism performs satisfactorily and provides an attractive alternative for carbon footprint reduction by global companies.
Abstract:
The effect of structure height on the lightning striking distance is estimated using a lightning strike model that takes into account the effect of connecting leaders. According to the results, the lightning striking distance may differ significantly from the values assumed in the IEC standard for structure heights beyond 30 m, whereas for structure heights below about 30 m the values assumed by the IEC do not differ significantly from the predictions of the leader-based attachment model. Since the IEC assumes a smaller striking distance than the one predicted by the adopted model, one can conclude that safety is not compromised by adhering to the IEC standard. Results obtained from the model are also compared with the collection volume method (CVM) and other commonly used lightning attachment models in the literature. In the case of the CVM, the calculated attractive distances are much larger than those obtained using the physically based lightning attachment models, which indicates the possibility of compromised lightning protection procedures when the CVM is used.
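For context, standards commonly parameterize the striking distance as an electrogeometric power law in the return-stroke peak current, independent of structure height; this is precisely the simplification the connecting-leader model of the paper relaxes. A sketch assuming the widely quoted coefficients r = 10 * I^0.65 (r in metres, I in kA), which are an assumption here, not values taken from the paper:

```python
def striking_distance_m(peak_current_ka, a=10.0, b=0.65):
    """Electrogeometric striking distance r = a * I**b (r in m, I in kA).

    The coefficients a = 10, b = 0.65 are the commonly quoted
    electrogeometric-model values. Note the formula has no dependence on
    structure height, unlike connecting-leader attachment models.
    """
    return a * peak_current_ka ** b

r = striking_distance_m(10.0)  # roughly 45 m for a 10 kA stroke
```

Height-dependent leader models predict departures from this single-variable law for tall structures, which is the regime where the paper reports disagreement with the IEC values.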
Abstract:
Seismic site characterization is the basic requirement for seismic microzonation and site response studies of an area. Site characterization helps to gauge the average dynamic properties of soil deposits and thus to evaluate the surface level response. This paper presents a seismic site characterization of Agartala city, the capital of Tripura state in northeast India. Seismically, Agartala is situated in the Bengal Basin, classified as a highly active seismic zone by the Indian seismic code BIS-1893 (Indian Standard Criteria for Earthquake Resistant Design of Structures, Part 1: General Provisions and Buildings; Bureau of Indian Standards, New Delhi, 2002), which places the city in the country's highest seismic zone (zone V). The city is very close to the Sylhet fault (Bangladesh), where two major earthquakes (Mw > 7) have occurred in the past and severely affected the city and the whole of northeast India. For site response evaluation, a series of geophysical tests at 27 locations were conducted using the multichannel analysis of surface waves (MASW) technique, an advanced method for obtaining shear wave velocity (Vs) profiles from in situ measurements. In addition, standard penetration test (SPT-N) bore log data sets were obtained from the Urban Development Department, Govt. of Tripura. Of the 50 collected bore logs, the 27 closest to the MASW test locations were selected for further study. Both data sets (Vs profiles with depth and SPT-N bore log profiles) were used to calculate the average shear wave velocity (Vs30) and average SPT-N values for the upper 30 m of the subsurface soil profiles. These were used for site classification of the study area as recommended by the National Earthquake Hazard Reduction Program (NEHRP) manual.
The average Vs30 and SPT-N values classify the study area into seismic site classes D and E, indicating that the city is susceptible to site effects and liquefaction. Further, different data set combinations of Vs and SPT-N (corrected and uncorrected) values were used to develop site-specific correlation equations by statistical regression, treating Vs as a function of the SPT-N value (corrected or uncorrected), with or without depth. A probabilistic approach is also presented, developing a correlation using a quantile-quantile (Q-Q) plot. A comparison with well-known published correlations (for all soils) shows that the present correlations agree closely with the other equations; comparatively, the correlation of shear wave velocity with depth and uncorrected SPT-N values provides the more suitable predictive model, and the Q-Q plot agrees with all the other equations. In the absence of in situ measurements, the present correlations can be used to estimate Vs profiles of the study area for site response studies.
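Site-specific Vs-versus-N correlations of this kind are conventionally fit as a power law, Vs = a * N^b, by ordinary least squares in log-log space. A minimal sketch; the data pairs below are synthetic illustrations, not the study's measurements:

```python
import math

def fit_power_law(n_values, vs_values):
    """Fit Vs = a * N**b by linear least squares on (log N, log Vs)."""
    lx = [math.log(n) for n in n_values]
    ly = [math.log(v) for v in vs_values]
    mx, my = sum(lx) / len(lx), sum(ly) / len(ly)
    b = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic pairs generated from Vs = 80 * N**0.33 (noise-free for clarity);
# the fit should recover the generating coefficients.
n = [5, 10, 15, 20, 30, 40]
vs = [80 * x ** 0.33 for x in n]
a, b = fit_power_law(n, vs)
```

With real SPT/MASW pairs the scatter is substantial, which is why the text compares corrected versus uncorrected N and with- versus without-depth variants before selecting a preferred model.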
Abstract:
With the renewed interest in vector-like fermion extensions of the Standard Model, we present here a study of multiple vector-like theories and their phenomenological implications. Our focus is mostly on minimal flavor conserving theories that couple the vector-like fermions to the SM gauge fields and mix only weakly with SM fermions so as to avoid flavor problems. We present calculations for precision electroweak and vector-like state decays, which are needed to investigate compatibility with currently known data. We investigate the impact of vector-like fermions on Higgs boson production and decay, including loop contributions, in a wide variety of vector-like extensions and their parameter spaces.
Abstract:
We consider the possibility that the heavier CP-even Higgs boson (H0) in the minimal supersymmetric standard model (MSSM) decays invisibly into neutralinos, in light of the recent discovery of the 126 GeV resonance at the CERN Large Hadron Collider (LHC). For this purpose we consider the MSSM with universal, nonuniversal, and arbitrary boundary conditions on the supersymmetry-breaking gaugino mass parameters at the grand unified scale. Typically, scenarios with universal and nonuniversal gaugino masses do not allow invisible decays of the lightest Higgs boson (h0), identified with the 126 GeV resonance, into the lightest neutralinos; with arbitrary gaugino masses at the grand unified scale, such an invisible decay is possible. The second lightest Higgs boson can decay into various invisible final states over a considerable region of the MSSM parameter space with arbitrary gaugino masses, as well as with gaugino masses restricted by universal and nonuniversal boundary conditions at the grand unified scale, though invisible decays are more likely for arbitrary gaugino masses. The decay of the heavier Higgs bosons into lighter particles leads to the intriguing possibility that the entire Higgs boson spectrum of the MSSM may be visible at the LHC during searches for an extended Higgs sector, even when some states decay invisibly. In such a scenario, the nonobservation of the extended Higgs sector of the MSSM must be used with care when ruling out regions of the MSSM parameter space at the LHC.
Abstract:
Abstract: This paper attempts to identify an exogenous cause of the deterioration, from 2005 onward, in the mortgage lending standards that contributed to the subprime crisis in the United States. We argue that the new means-test provision of the Bankruptcy Abuse Prevention and Consumer Protection Act (BAPCPA) of 2005 was that exogenous shock to the mortgage market. We show that the means test, which bars debtors with relatively higher incomes from filing for bankruptcy under Chapter 7, shifted the supply of mortgage credit from relatively higher-income to relatively lower-income borrowers. At the same time, we observe that all borrowers had to pay higher interest rates, regardless of income level. Our results imply that BAPCPA may have been a contributing factor in the deterioration of lending standards in the United States mortgage market.