915 results for state-space methods


Relevance:

40.00%

Publisher:

Abstract:

As James Scott’s Seeing Like a State attests, forests played a central role in the rise of the modern state, specifically as test spaces for evolving methods of managing state resources at a distance, and as the location for grand state schemes. Together, such ambitions necessitated both the elimination of local understandings of forest management – to be replaced by centrally controlled scientific precision – and a narrowing of state vision. Forests thus began to be conflated with trees (and their timber) alone. All other aspects of the forest, both human and non-human, were ignored. Through the lens of the 18th and early 19th century New Forest in southern England, this paper examines the impact of government attempts to shift the focus of state forests from being remnant medieval hunting spaces to spaces of income generation through the creation of vast sylvicultural plantations. This state scheme not only reworked the relationship between the metropole and the provinces – something effected through systematic surveys and novel bureaucratic procedures – but also dramatically impacted upon the biophysical and cultural geographies of the forest. By equating forest space with trees alone, the British state failed to legislate for the actions of both local commoners and non-human others in resisting their schemes. Indeed, subsequent oppositions proved not only the tenacity of commoners in protecting their livelihoods but also the destructive power of non-human actants, specifically rabbits and mice. The paper concludes that grand state schemes necessarily fail due to their own internal illogic: the narrowing of state vision creates blind spots in which human and non-human lives assert their own visions.

We propose a hybrid approach to the experimental assessment of the genuine quantum features of a general system consisting of microscopic and macroscopic parts. We infer entanglement by combining dichotomic measurements on a bidimensional system and phase-space inference through the Wigner distribution associated with the macroscopic component of the state. As a benchmark, we investigate the feasibility of our proposal in a bipartite-entangled state composed of a single-photon and a multiphoton field. Our analysis shows that, under ideal conditions, maximal violation of a Clauser-Horne-Shimony-Holt-based inequality is achievable regardless of the number of photons in the macroscopic part of the state. The difficulty in observing entanglement when losses and detection inefficiency are included can be overcome by using a hybrid entanglement witness that allows efficient correction for losses in the few-photon regime.
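
For reference, the Clauser-Horne-Shimony-Holt combination underlying the inequality mentioned above has the standard textbook form (not specific to this paper):

```latex
S = E(a,b) + E(a,b') + E(a',b) - E(a',b'), \qquad
|S| \le 2 \ \text{(local hidden variables)}, \qquad
|S|_{\max} = 2\sqrt{2} \ \text{(quantum mechanics, Tsirelson bound)}
```

"Maximal violation" in the abstract thus means reaching the quantum bound S = 2√2, independent of the photon number in the macroscopic part.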

Stationary solutions to the equations of nonlinear diffusive shock acceleration play a fundamental role in the theory of cosmic-ray acceleration. Their existence usually requires that a fraction of the accelerated particles be allowed to escape from the system. Because the scattering mean free path is thought to be an increasing function of energy, this condition is conventionally implemented as an upper cutoff in energy space: particles are then permitted to escape from any part of the system once their energy exceeds this limit. However, because accelerated particles are responsible for the substantial amplification of the ambient magnetic field in a region upstream of the shock front, we examine an alternative approach in which particles escape over a spatial boundary. We use a simple iterative scheme that constructs stationary numerical solutions to the coupled kinetic and hydrodynamic equations. For parameters appropriate for supernova remnants, we find stationary solutions with efficient acceleration when the escape boundary is placed at the point where growth and advection of strongly driven nonresonant waves are in balance. We also present the energy dependence of the distribution function close to the energy where it cuts off: a diagnostic that is in principle accessible to observation.
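
For context, these nonlinear solutions generalise the standard test-particle result of diffusive shock acceleration, in which the downstream momentum distribution is a power law set by the shock compression ratio:

```latex
f(p) \propto p^{-3r/(r-1)}, \qquad r = \frac{u_1}{u_2}, \qquad
r = 4 \ (\text{strong shock}) \ \Rightarrow \ f(p) \propto p^{-4}
```

(equivalently N(E) ∝ E⁻² for relativistic particles). The nonlinear, escape-limited solutions discussed in the abstract modify both this slope and the shape of the cutoff.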

Ion acceleration driven by high intensity laser pulses is attracting an impressive and steadily increasing research effort. Experiments over the past 10-15 years have demonstrated, over a wide range of laser and target parameters, the generation of multi-MeV proton and ion beams with unique properties, which have stimulated interest in a number of innovative applications. While most of this work has been based on sheath acceleration processes, in which space-charge fields are established by relativistic electrons at the surfaces of the irradiated target, a number of novel mechanisms have been the focus of recent theoretical and experimental activities. This paper provides a brief review of the state of the art in the field of laser-driven ion acceleration, with particular attention to recent developments.

The study of variable stars is an important topic in modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating at the petabyte scale. This volume of data demands automated methods alongside human experts. This thesis is devoted to the analysis of astronomical time series data on variable stars and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and arises for various reasons. In some cases the variation is due to internal thermonuclear processes; these are known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherically active stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables, such as novae and supernovae, occur rarely and are not periodic phenomena. Most other variations are periodic in nature. Variable stars can be observed in many ways, including photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data is folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.

One way to identify the type of a variable star and to classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modelling and classification. Modelling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (for example, the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with other derived parameters. Of these, period is the most important, since a wrong period leads to sparse phased light curves and misleading information. Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. For ground-based observations this is due to the daily daylight cycle and weather conditions, while observations from space may suffer from the impact of cosmic-ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.

The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which assume no statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can arise for several reasons: power leakage to other frequencies, caused by the finite total interval, finite sampling interval and finite amount of data; aliasing, caused by regular sampling; spurious periods caused by long gaps; and power flow to harmonic frequencies, an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series remains a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, “Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial”. Derekas et al. (2007) and Deb et al. (2010) state that “The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification”.

It will benefit the variable star astronomical community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories of four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the “General Catalogue of Variable Stars” or other databases such as the “Variable Star Index”, the characteristics of the variability have to be quantified in terms of variable star parameters.
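
The phase-folding and period-search ideas above can be sketched in code. The following is a minimal, illustrative implementation of phase folding plus a Stellingwerf-style Phase Dispersion Minimisation on synthetic data; the binning scheme, synthetic light curve and parameter choices are assumptions for illustration, not the thesis's actual pipeline.

```python
import numpy as np

def fold(t, period):
    """Phase-fold observation times on a trial period (phases in [0, 1))."""
    return (t / period) % 1.0

def pdm_statistic(t, mag, period, n_bins=10):
    """Stellingwerf-style statistic: pooled within-bin variance of the
    folded light curve divided by the overall variance. Values near 0
    indicate a good trial period; values near 1, a poor one."""
    phase = fold(t, period)
    overall = np.var(mag, ddof=1)
    num, dof = 0.0, 0
    for b in range(n_bins):
        in_bin = mag[(phase >= b / n_bins) & (phase < (b + 1) / n_bins)]
        if len(in_bin) > 1:
            num += (len(in_bin) - 1) * np.var(in_bin, ddof=1)
            dof += len(in_bin) - 1
    return (num / dof) / overall

# Unevenly sampled, noise-free synthetic "light curve" with true period 0.7 d
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 30.0, 400))
mag = 12.0 + 0.3 * np.sin(2.0 * np.pi * t / 0.7)

# Scan a grid of trial periods and keep the one with minimal dispersion
trial_periods = np.linspace(0.5, 1.0, 2001)   # step 0.00025 d, includes 0.7
theta = [pdm_statistic(t, mag, p) for p in trial_periods]
best_period = trial_periods[int(np.argmin(theta))]
```

Because PDM makes no assumption about the light-curve shape, it is one of the non-parametric methods the thesis compares; real survey data would additionally require handling of measurement errors, aliases and long gaps.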

The vibrational energy levels of diazocarbene (diazomethylene) in its electronic ground state, X̃ ³Σ⁻ CNN, have been predicted using the variational method. The potential energy surfaces of X̃ ³A″ CNN were determined by employing ab initio single-reference coupled cluster with single and double excitations (CCSD), CCSD with perturbative triple excitations [CCSD(T)], multi-reference complete active space self-consistent-field (CASSCF), and internally contracted multi-reference configuration interaction (ICMRCI) methods. The correlation-consistent polarised valence quadruple zeta (cc-pVQZ) basis set was used. Four sets of vibrational energy levels determined from the four distinct analytical potential functions have been compared with the experimental values from the laser-induced fluorescence measurements of Wurfel et al. obtained in 1992. The CCSD, CCSD(T), and CASSCF potentials did not provide satisfactory agreement with the experimental observations. In this light, the importance of both non-dynamic (static) and dynamic correlation effects in describing the ground state of CNN is emphasised. Our best theoretical fundamental frequencies at the cc-pVQZ ICMRCI level of theory, ν₁ = 1230, ν₂ = 394, and ν₃ = 1420 cm⁻¹, are in excellent agreement with the experimental values of ν₁ = 1235, ν₂ = 396, and ν₃ = 1419 cm⁻¹, and the mean absolute deviation between the 23 calculated and experimental vibrational energy levels is only 7.4 cm⁻¹. It is shown that the previously suggested observation of the ν₃ frequency at about 2847 cm⁻¹ was in fact the first overtone 2ν₃.
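
The reassignment at the end is numerically natural: for a nearly harmonic mode, the first overtone lies close to twice the fundamental,

```latex
2\nu_3 \approx 2 \times 1419\ \mathrm{cm^{-1}} = 2838\ \mathrm{cm^{-1}}
```

which is within about 9 cm⁻¹ (an anharmonicity-sized shift) of the band observed near 2847 cm⁻¹, whereas it is far from the true fundamental at 1419 cm⁻¹.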

This paper describes advances in ground-based thermodynamic profiling of the lower troposphere through sensor synergy. The well-documented integrated profiling technique (IPT), which uses a microwave profiler, a cloud radar, and a ceilometer to simultaneously retrieve vertical profiles of temperature, humidity, and liquid water content (LWC) of nonprecipitating clouds, is further developed toward an enhanced performance in the boundary layer and lower troposphere. A more accurate temperature profile is accomplished by including an elevation-scanning measurement mode of the microwave profiler. Height-dependent RMS accuracies of temperature (humidity) ranging from 0.3 to 0.9 K (0.5–0.8 g m−3) in the boundary layer are derived from retrieval simulations and confirmed experimentally with measurements at distinct heights taken during the 2005 International Lindenberg Campaign for Assessment of Humidity and Cloud Profiling Systems and its Impact on High-Resolution Modeling (LAUNCH) of the German Weather Service. Temperature inversions, especially in the lower boundary layer, are captured in a very satisfactory way by using the elevation-scanning mode. To improve the quality of liquid water content measurements in clouds, the authors incorporate a sophisticated target classification scheme developed within the European cloud observing network CloudNet. It allows detailed discrimination between the different types of backscatterers detected by cloud radar and ceilometer. Finally, to allow IPT application also to drizzling cases, an LWC profiling method is integrated. This technique classifies the detected hydrometeors into three size classes using thresholds determined by radar reflectivity and/or ceilometer extinction profiles. By inclusion into IPT, the retrieved profiles are made consistent with the measurements of the microwave profiler and an LWC a priori profile.
Results of IPT application to 13 days of the LAUNCH campaign are analyzed, and the importance of integrated profiling for model evaluation is underlined.

Attempts to estimate photosynthetic rate or gross primary productivity from remotely sensed absorbed solar radiation depend on knowledge of the light use efficiency (LUE). Early models assumed LUE to be constant, but most researchers now try to adjust it for variations in temperature and moisture stress. However, more exact methods are now required. Hyperspectral remote sensing offers the possibility of sensing changes in the xanthophyll cycle, which is closely coupled to photosynthesis. Several studies have shown that an index (the photochemical reflectance index) based on the reflectance at 531 nm is strongly correlated with the LUE over hours, days and months. A second hyperspectral approach relies on the remote detection of fluorescence, which is directly related to the efficiency of photosynthesis. We discuss the state of the art of the two approaches. Both have been demonstrated to be effective, but we specify seven conditions required before the methods can become operational.
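
The photochemical reflectance index mentioned above is a simple normalised-difference formula. A minimal sketch follows, assuming the conventional 570 nm reference band of the standard (Gamon-style) definition; the abstract itself names only the 531 nm band.

```python
def pri(r531: float, r570: float) -> float:
    """Photochemical reflectance index: normalised difference of the
    reflectance at 531 nm (xanthophyll-sensitive band) and a reference
    band at 570 nm (conventional choice, assumed here)."""
    return (r531 - r570) / (r531 + r570)

# Hypothetical reflectance values, for illustration only
value = pri(0.04, 0.05)
```

Because the index is a ratio of nearby bands, it is largely insensitive to illumination level, which is what makes it usable as a proxy for LUE across hours to months.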

Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol, driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods of calculating wetland size and location: some simulate wetland area prognostically, others rely on remotely sensed inundation datasets, and some take an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrates extensive disagreement in simulated wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that determine wetland area independently. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area.
In response to increasing global temperatures (+3.4 °C, globally spatially uniform), the models on average decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9%, globally spatially uniform), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently lack wetland methane observation datasets adequate to evaluate model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate due to extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
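
As a quick arithmetic check on the ±40% inter-model spread quoted above, the absolute range it implies around the all-model mean of 190 Tg CH4 yr−1 (the only figure taken from the abstract) is:

```python
mean_emission = 190.0                  # Tg CH4 per year, all-model mean (from text)
half_spread = 0.40 * mean_emission     # ±40 % of the mean
low, high = mean_emission - half_spread, mean_emission + half_spread
# low = 114.0, high = 266.0  (Tg CH4 per year)
```

A factor-of-two-plus range between the lowest and highest models illustrates why the fourth conclusion stresses both parameter and structural uncertainty.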