997 results for Fan-Complete Space
Abstract:
We show that the solution published in the paper by Senovilla [Phys. Rev. Lett. 64, 2219 (1990)] is geodesically complete and singularity-free. We also prove that the solution satisfies the stronger energy and causality conditions, such as global hyperbolicity, the strong energy condition, causal symmetry, and causal stability. A detailed discussion of which assumptions in the singularity theorems are not satisfied is presented, and we show explicitly that the solution is in accordance with those theorems. A brief discussion of the results is given.
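For reference, the strong energy condition invoked above can be stated as follows; this is the standard textbook form, not a quotation from the paper:

```latex
% Strong energy condition: for every unit timelike vector u^\mu,
\left(T_{\mu\nu} - \tfrac{1}{2} T\, g_{\mu\nu}\right) u^{\mu} u^{\nu} \ge 0,
% which, through the Einstein field equations (with vanishing cosmological
% constant), is equivalent to the timelike convergence condition
R_{\mu\nu}\, u^{\mu} u^{\nu} \ge 0 .
```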
Abstract:
Sustainable resource use is one of the most important environmental issues of our times. It is closely related to discussions on the 'peaking' of various natural resources serving as energy sources, agricultural nutrients, or metals indispensable in high-technology applications. Although the peaking theory remains controversial, it is commonly recognized that a more sustainable use of resources would alleviate negative environmental impacts related to resource use. In this thesis, sustainable resource use is analysed from a practical standpoint, through several different case studies. Four of these case studies relate to resource metabolism in the Canton of Geneva in Switzerland: the aim was to model the evolution of chosen resource stocks and flows in the coming decades. The studied resources were copper (a bulk metal), phosphorus (a vital agricultural nutrient), and wood (a renewable resource). In addition, the case of lithium (a critical metal) was analysed briefly in a qualitative manner and from an electric mobility perspective. Beyond the Geneva case studies, this thesis includes a case study on the sustainability of space life support systems, i.e. systems whose aim is to provide the crew of a spacecraft with the necessary metabolic consumables over the course of a mission. Sustainability was again analysed from a resource use perspective. In this case study, the functioning of two different types of life support systems, ARES and BIORAT, was evaluated and compared; these systems represent, respectively, physico-chemical and biological life support systems. Space life support systems could in fact be used as a kind of 'laboratory of sustainability', given that they represent closed and relatively simple systems compared to complex and open terrestrial systems such as the Canton of Geneva. The analysis method chosen for the Geneva case studies was dynamic material flow analysis: dynamic material flow models were constructed for the resources copper, phosphorus, and wood. Besides a baseline scenario, various alternative scenarios (notably involving increased recycling) were also examined. In the case of space life support systems, the methodology of material flow analysis was also employed, but as the data available on the dynamic behaviour of the systems was insufficient, only static simulations could be performed. The results of the Geneva case studies show the following: were resource use to follow population growth, resource consumption would be multiplied by nearly 1.2 by 2030 and by 1.5 by 2080. A complete transition to electric mobility would be expected to increase copper consumption per capita only slightly (+5%), while the lithium demand in cars would increase 350-fold. Phosphorus imports could be decreased by recycling sewage sludge or human urine; however, the health and environmental impacts of these options have yet to be studied. Increasing wood production in the Canton would not significantly decrease the dependence on wood imports, as the Canton's production represents only 5% of total consumption. In the comparison of the space life support systems ARES and BIORAT, BIORAT outperforms ARES in resource use but not in energy use. However, as the systems are dimensioned very differently, it remains questionable whether they can be compared outright.
In conclusion, the use of dynamic material flow analysis can provide useful information for policy makers and strategic decision-making; however, uncertainty in reference data greatly influences the precision of the results. Space life support systems constitute an extreme case of resource-using systems; nevertheless, it is not clear how their example could be of immediate use to terrestrial systems.
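To illustrate the dynamic material flow analysis referred to above, here is a minimal stock-and-flow sketch in Python; all names and parameter values are hypothetical and this is not the model used in the thesis:

```python
# Minimal dynamic material flow sketch (hypothetical parameters, not the thesis model).
# Tracks the in-use stock of a single resource in a region, with inflow driven by
# per-capita consumption and population, and outflow split between recycling and loss.

def simulate_mfa(years, population_0, pop_growth, per_capita_inflow,
                 lifetime_years, recycling_rate):
    stock = 0.0              # in-use stock (tonnes)
    results = []
    population = population_0
    for year in range(years):
        inflow = population * per_capita_inflow          # tonnes/year entering use
        outflow = stock / lifetime_years                 # simple first-order discard
        recycled = recycling_rate * outflow              # returned to the inflow side
        primary_demand = max(inflow - recycled, 0.0)     # demand for virgin/imported material
        stock += inflow - outflow
        results.append((year, population, stock, primary_demand))
        population *= (1.0 + pop_growth)
    return results

if __name__ == "__main__":
    for year, pop, stock, demand in simulate_mfa(
            years=50, population_0=500_000, pop_growth=0.01,
            per_capita_inflow=0.002, lifetime_years=30, recycling_rate=0.3):
        if year % 10 == 0:
            print(f"year {year:2d}: stock={stock:10.0f} t, primary demand={demand:8.0f} t/yr")
```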
Abstract:
With the aim of better understanding avalanche risk in the Catalan Pyrenees, the present work focuses on the analysis of major (or destructive) avalanches. For this purpose, major avalanche cartography was produced through exhaustive photointerpretation of several flights, winter and summer field surveys, and inquiries to the local population. Major avalanche events were used to quantify the magnitude of the episodes during which they occurred, and a Major Avalanche Activity Magnitude Index (MAAMI) was developed. This index is based on the number of major avalanches registered and their estimated frequency in a given time period; hence it quantifies the magnitude of a major avalanche episode or winter. Furthermore, it permits a comparison of the magnitude between major avalanche episodes in a given mountain range, or between mountain ranges, and, for a long enough period, it should allow analysis of temporal trends. Major episodes from winter 1995/96 to 2013/14 were reconstructed, and their magnitude, frequency and extent were assessed. During the last 19 winters, the episodes of January 22-23 and February 6-8, 1996 were those with the highest MAAMI values, followed by January 30-31, 2003, January 29, 2006, and January 24-25, 2014. To analyze the whole twentieth century, a simplified MAAMI was defined in order to attain the same purpose with a less complete dataset. With less accuracy, the same parameters were obtained at winter time resolution throughout the twentieth century. Again, the 1995/96 winter had the highest MAAMI value, followed by the 1971/72, 1974/75 and 1937/38 winter seasons. The analysis of the spatial extent of the different episodes allowed refining the demarcation of nivological regions and improving our knowledge of the atmospheric patterns that cause major episodes and their climatic interpretation. In some cases, the results revealed the importance of considering a major avalanche episode as the outcome of a preparatory period followed by a triggering one.
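The abstract does not state the MAAMI formula. Purely as a hypothetical illustration of an index that grows with the number of registered major avalanches and weights each event by the rarity of its estimated return period, one could write:

```python
# Hypothetical illustration only -- NOT the published MAAMI definition.
# Each registered major avalanche contributes a weight that increases with
# its estimated return period, so rarer events raise the episode magnitude more.
import math

def episode_magnitude(return_periods_years):
    """return_periods_years: estimated return period (years) of each registered major avalanche."""
    return sum(1.0 + math.log10(max(T, 1.0)) for T in return_periods_years)

# Example: five ~10-year events plus one ~100-year event
print(episode_magnitude([10, 10, 10, 10, 10, 100]))  # 5*(1+1) + (1+2) = 13.0
```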
Abstract:
Some properties of generalized canonical systems - special dynamical systems described by a Hamiltonian function linear in the adjoint variables - are applied in determining the solution of the two-dimensional coast-arc problem in an inverse-square gravity field. A complete closed-form solution for Lagrangian multipliers - adjoint variables - is obtained by means of such properties for elliptic, circular, parabolic and hyperbolic motions. Classic orbital elements are taken as constants of integration of this solution in the case of elliptic, parabolic and hyperbolic motions. For circular motion, a set of nonsingular orbital elements is introduced as constants of integration in order to eliminate the singularity of the solution.
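For context, a generalized canonical system in the sense above has a Hamiltonian that is linear in the adjoint variables; in generic notation (chosen here, not taken from the paper):

```latex
% Hamiltonian linear in the adjoint variables \lambda:
H(x, \lambda) = \lambda^{\mathsf{T}} f(x),
% Canonical (Hamilton) equations:
\dot{x} = \frac{\partial H}{\partial \lambda} = f(x), \qquad
\dot{\lambda} = -\frac{\partial H}{\partial x}
             = -\left(\frac{\partial f}{\partial x}\right)^{\!\mathsf{T}} \lambda .
```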
Abstract:
Biofuels for transport are a renewable source of energy that were once heralded as a solution to multiple problems associated with poor urban air quality, the overproduction of agricultural commodities, the energy security of the European Union (EU) and climate change. It was only after the Union had implemented an incentivizing framework of legal and political instruments for the production, trade and consumption of biofuels that the problems of weakening food security, environmental degradation and increasing greenhouse gases through land-use changes began to unfold. In other words, the gap between the political aims for which biofuels are promoted and their consequences has grown – which is also recognized by the EU policy-makers. Therefore, the global networks of producing, trading and consuming biofuels may face a complete restructure if the European Commission accomplishes its pursuit to sideline crop-based biofuels after 2020. My aim with this dissertation is not only to trace the manifold evolutions of the instruments used by the Union to govern biofuels but also to reveal how this evolution has influenced the dynamics of biofuel development. Therefore, I study the ways in which the EU’s legal and political instruments for steering biofuels are co-constitutive with the globalized spaces of biofuel development. My analytical strategy can be outlined through three concepts. I use the term ‘assemblage’ to approach the operations of the loose entity of actors and non-human elements that are the constituents of multi-scalar and -sectorial biofuel development. ‘Topology’ refers to the spatiality of this European biofuel assemblage and its parts, whose evolving relations are treated as the active constituents of space, instead of simply being located in space. I apply the concept of ‘nomosphere’ to characterize the framework of policies, laws and other instruments that the EU applies and construes while attempting to govern biofuels. Even though both the materials and methods vary in the independent articles, these three concepts characterize my analytical strategy that allows me to study law, policy and space in association with each other. The results of my examinations underscore the importance of the EU’s instruments of governance in constituting and stabilizing the spaces of production and, on the other hand, show how topological ruptures in biofuel development have reinforced the need to reform policies. This analysis maps the vast scope of actors that are influenced by the mechanism of EU biofuel governance and, what is more, shows how they are actively engaging in the Union’s institutional policy formulation. By examining the consequences of fast biofuel development that are spatially dislocated from the established spaces of producing, trading and consuming biofuels, such as indirect land-use changes, I unfold the processes not tackled by the instruments of the EU. Indeed, it is these spatially dislocated processes that have pushed the Commission to construe a new mode of governing biofuels: transferring the instruments of climate change mitigation to land-use policies. Although efficient in mitigating these dislocated consequences, these instruments have also created a peculiar ontological scaffolding for governing biofuels. According to this mode of governance, the spatiality of biofuel development appears to be already determined, and the agency that could dampen the negative consequences originating from land-use practices is treated as irrelevant.
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined and the probability density would then represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea would lead to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
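For reference, the Aitchison inner product on the D-part simplex, together with the analogous inner product on densities suggested by the refinement argument above, can be written as follows (one common formulation, with notation chosen here):

```latex
% Aitchison inner product of two compositions x, y in the D-part simplex:
\langle x, y \rangle_a
  = \frac{1}{2D} \sum_{i=1}^{D} \sum_{j=1}^{D}
    \ln\frac{x_i}{x_j}\,\ln\frac{y_i}{y_j},
% and its natural analogue for probability densities f, g on a bounded support \Omega:
\langle f, g \rangle
  = \frac{1}{2\,|\Omega|} \int_{\Omega}\!\!\int_{\Omega}
    \ln\frac{f(s)}{f(t)}\,\ln\frac{g(s)}{g(t)} \, ds\, dt .
```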
Abstract:
Infrared multilayer interference filters have been used extensively in satellite radiometers for about 15 years. Filters manufactured by the University of Reading have been used in Nimbus 5, 6, and 7, TIROS N, and the Pioneer Venus orbiter. The ability of the filters to withstand the space environment in these applications is critical; if degradation takes place, the effects would range from worsening of signal-to-noise performance to complete system failure. An experiment on the LDEF will enable the filters, for the first time, to be subjected to authoritative spectral measurements following space exposure to ascertain their suitability for spacecraft use and to permit an understanding of degradation mechanisms.
Abstract:
Purpose – Computed tomography (CT) for 3D reconstruction entails a huge number of coplanar fan-beam projections for each of a large number of 2D slice images, together with excessive radiation intensities and dosages. For some applications its rate of throughput is also inadequate. A technique for overcoming these limitations is outlined.
Design/methodology/approach – A novel method to reconstruct 3D surface models of objects is presented, using, typically, ten 2D projective images. These images are generated by relative motion between the set of objects and a set of ten fan-beam X-ray sources and sensors, with their viewing axes suitably distributed in 2D angular space.
Findings – The method entails a radiation dosage several orders of magnitude lower than CT and requires far less computational power. Experimental results are given to illustrate the capability of the technique.
Practical implications – The substantially lower cost of the method and, more particularly, its dramatically lower irradiation make it relevant to many applications precluded by current techniques.
Originality/value – The method can be used in many applications such as aircraft hold-luggage screening and 3D industrial modelling and measurement, and it should also have important applications in medical diagnosis and surgery.
Abstract:
Models of the City of London office market are extended by considering a longer time series of data, covering two cycles, and by explicit modeling of asymmetric rental response to supply and demand shocks. A long-run structural model linking demand for office space, real rental levels and office-based employment is estimated, and rental adjustment processes are then modeled using an error correction model framework. Adjustment processes are seen to be asymmetric, dependent both on the direction of the supply and demand shock and on the state of the rental market at the time of the shock. A complete system of equations is estimated: unit shocks produce oscillations, but there is a return to a steady equilibrium state in the long run.
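The rental adjustment framework mentioned above can be illustrated by a generic error correction specification with asymmetric adjustment (a standard textbook form, not the exact equations of the paper):

```latex
% Long-run structural relation between real rents r_t, occupied office space d_t
% and office-based employment e_t:
r_t = \beta_0 + \beta_1 d_t + \beta_2 e_t + u_t,
% Short-run error correction with asymmetric adjustment to the lagged
% disequilibrium u_{t-1}, split by its sign:
\Delta r_t = \gamma_0
  + \gamma^{+}\, u_{t-1}\,\mathbf{1}\{u_{t-1} > 0\}
  + \gamma^{-}\, u_{t-1}\,\mathbf{1}\{u_{t-1} \le 0\}
  + \gamma_1 \Delta d_t + \gamma_2 \Delta e_t + \varepsilon_t .
```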
Abstract:
Operator spaces of Hilbertian JC∗ -triples E are considered in the light of the universal ternary ring of operators (TRO) introduced in recent work. For these operator spaces, it is shown that their triple envelope (in the sense of Hamana) is the TRO they generate, that a complete isometry between any two of them is always the restriction of a TRO isomorphism and that distinct operator space structures on a fixed E are never completely isometric. In the infinite-dimensional cases, operator space structure is shown to be characterized by severe and definite restrictions upon finite-dimensional subspaces. Injective envelopes are explicitly computed.
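For readers unfamiliar with the terminology, the standard definition of a ternary ring of operators (general background, not specific to this paper) is:

```latex
% A ternary ring of operators (TRO) is a norm-closed subspace Z \subseteq B(H, K)
% that is closed under the triple product
\{x, y, z\} = x\, y^{*} z, \qquad \text{i.e. } Z Z^{*} Z \subseteq Z .
```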
Abstract:
Sea surface temperature (SST) datasets have been generated from satellite observations for the period 1991–2010, intended for use in climate science applications. Attributes of the datasets specifically relevant to climate applications are: first, independence from in situ observations; second, effort to ensure homogeneity and stability through the time-series; third, context-specific uncertainty estimates attached to each SST value; and, fourth, provision of estimates of both skin SST (the fundamental measurement, relevant to air-sea fluxes) and SST at standard depth and local time (partly model mediated, enabling comparison with historical in situ datasets). These attributes in part reflect requirements solicited from climate data users prior to and during the project. Datasets consisting of SSTs on satellite swaths are derived from the Along-Track Scanning Radiometers (ATSRs) and Advanced Very High Resolution Radiometers (AVHRRs). These are then used as sole SST inputs to a daily, spatially complete, analysis SST product, with a latitude-longitude resolution of 0.05° and good discrimination of ocean surface thermal features. A product user guide is available, linking to reports describing the datasets’ algorithmic basis, validation results, format, uncertainty information and experimental use in trial climate applications. Future versions of the datasets will span at least 1982–2015, better addressing the need in many climate applications for stable records of global SST that are at least 30 years in length.
Abstract:
This article recovers and contextualizes the politics of British punk fanzines produced in the late 1970s and early 1980s. It argues that fanzines – and youth cultures more generally – provide a contested cultural space for young people to express their ideas, opinions and anxieties. Simultaneously, it maintains that punk fanzines offer the historian a portal into a period of significant socio-economic, political and cultural change. As well as presenting alternative cultural narratives to the formulaic accounts of punk and popular music now common in the mainstream media, fanzines allow us a glimpse of the often radical ideas held by a youthful milieu rarely given expression in the political arena.
Abstract:
In Information Visualization, adding and removing data elements can strongly impact the underlying visual space. We have developed an inherently incremental technique (incBoard) that maintains a coherent disposition of elements from a dynamic multidimensional data set on a 2D grid as the set changes. Here, we introduce a novel layout (incSpace) that uses pairwise similarity from grid neighbors, as defined in incBoard, to reposition elements on the visual space, free from the constraints imposed by the grid. The board continues to be updated and can be displayed alongside the new space. As similar items are placed together, while dissimilar neighbors are moved apart, it supports users in the identification of clusters and subsets of related elements. Densely populated areas identified in the incSpace can be efficiently explored with the corresponding incBoard visualization, which is not susceptible to occlusion. The solution remains inherently incremental and maintains a coherent disposition of elements, even for fully renewed sets. The algorithm considers relative positions for the initial placement of elements, and raw dissimilarity to fine-tune the visualization. It has low computational cost, with complexity depending only on the size of the currently viewed subset, V. Thus, a data set of size N can be sequentially displayed in O(N) time, reaching O(N^2) only if the complete set is simultaneously displayed.
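As a rough, hypothetical sketch of the kind of repositioning step described above (not the published incSpace algorithm), each visible element can be nudged so that its on-screen distance to each grid neighbor approaches their pairwise dissimilarity:

```python
# Hypothetical sketch of a neighbor-driven repositioning step (not the published
# incSpace algorithm): each visible element is moved so that its screen distance
# to each incBoard grid neighbor approaches their pairwise dissimilarity.

import math

def reposition(positions, neighbors, dissimilarity, step=0.1):
    """positions: {id: (x, y)}; neighbors: {id: [neighbor ids from the incBoard grid]};
    dissimilarity: {(id_a, id_b): d} with d >= 0. Returns updated positions."""
    new_positions = {}
    for item, (x, y) in positions.items():
        dx = dy = 0.0
        for other in neighbors.get(item, []):
            ox, oy = positions[other]
            dist = math.hypot(ox - x, oy - y) or 1e-9
            target = dissimilarity.get((item, other),
                                       dissimilarity.get((other, item), dist))
            # Move along the connecting direction until distance matches dissimilarity.
            scale = (dist - target) / dist
            dx += scale * (ox - x)
            dy += scale * (oy - y)
        new_positions[item] = (x + step * dx, y + step * dy)
    return new_positions
```

The cost of each step depends only on the elements currently handled, consistent with the complexity claim in the abstract.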
Abstract:
Let Q^4(c) be a four-dimensional space form of constant curvature c. In this paper we show that the infimum of the absolute value of the Gauss-Kronecker curvature of a complete minimal hypersurface in Q^4(c), c <= 0, whose Ricci curvature is bounded from below, is equal to zero. Further, we study the connected minimal hypersurfaces M^3 of a space form Q^4(c) with constant Gauss-Kronecker curvature K. For the case c <= 0, we prove, by a local argument, that if K is constant, then K must be equal to zero. We also present a classification of complete minimal hypersurfaces of Q^4(c) with K constant.
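For context, the Gauss-Kronecker curvature of a hypersurface M^3 in a four-dimensional space form is the determinant of its shape operator (standard definition, notation chosen here):

```latex
% Gauss-Kronecker curvature of a hypersurface M^3 \subset Q^4(c):
K = \det A = k_1 k_2 k_3,
% where A is the shape (Weingarten) operator and k_1, k_2, k_3 are the
% principal curvatures; minimality means k_1 + k_2 + k_3 = 0.
```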
Abstract:
In this work we compared the estimates of the parameters of ARCH models using a complete Bayesian method and an empirical Bayesian method, in which we adopted a non-informative prior distribution and an informative prior distribution, respectively. We also considered a reparameterization of those models in order to map the parameter space into the real space. This procedure permits choosing normal prior distributions for the transformed parameters. The posterior summaries were obtained using Markov chain Monte Carlo (MCMC) methods. The methodology was evaluated by considering the Telebras series from the Brazilian financial market. The results show that the two methods are able to fit ARCH models with different numbers of parameters. The empirical Bayesian method provided a more parsimonious model and a better fit to the data than the complete Bayesian method.
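As an illustration of the reparameterization idea (the specific transform used in the paper is not given here; this is one common choice), an ARCH(1) model with positivity and stationarity constraints can be mapped to unconstrained real parameters via log and logit transforms, which then admit normal priors:

```latex
% ARCH(1) model for returns y_t:
y_t = \sigma_t \epsilon_t, \qquad \epsilon_t \sim N(0,1), \qquad
\sigma_t^2 = \alpha_0 + \alpha_1 y_{t-1}^2,
\quad \alpha_0 > 0, \; 0 \le \alpha_1 < 1 .
% One common reparameterization onto the real line:
\phi_0 = \ln \alpha_0, \qquad
\phi_1 = \ln\!\frac{\alpha_1}{1-\alpha_1} .
```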