898 results for State-space
Abstract:
The objective of the work presented in this thesis report was the study and simulation of bistatic radar experiments for planetary exploration missions. In particular, the work focused on the use and improvement of a software simulator previously developed by a consortium of companies and research institutions within a European Space Agency (ESA) study funded in 2008 and carried out between 2009 and 2010. The Spanish company GMV coordinated the study, in which research groups from the Sapienza University of Rome and the University of Bologna also took part. The work concentrated on determining the cause of some inconsistencies in the outputs of the part of the simulator, developed in MATLAB, devoted to estimating the characteristics of Titan's surface, in particular its dielectric constant and mean surface roughness, by means of a downlink bistatic radar experiment performed by the Cassini-Huygens probe in orbit around Titan. Bistatic radar experiments for the study of celestial bodies have been part of the history of space exploration since the 1960s, although the equipment used, and the mission phases during which these experiments were carried out, were never specifically designed for the purpose. Hence the need for a simulator to study the various possible configurations of bistatic radar experiments in different types of missions. In a first phase of approaching the simulator, the work focused on studying the documentation accompanying the code, in order to gain a general idea of its structure and operation. This was followed by a phase of detailed study, determining the purpose of every line of code used, as well as verifying against the literature the formulas and models used to determine the various parameters.
In a second phase, the work involved direct intervention on the code through a series of investigations aimed at assessing the consistency and reliability of its results. Each investigation progressively relaxed the simplifying assumptions imposed on the model, so as to identify with greater confidence the part of the code responsible for the inaccuracy of the simulator's outputs. The results obtained allowed the correction of some parts of the code and the identification of the main source of error in the outputs, narrowing down the object of study for future targeted investigations.
Abstract:
The US penitentiary at Lewisburg, Pennsylvania, was retrofitted in 2008 to offer the country's first federal Special Management Unit (SMU) program of its kind. This model SMU is designed for federal inmates from around the country identified as the most intractably troublesome, and features double-celling of inmates in tiny spaces, in 23- or 24-hour-a-day lockdown, requiring them to pass through a two-year program of readjustment. These spatial tactics, and the philosophy of punishment underlying them, contrast with the modern reform ideals upon which the prison was designed and built in 1932. The SMU represents the latest punitive phase in American penology, one that neither simply eliminates men as in the premodern spectacle, nor creates the docile, rehabilitated bodies of the modern panopticon; rather, it is a late-modern structure that produces only fear, terror, violence, and death. This SMU represents the latest of the late-modern prisons, similar to other supermax facilities in the US but offering its own unique system of punishment as well. While the prison exists within the system of American law and jurisprudence, it also manifests features of Agamben's lawless, camp-like space that emerges during a state of exception, exempt from outside scrutiny, with inmate treatment typically beyond the scope of the law.
Abstract:
Model-based calibration of steady-state engine operation is commonly performed with highly parameterized empirical models that are accurate but not very robust, particularly when predicting highly nonlinear responses such as diesel smoke emissions. To address this problem, and to boost the accuracy of more robust non-parametric methods to the same level, GT-Power was used to transform the empirical model input space into multiple input spaces that simplified the input-output relationship and improved the accuracy and robustness of smoke predictions made by three commonly used empirical modeling methods: Multivariate Regression, Neural Networks and the k-Nearest Neighbor method. The availability of multiple input spaces allowed the development of two committee techniques: a "Simple Committee" technique that used averaged predictions from a set of 10 pre-selected input spaces chosen by the training data, and a "Minimum Variance Committee" technique where the input spaces for each prediction were chosen on the basis of disagreement between the three modeling methods. This latter technique equalized the performance of the three modeling methods. The successively increasing improvements resulting from the use of a single best transformed input space (Best Combination Technique), the Simple Committee Technique and the Minimum Variance Committee Technique were verified with hypothesis testing. The transformed input spaces were also shown to improve outlier detection and to improve k-Nearest Neighbor performance when predicting dynamic emissions with steady-state training data. An unexpected finding was that the benefits of input space transformation were unaffected by changes in the hardware or the calibration of the underlying GT-Power model.
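The "Simple Committee" idea described above — averaging the predictions obtained in several transformed input spaces — can be sketched in a few lines. This is a minimal illustration with a toy k-Nearest Neighbor model and hypothetical data; the input-space transformations themselves (produced by GT-Power in the study) are assumed to be given, and this is not the authors' actual pipeline:

```python
def knn_predict(train_X, train_y, query, k=3):
    """Predict by averaging the targets of the k nearest training points
    (squared Euclidean distance)."""
    order = sorted(range(len(train_X)),
                   key=lambda i: sum((a - b) ** 2
                                     for a, b in zip(train_X[i], query)))
    return sum(train_y[i] for i in order[:k]) / k

def simple_committee(input_spaces, train_y, queries, k=3):
    """'Simple Committee': average the k-NN predictions obtained in each
    transformed input space (one transformed query point per space)."""
    preds = [knn_predict(X, train_y, q, k)
             for X, q in zip(input_spaces, queries)]
    return sum(preds) / len(preds)
```

The "Minimum Variance Committee" would instead select, per prediction, the input spaces where the different modeling methods disagree least before averaging.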
Abstract:
Nearly 22 million Americans operate as shift workers, and shift work has been linked to the development of cardiovascular disease (CVD). This study aimed to identify pivotal risk factors of CVD by assessing 24-hour ambulatory blood pressure, state anxiety levels and sleep patterns in 12-hour fixed shift workers. We hypothesized that night shift work would negatively affect blood pressure regulation, anxiety levels and sleep patterns. A total of 28 subjects (ages 22-60) were divided into two groups: 12-hour fixed night shift workers (n=15) and 12-hour fixed day shift workers (n=13). 24-hour ambulatory blood pressure measurements (Space Labs 90207) were taken twice: once during a regular work day and once on a non-work day. State anxiety levels were assessed on both test days using Spielberger's State-Trait Anxiety Inventory. Total sleep time (TST) was determined using self-recorded sleep diaries. Night shift workers demonstrated increases in 24-hour systolic (122 ± 2 to 126 ± 2 mmHg, P=0.012), diastolic (75 ± 1 to 79 ± 2 mmHg, P=0.001) and mean arterial pressures (90 ± 2 to 94 ± 2 mmHg, P<0.001) during work days compared to off days. In contrast, 24-hour blood pressures were similar during work and off days in day shift workers. Night shift workers reported less TST on work days versus off days (345 ± 16 vs. 552 ± 30 min; P<0.001), whereas day shift workers reported similar TST during work and off days (475 ± 16 vs. 437 ± 20 min; P=0.231). State anxiety scores did not differ between the groups or testing days (time*group interaction P=0.248), suggesting the increased 24-hour blood pressure during night shift work is related to decreased TST, not short-term anxiety. Our findings suggest that fixed night shift work causes disruption of the normal sleep-wake cycle, negatively affecting acute blood pressure regulation, which may increase the long-term risk for CVD.
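The reported pressures are internally consistent with the standard estimate MAP ≈ DBP + (SBP − DBP)/3. A quick check against the values quoted in the abstract (illustrative arithmetic only, not part of the study's analysis):

```python
def mean_arterial_pressure(systolic, diastolic):
    """Standard estimate: MAP = DBP + (SBP - DBP) / 3, all in mmHg."""
    return diastolic + (systolic - diastolic) / 3.0

# Night-shift work days: 126/79 mmHg -> ~94.7 mmHg, consistent with the
# reported 94 +/- 2 mmHg; off days: 122/75 mmHg -> ~90.7 vs. reported 90.
work_day = mean_arterial_pressure(126, 79)
off_day = mean_arterial_pressure(122, 75)
```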
Abstract:
In recent years, Brazil has been mentioned internationally as one of the so-called BRICs (Brazil, Russia, India and China). These countries have been taking up increasing space in the global economic and political scenarios of the 21st century. The facts that they possess vast territories and stand among the most populous countries increase their relevance within the United Nations. Besides, three of them are nuclear powers and two of them belong to the United Nations Security Council. Brazil has participated significantly in forums such as the WTO and the UN, representing central political articulation and stability for Latin America and in the structuring and growth of MERCOSUL (Brazil, Argentina, Uruguay, Paraguay and Venezuela). Once again among the ten greatest economies of the world, the country has launched ambitious poverty-fighting programs that have helped more than 20 million people in recent years, such as the "Bolsa Família" (family allowance) program and its complements. Nevertheless, Latin American countries are far from generating structural funds such as the European Social Fund to assist the specific demands of big cities such as Sao Paulo and Buenos Aires. The commitments are restricted to commercial areas and bring nothing but slow and scarce advances to education, infrastructure and the integration of systems related to these areas.
Abstract:
Contents: Finding amusement in class; Career fair presents jobs to ISU students; ISU signs 21 new recruits for next season; Why do we fight greed?; 50 bands, 15 hours, one Space
Abstract:
Species coexistence has been a fundamental issue for understanding ecosystem functioning since the beginnings of ecology as a science. The search for a reliable and all-encompassing explanation of this issue has become a complex goal with several apparently opposing trends. On the other hand, seemingly unconnected with species coexistence, an ecological state equation based on the inverse correlation between an indicator of dispersal that fits a gamma distribution and species diversity has recently been developed. This article explores two factors, whose effects on this equation are inconspicuous at first sight, that are used to develop an alternative general theoretical background in order to provide a better understanding of species coexistence. Our main outcomes are: (i) the fit of dispersal and diversity values to a gamma distribution is an important factor that promotes species coexistence, mainly due to the right-skewed character of the gamma distribution; (ii) the opposite correlation between species diversity and dispersal implies that any increase of diversity is equivalent to a route of "ecological cooling" whose maximum limit should be constrained by the influence of the third law of thermodynamics; this is in agreement with the well-known asymptotic trend of diversity values in space and time; (iii) there are plausible empirical and theoretical ways to apply physical principles to explain important ecological processes; (iv) the gap between theoretical and empirical ecology in those cases where species diversity is paradoxically high could be narrowed by a wave model of species coexistence based on the concurrency of local equilibrium states. In such a model, competitive exclusion has a limited but indispensable role in harmonious coexistence with functional redundancy. We analyze several literature references as well as ecological and evolutionary examples that support our approach, reinforcing the equivalence in meaning between important physical and ecological principles.
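The "right-skewed character" invoked in point (i) is a standard property of the gamma distribution: its skewness equals 2/√k for shape parameter k, hence it is strictly positive for every k > 0 and only vanishes in the limit k → ∞. A minimal check of that closed form (illustrative only; this is not the authors' state equation):

```python
import math

def gamma_skewness(k):
    """Skewness of a gamma distribution with shape parameter k > 0.
    The closed form 2/sqrt(k) is positive for every k, i.e. every gamma
    distribution is right-skewed; it only symmetrizes as k -> infinity."""
    if k <= 0:
        raise ValueError("shape parameter must be positive")
    return 2.0 / math.sqrt(k)
```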
Abstract:
Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods to calculate wetland size and location, with some models simulating wetland area prognostically, while other models relied on remotely sensed inundation datasets, or an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrates extensive disagreement in their simulations of wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (+3.4 °C, globally spatially uniform), on average the models decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (+3.9%, globally spatially uniform), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently lack wetland methane observation datasets adequate for evaluating model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate due to extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
Abstract:
This paper reviews developments in our understanding of the state of the Antarctic and Southern Ocean climate and its relation to the global climate system over the last few millennia. Climate over this and earlier periods has not been stable, as evidenced by the occurrence of abrupt changes in atmospheric circulation and temperature recorded in Antarctic ice core proxies for past climate. Two of the most prominent abrupt climate change events are characterized by intensification of the circumpolar westerlies (also known as the Southern Annular Mode) between ~6000 and 5000 years ago and since 1200-1000 years ago. Following the last of these is a period of major trans-Antarctic reorganization of atmospheric circulation and temperature between A.D. 1700 and 1850. The two earlier Antarctic abrupt climate change events appear linked to, but predate by several centuries, even more abrupt climate change in the North Atlantic, and the end of the more recent event is coincident with reorganization of atmospheric circulation in the North Pacific. Improved understanding of such events and of the associations between abrupt climate change events recorded in both hemispheres is critical to predicting the impact and timing of future abrupt climate change events potentially forced by anthropogenic changes in greenhouse gases and aerosols. Special attention is given to the climate of the past 200 years, which was recorded by a network of recently available shallow firn cores, and to that of the past 50 years, which was monitored by the continuous instrumental record. Significant regional climate changes have taken place in the Antarctic during the past 50 years. Atmospheric temperatures have increased markedly over the Antarctic Peninsula, linked to nearby ocean warming and intensification of the circumpolar westerlies. Glaciers are retreating on the peninsula, in Patagonia, on the sub-Antarctic islands, and in West Antarctica adjacent to the peninsula. The penetration of marine air masses has become more pronounced over parts of West Antarctica. Above the surface, the Antarctic troposphere has warmed during winter while the stratosphere has cooled year-round. The upper kilometer of the circumpolar Southern Ocean has warmed, Antarctic Bottom Water across a wide sector off East Antarctica has freshened, and the densest bottom water in the Weddell Sea has warmed. In contrast to these regional climate changes, over most of Antarctica, near-surface temperature and snowfall have not increased significantly during at least the past 50 years, and proxy data suggest that the atmospheric circulation over the interior has remained in a similar state for at least the past 200 years. Furthermore, the total sea ice cover around Antarctica has exhibited no significant overall change since reliable satellite monitoring began in the late 1970s, despite large but compensating regional changes. The inhomogeneity of Antarctic climate in space and time implies that recent Antarctic climate changes are due on the one hand to a combination of strong multidecadal variability and anthropogenic effects and, as demonstrated by the paleoclimate record, on the other hand to multidecadal to millennial scale and longer natural variability forced through changes in orbital insolation, greenhouse gases, solar variability, ice dynamics, and aerosols. Model projections suggest that over the 21st century the Antarctic interior will warm by 3.4 ± 1 °C, and sea ice extent will decrease by ~30%. Ice sheet models are not yet adequate to answer pressing questions about the effect of projected warming on mass balance and sea level. Considering the potentially major impacts of a warming climate on Antarctica, vigorous efforts are needed to better understand all aspects of the highly coupled Antarctic climate system as well as its influence on the Earth's climate and oceans.
Abstract:
The preparations, X-ray structures, and magnetic characterizations are presented for two new pentadecanuclear cluster compounds: [NiII{NiII(MeOH)3}8(μ-CN)30{MV(CN)3}6]·xMeOH·yH2O (MV = MoV (1) with x = 17, y = 1; MV = WV (2) with x = 15, y = 0). Both compounds crystallize in the monoclinic space group C2/c, with cell dimensions of a = 28.4957(18) Å, b = 19.2583(10) Å, c = 32.4279(17) Å, β = 113.155(6)°, and Z = 4 for 1 and a = 28.5278(16) Å, b = 19.2008(18) Å, c = 32.4072(17) Å, β = 113.727(6)°, and Z = 4 for 2. The structures of 1 and 2 consist of neutral cluster complexes comprising 15 metal ions, 9 NiII and 6 MV, all linked by μ-cyano ligands. Magnetic susceptibility and magnetization measurements of compounds 1 and 2 in the crystalline and dissolved state indicate that these clusters have an S = 12 ground state, originating from intracluster ferromagnetic exchange interactions between the μ-cyano-bridged metal ions of the type NiII−NC−MV. Indeed, these data show clearly that the cluster molecules stay intact in solution. Ac magnetic susceptibility measurements reveal that the cluster compounds exhibit magnetic susceptibility relaxation phenomena at low temperatures since, with nonzero dc fields, χ′′M has a nonzero value that is frequency dependent. However, there appears no out-of-phase (χ′′M) signal in zero dc field down to 1.8 K, which excludes the expected signature for a single-molecule magnet. This finding is confirmed by the small uniaxial magnetic anisotropy value for D of 0.015 cm−1, deduced from the high-field, high-frequency EPR measurement, which distinctly reveals a positive sign in D. Obviously, the overall magnetic anisotropy of the compounds is too low, and this may be a consequence of a small single-ion magnetic anisotropy combined with the highly symmetric arrangement of the metal ions in the cluster molecule.
Abstract:
The currently proposed space debris remediation measures include the active removal of large objects and "just in time" collision avoidance by deviating the objects using, e.g., ground-based lasers. Both techniques require precise knowledge of the attitude state and state changes of the target objects: in the former case, to devise methods to grapple the target by a tug spacecraft; in the latter, to precisely propagate the orbits of potential collision partners, as disturbing forces like air drag and solar radiation pressure depend on the attitude of the objects. Non-resolving optical observations of the magnitude variations, so-called light curves, are a promising technique to determine rotation or tumbling rates and the orientations of the actual rotation axis of objects, as well as their temporal changes. The 1-meter telescope ZIMLAT of the Astronomical Institute of the University of Bern has been used to collect light curves of MEO and GEO objects for a considerable period of time. Recently, light curves of Low Earth Orbit (LEO) targets were acquired as well. We present different observation methods, including active tracking using a CCD subframe readout technique, and the use of a high-speed scientific CMOS camera. Technical challenges when tracking objects with poor orbit predictions, as well as different data reduction methods, are addressed. Results from a survey of abandoned rocket upper stages in LEO, examples of abandoned payloads and observations of high area-to-mass ratio debris will be presented. Eventually, first results of the analysis of these light curves are provided.
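A common first step in analyzing such light curves is to estimate the rotation period, e.g. by epoch folding: fold the measurements at a grid of trial periods and keep the period that minimizes the scatter within phase bins. A minimal sketch on synthetic data (illustrative only; this is not AIUB's actual reduction pipeline):

```python
import math

def fold_dispersion(times, mags, period, nbins=10):
    """Phase-fold the light curve at a trial period and return the summed
    within-bin variance; the true period yields the smallest dispersion."""
    bins = [[] for _ in range(nbins)]
    for t, m in zip(times, mags):
        phase = (t % period) / period
        bins[min(int(phase * nbins), nbins - 1)].append(m)
    total = 0.0
    for b in bins:
        if len(b) > 1:
            mean = sum(b) / len(b)
            total += sum((x - mean) ** 2 for x in b)
    return total

def best_period(times, mags, candidate_periods):
    """Epoch-folding period search over a grid of candidate periods."""
    return min(candidate_periods,
               key=lambda p: fold_dispersion(times, mags, p))

# Synthetic light curve: sinusoidal brightness with a 5-second spin period,
# sampled every 0.37 s.
times = [0.37 * i for i in range(200)]
mags = [math.sin(2 * math.pi * t / 5.0) for t in times]
```

In practice a Lomb-Scargle periodogram is often preferred for irregularly sampled data, and care is needed with harmonics (folding at an integer multiple of the period also yields low dispersion).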
Abstract:
Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitudes. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative, or for contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. ESA's "Debris Attitude Motion Measurements and Modelling" project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software tool, currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
Abstract:
We investigate the consequences of one extra spatial dimension for the stability and energy spectrum of the non-relativistic hydrogen atom with a potential defined by Gauss' law, i.e. proportional to 1/|x|^2. The additional spatial dimension is considered to be either infinite or curled up in a circle of radius R. In both cases, the energy spectrum is bounded from below for charges smaller than the same critical value and unbounded from below otherwise. As a consequence of compactification, negative energy eigenstates appear: if R is smaller than a quarter of the Bohr radius, the corresponding Hamiltonian possesses an infinite number of bound states with minimal energy extending at least to the ground state of the hydrogen atom.
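The 1/|x|^2 form of the potential is what Gauss' law yields in four spatial dimensions: the field flux spreads over a 3-sphere whose area grows like r^3, so the field falls off as r^{-3} and the potential as r^{-2}. A short derivation (standard electrostatics in d spatial dimensions, not specific to this paper):

```latex
% Gauss' law in d spatial dimensions: the flux through a sphere of
% radius r is constant, and the sphere's area grows like r^{d-1}.
E(r)\,S_{d-1}\,r^{d-1} \propto q
\quad\Longrightarrow\quad
E(r) \propto \frac{1}{r^{d-1}},
\qquad
V(r) = -\int_{\infty}^{r} E(r')\,dr' \propto \frac{1}{r^{d-2}}.
% With one extra dimension, d = 4:
V(r) \propto \frac{1}{r^{2}} = \frac{1}{|\mathbf{x}|^{2}}.
```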
Abstract:
BACKGROUND: During threat, interpersonal distance is deliberately increased. Personal space regulation is related to amygdala function and is altered in schizophrenia, but it remains unknown whether it is particularly associated with paranoid threat. METHODS: We compared performance in two tests of personal space between 64 patients with schizophrenia spectrum disorders and 24 matched controls. Patients were stratified into those with paranoid threat, neutral affect, or paranoid experience of power. In the stop-distance paradigm, participants indicated the minimum tolerable interpersonal distance. In the fixed-distance paradigm, they indicated their level of comfort at fixed interpersonal distances. RESULTS: Paranoid threat increased interpersonal distance two-fold in the stop-distance paradigm and reduced comfort ratings in the fixed-distance paradigm. In contrast, patients experiencing paranoid power had high comfort ratings at any distance. Patients with neutral affect did not differ from controls in the stop-distance paradigm. Differences between groups remained when controlling for gender and positive symptom severity. Among schizophrenia patients, the stop-distance paradigm detected paranoid threat with 93% sensitivity and 83% specificity. CONCLUSIONS: Personal space regulation is not generally altered in schizophrenia. However, state paranoid experience makes distinct contributions to personal space regulation. Subjects experiencing current paranoid threat share increased safety-seeking behavior.