992 results for Function space


Relevância:

30.00%

Publicador:

Resumo:

Every space launch increases the overall amount of space debris. Satellites have limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit have been developed. This modelling approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit. Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared sensors for space-based observations of debris objects in low-Earth orbit. Long-wave infrared sensors are commercially available and do not require the target to be solar-illuminated, since the received signal depends on the target's temperature. Characterizing debris objects by means of passive imaging techniques allows further studies into the origin, specifications, and future trajectory of debris objects. Conclusions are drawn regarding the thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. Development of a thermal model permits the characterization of debris objects based upon their received long-wave infrared signals. Information regarding the material type, size, and tumble rate of the observed debris objects is extracted. This investigation proposes using long-wave infrared radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge of the orbital type and semi-major axis of the observed debris object is extracted via astrometric analysis.
This knowledge may help constrain the admissible region for the initial orbit determination process. The resulting orbital information is then fused with the radiometric characterization analysis, enabling further characterization of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite's Local Area Awareness through an intimate understanding of the debris environment surrounding the spacecraft.
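
The temperature dependence of the received long-wave infrared signal can be illustrated with a grey-body radiance calculation. The following sketch is not the study's model: the 8-14 um band, the emissivity of 0.9, and the temperatures are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: in-band long-wave infrared (8-14 um) radiance of a
# debris object modelled as a grey body, illustrating why the received LWIR
# signal is temperature dependent and needs no solar illumination.
# All parameter values are illustrative assumptions, not from the study.

H = 6.626e-34   # Planck constant [J s]
C = 2.998e8     # speed of light [m/s]
KB = 1.381e-23  # Boltzmann constant [J/K]

def planck_spectral_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance [W / (m^2 sr m)] at one wavelength."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / (np.exp(b) - 1.0)

def inband_radiance(temp_k, emissivity=0.9, band=(8e-6, 14e-6), n=2000):
    """Grey-body radiance integrated over the LWIR band [W / (m^2 sr)]."""
    wl = np.linspace(band[0], band[1], n)
    rad = planck_spectral_radiance(wl, temp_k)
    # Trapezoidal integration over the band.
    return emissivity * float(np.sum((rad[1:] + rad[:-1]) / 2.0 * np.diff(wl)))

# A warmer debris object radiates noticeably more in-band power:
cold, warm = inband_radiance(250.0), inband_radiance(300.0)
```

Even a 50 K temperature difference increases the in-band radiance by more than a factor of two, which is what makes passive LWIR detection of unilluminated debris viable.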

Relevância:

30.00%

Publicador:

Resumo:

Bayesian adaptive methods have been extensively used in psychophysics to estimate the point at which performance on a task attains arbitrary percentage levels, although the statistical properties of these estimators have never been assessed. We used simulation techniques to determine the small-sample properties of Bayesian estimators of arbitrary performance points, specifically addressing the issues of bias and precision as a function of the target percentage level. The study covered three major types of psychophysical task (yes-no detection, 2AFC discrimination and 2AFC detection) and explored the entire range of target performance levels allowed for by each task. Other factors included in the study were the form and parameters of the actual psychometric function Psi, the form and parameters of the model function M assumed in the Bayesian method, and the location of Psi within the parameter space. Our results indicate that Bayesian adaptive methods render unbiased estimators of any arbitrary point on Psi only when M = Psi; otherwise they yield a bias whose magnitude can be considerable as the target level moves away from the midpoint of the range of Psi. The standard error of the estimator also increases as the target level approaches extreme values, whether or not M = Psi. Contrary to widespread belief, neither the performance level at which bias is null nor that at which the standard error is minimal can be predicted by the sweat factor. A closed-form expression nevertheless gives a reasonable fit to data describing the dependence of the standard error on number of trials and target level, which allows determination of the number of trials that must be administered to obtain estimates with prescribed precision.
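
As a toy illustration of the estimators under study, the following sketch simulates a Bayesian adaptive procedure for a yes-no detection task in the M = Psi case (a logistic psychometric function with known slope, a grid posterior over the threshold, and stimuli placed at the current MAP estimate). All parameter values are illustrative assumptions, not those of the simulations reported here.

```python
import numpy as np

rng = np.random.default_rng(1)

def psi(x, threshold, slope=1.0, gamma=0.02, lam=0.02):
    """Probability of a 'yes' response; gamma/lam are guess and lapse rates."""
    return gamma + (1 - gamma - lam) / (1 + np.exp(-slope * (x - threshold)))

true_threshold = 0.0
grid = np.linspace(-5, 5, 201)        # candidate threshold values
log_post = np.zeros_like(grid)        # flat prior on the log scale

for _ in range(300):                  # adaptive trials
    est = grid[np.argmax(log_post)]   # current MAP threshold estimate
    x = est                           # place the next stimulus there
    r = rng.random() < psi(x, true_threshold)   # simulated observer response
    p = psi(x, grid)
    log_post += np.log(p if r else 1 - p)       # Bayesian update

estimate = grid[np.argmax(log_post)]
```

With M = Psi the final estimate clusters around the true threshold; running the same loop with a mismatched model function M would exhibit the bias discussed above.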

Relevância:

30.00%

Publicador:

Resumo:

Light rainfall is the baseline input to the annual water budget in mountainous landscapes throughout the tropics and at mid-latitudes. In the Southern Appalachians, the contribution from light rainfall ranges from 50-60% during wet years to 80-90% during dry years, with convective activity and tropical cyclone input providing most of the interannual variability. The Southern Appalachians is a region characterized by rich biodiversity that is vulnerable to land use/land cover changes due to its proximity to a rapidly growing population. Persistent near-surface moisture and the associated microclimates observed in this region have been well documented since the colonization of the area in terms of species health, fire frequency, and overall biodiversity. The overarching objective of this research is to elucidate the microphysics of light rainfall and the dynamics of low-level moisture in the inner region of the Southern Appalachians during the warm season, with a focus on orographically mediated processes. The overarching research hypothesis is that the physical processes leading to and governing the life cycle of orographic fog, low-level clouds, and precipitation, and their interactions, are strongly tied to landform, land cover, and the diurnal cycles of flow patterns, radiative forcing, and surface fluxes at the ridge-valley scale. The following science questions are addressed specifically: 1) How do orographic clouds and fog affect the hydrometeorological regime from event to annual scale and as a function of terrain characteristics and land cover? 2) What are the source areas, governing processes, and relevant time-scales of near-surface moisture convergence patterns in the region? 3) What are the four-dimensional microphysical and dynamical characteristics, including variability and controlling factors and processes, of fog and light rainfall?
The research was conducted with two major components: 1) ground-based high-quality observations using multi-sensor platforms and 2) interpretive numerical modeling guided by the analysis of the in situ data collection. Findings illuminate a high level of spatial (down to the ridge scale) and temporal (from event to annual scale) heterogeneity in the observations, and a significant impact on the hydrological regime as a result of seeder-feeder interactions among fog, low-level clouds, and stratiform rainfall that enhance coalescence efficiency and lead to significantly higher rainfall rates at the land surface. Specifically, results show that short-term accumulation can be enhanced by up to one order of magnitude during an event as a result of concurrent fog presence. Results also show that events are strongly modulated by terrain characteristics including elevation, slope, geometry, and land cover. These factors produce interactions between highly localized flows and gradients of temperature and moisture with larger-scale circulations. The resulting observations of drop size distributions (DSD) and rainfall patterns are stratified by region and altitude and exhibit clear diurnal and seasonal cycles.

Relevância:

30.00%

Publicador:

Resumo:

In 2015, the Sydenham Street Revived pop-up park project (SSR) transformed Sydenham Street between Princess and Queen Streets into a temporary pedestrian-only public space. The goal of the project was to test the idea of permanently pedestrianizing this street section. But what did this urban experiment ultimately prove? Using video footage, photographs, and observations recorded before and during the project, this report analyzes the use of the space in order to evaluate the claim that SSR created a successful public space and to make recommendations for a permanent public space on Sydenham Street. Two research methods were used: quantitative data collection, consisting of headcounts of both pedestrians and stationary users of the space; and a qualitative observational survey, based on the criteria for successful public spaces developed by the Project for Public Spaces. Data were collected on two days one week prior to the project and on two days during the project, chosen to be similar in temperature and weather. The research revealed that SSR did create a successful public space, although additional research is needed to determine how the space would function as a public place across different seasons, to study the street closure's impact on surrounding residents and businesses, and to understand how private commercial activity would influence use. Recommendations for a permanent public space on Sydenham Street include providing a flexible street design with a continuous, barrier-free surface; ensuring an abundance of places to sit; creating opportunities for public and community-created art; and improving walkability by connecting the grid with a mid-block walkway.

Relevância:

30.00%

Publicador:

Resumo:

After defining the “enunciative scheme” (sentence type) as a communicative unit, the imperative is characterized as a morphologized modality of the appellative kind, used when the following conditions concur: appellative meaning, second person, future tense, and absence of negation. In Spanish, deviation from any of these requirements means that the subjunctive is used instead. We reject the idea that the imperative is a variant of the subjunctive specialized in the appellative function and that both moods share a desiderative morpheme. Reasoning in this way means attributing to a morphological category of the verb a property that actually belongs to the enunciative schemes (sentence types). We propose to integrate the imperative and the subjunctive in the framework of what we call the “desiderative-appellative space”. This “space” brings together various grammatical or grammaticalized means based on the imperative and the subjunctive. Semantically, it is organized around a component of desirability (the action appears as desirable) that, as several factors vary, configures a route from a center (the imperative) to a periphery (the expression of desire).

Relevância:

30.00%

Publicador:

Resumo:

With this article we aim to contribute, in a modest way, to dismantling a tenacious image of our society, one that operates as the ideological basis of a group of currently successful and widely diffused socio-political pseudo-critiques. To this end we undertake the exposition and analysis of the passages of Naissance de la biopolitique in which Foucault carries out his critique of the many inflationary discourses that represent our society as a “mass society” and a “state-controlled space”. Against these vague and disproportionate ways of thinking, the Foucauldian critique, with its exquisite attention to what is happening today, reveals how our societies function as radically nominalist systems that optimize difference, in which, beyond any phantasm of an oppressive and invasive state, there occurs a regression of the legal-state structures articulating socio-political groups, to the benefit of the reconstitution of the social tissue as a communitarian network suited to the dynamics of market competition that characterize our enterprise societies. This opens onto a new idea of critique, and a displacement of its object and objectives.

Relevância:

30.00%

Publicador:

Resumo:

This study investigates the concept of open space working at the new Oporto headquarters of the company Energias de Portugal - SA, in light of the strategy implemented and the results achieved. To do so, we examined the premises presented to workers at the inauguration ceremony of the new space: "The Open Space operates as a platform for communication and information sharing" and "The Open Space responds to the needs of workers, creating modern and functional work environments". To assess the empirical field, we built a measurement instrument, which we called open space (OS), and sent it electronically to all workers in the sample. The conclusions drawn are based on results analyzed and discussed after processing in SPSS (predictive analytics software and solutions) and in graphical reports. The open space at EDP Oporto is a modern and functional workplace, privileged with regard to the flow and sharing of information, and capable of expressing business strategies and highlighting aspects of the Company's brand and culture. Training and information on the behaviors and basic rules to follow when sharing the same space, on the business reasons that lead the organization to change the workspace, and on the advantages both parties can draw from the new concept, influence, positively or negatively, workers' perception of the change and their emotional state. Noise, ambient temperature, concentration, and privacy are among the factors that may vary with the layout and that condition greater or lesser environmental satisfaction. Nevertheless, some questions always remain open, and it was in this context that we left some proposals for further research in a scientific endeavour that is never exhausted.

Relevância:

30.00%

Publicador:

Resumo:

For derived flood frequency analysis based on hydrological modelling, long continuous precipitation time series with high temporal resolution are needed. Often the observation network of recording rain gauges is sparse, especially regarding the limited length of the available rainfall time series. Stochastic precipitation synthesis is a good alternative either to extend or to regionalise rainfall series to provide adequate input for long-term rainfall-runoff modelling with subsequent estimation of design floods. Here, a new two-step procedure for stochastic synthesis of continuous hourly space-time rainfall is proposed and tested for the extension of short observed precipitation time series. First, a single-site alternating renewal model is presented to simulate independent hourly precipitation time series for several locations. The alternating renewal model describes wet spell durations, dry spell durations and wet spell intensities using univariate frequency distributions, separately for two seasons. The dependence between wet spell intensity and duration is accounted for by 2-copulas. For disaggregation of the wet spells into hourly intensities a predefined profile is used. In the second step a multi-site resampling procedure is applied to the synthetic point rainfall event series to reproduce the spatial dependence structure of rainfall. Resampling is carried out successively on all synthetic event series using simulated annealing with an objective function considering three bivariate spatial rainfall characteristics. In a case study, synthetic precipitation is generated for locations with short observation records in two mesoscale catchments of the Bode river basin in northern Germany. The synthetic rainfall data are then applied for derived flood frequency analysis using the hydrological model HEC-HMS.
The results show good performance in reproducing average and extreme rainfall characteristics as well as in reproducing observed flood frequencies. The presented model has the potential to be used for ungauged locations through regionalisation of the model parameters.
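
The single-site alternating renewal component described above can be sketched as follows. This is an illustrative toy, not the paper's fitted model: exponential marginals and a Gaussian 2-copula with a fixed correlation stand in for the fitted distributions, and the predefined disaggregation profile is omitted.

```python
import math
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical sketch of the alternating renewal idea: wet spell durations,
# dry spell durations and wet spell intensities are drawn from exponential
# marginals, with the wet-spell duration/intensity dependence induced by a
# Gaussian 2-copula. Marginals, parameters and copula family are illustrative
# assumptions, not those of the paper.

PHI = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def exp_ppf(u, mean):
    """Inverse CDF of an exponential distribution with the given mean."""
    return -mean * np.log1p(-u)

def simulate_events(n_events, rho=-0.4, mean_wet_h=6.0,
                    mean_dry_h=30.0, mean_int_mmh=1.5):
    # Correlated standard normals -> uniforms -> exponential marginals.
    z = rng.standard_normal((n_events, 2))
    z2 = rho * z[:, 0] + math.sqrt(1 - rho**2) * z[:, 1]
    u_dur, u_int = PHI(z[:, 0]), PHI(z2)
    wet = exp_ppf(u_dur, mean_wet_h)        # wet spell duration [h]
    inten = exp_ppf(u_int, mean_int_mmh)    # wet spell mean intensity [mm/h]
    dry = exp_ppf(PHI(rng.standard_normal(n_events)), mean_dry_h)
    return wet, dry, inten

wet, dry, inten = simulate_events(5000)
```

The negative copula correlation encodes an assumed tendency for longer wet spells to be less intense; the multi-site resampling step would then rearrange such event series to match spatial dependence statistics.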

Relevância:

30.00% 30.00%

Publicador:

Resumo:

Every space launch increases the overall amount of space debris. Satellites have limited awareness of nearby objects that might pose a collision hazard. Astrometric, radiometric, and thermal models for the study of space debris in low-Earth orbit have been developed. This modeled approach proposes analysis methods that provide increased Local Area Awareness for satellites in low-Earth and geostationary orbit. Local Area Awareness is defined as the ability to detect, characterize, and extract useful information regarding resident space objects as they move through the space environment surrounding a spacecraft. The study of space debris is of critical importance to all space-faring nations. Characterization efforts are proposed using long-wave infrared sensors for space-based observations of debris objects in low-Earth orbit. Long-wave infrared sensors are commercially available and do not require solar illumination to be observed, as their received signal is temperature dependent. The characterization of debris objects through means of passive imaging techniques allows for further studies into the origination, specifications, and future trajectory of debris objects. Conclusions are made regarding the aforementioned thermal analysis as a function of debris orbit, geometry, orientation with respect to time, and material properties. Development of a thermal model permits the characterization of debris objects based upon their received long-wave infrared signals. Information regarding the material type, size, and tumble-rate of the observed debris objects are extracted. This investigation proposes the utilization of long-wave infrared radiometric models of typical debris to develop techniques for the detection and characterization of debris objects via signal analysis of unresolved imagery. Knowledge regarding the orbital type and semi-major axis of the observed debris object are extracted via astrometric analysis. 
This knowledge may aid in the constraint of the admissible region for the initial orbit determination process. The resultant orbital information is then fused with the radiometric characterization analysis enabling further characterization efforts of the observed debris object. This fused analysis, yielding orbital, material, and thermal properties, significantly increases a satellite’s Local Area Awareness via an intimate understanding of the debris environment surrounding the spacecraft.

Relevância:

30.00%

Publicador:

Resumo:

We measured the distribution in absolute magnitude - circular velocity space for a well-defined sample of 199 rotating galaxies of the Calar Alto Legacy Integral Field Area Survey (CALIFA) using their stellar kinematics. Our aim in this analysis is to avoid subjective selection criteria and to take volume and large-scale structure factors into account. Using stellar velocity fields instead of gas emission line kinematics allows including rapidly rotating early-type galaxies. Our initial sample contains 277 galaxies with available stellar velocity fields and growth curve r-band photometry. After rejecting 51 velocity fields that could not be modelled because of the low number of bins, foreground contamination, or significant interaction, we performed Markov chain Monte Carlo modelling of the velocity fields, from which we obtained the rotation curve and kinematic parameters and their realistic uncertainties. We performed an extinction correction and calculated the circular velocity v_circ accounting for the pressure support of a given galaxy. The resulting galaxy distribution on the M_r - v_circ plane was then modelled as a mixture of two distinct populations, allowing robust and reproducible rejection of outliers, a significant fraction of which are slow rotators. The selection effects are understood well enough that we were able to correct for the incompleteness of the sample. The 199 galaxies were weighted by volume and large-scale structure factors, which enabled us to fit a volume-corrected Tully-Fisher relation (TFR). More importantly, we also provide the volume-corrected distribution of galaxies in the M_r - v_circ plane, which can be compared with cosmological simulations. The joint distribution of the luminosity and circular velocity space densities, representative over the range of -20 > M_r > -22 mag, can place more stringent constraints on the galaxy formation and evolution scenarios than linear TFR fit parameters or the luminosity function alone.
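
A volume-corrected TFR fit of the kind described can be sketched with weighted least squares, where each galaxy contributes according to its volume and large-scale structure weight. The mock data, weights and TFR coefficients below are illustrative assumptions, not CALIFA values.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sketch: fit M_r = a + b * log10(v_circ) by weighted least
# squares, with per-galaxy weights standing in for 1/Vmax-style volume and
# large-scale structure corrections. All numbers are illustrative.

n = 199
logv = rng.uniform(1.9, 2.5, n)              # log10(v_circ / km s^-1)
a_true, b_true = -2.5, -7.5                  # assumed intercept and slope
m_r = a_true + b_true * logv + rng.normal(0, 0.3, n)   # mock magnitudes
weights = 1.0 / rng.uniform(0.5, 2.0, n)     # stand-in volume weights

# Weighted least squares via the normal equations.
X = np.column_stack([np.ones(n), logv])
W = np.diag(weights)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ m_r)
a_fit, b_fit = beta
```

The same weights, binned over the M_r - v_circ plane instead of fed to a line fit, would give the volume-corrected joint distribution the abstract emphasizes.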

Relevância:

30.00%

Publicador:

Resumo:

BACKGROUND: The Life-Space Assessment (LSA), developed in the USA, is an instrument focusing on mobility with respect to reaching different areas defined as life-spaces, extending from the room where the person sleeps to mobility outside one's hometown. A newly translated Swedish version of the LSA (LSA-S) has been tested for test-retest reliability, but the validity remains to be tested. The purpose of the present study was to examine the concurrent validity of the LSA-S, by comparing and correlating the LSA scores to other measures of mobility. METHOD: The LSA was included in a population-based study of health, functioning and mobility among older persons in Sweden, and the present analysis comprised 312 community-dwelling participants. To test the concurrent validity, the LSA scores were compared to a number of other mobility-related variables, including the Short Physical Performance Battery (SPPB) as well as "stair climbing", "transfers", "transportation", "food shopping", "travel for pleasure" and "community activities". The LSA total mean scores for different levels of the other mobility-related variables, and measures of correlation were calculated. RESULTS: Higher LSA total mean scores were observed with higher levels of all the other mobility related variables. Most of the correlations between the LSA and the other mobility variables were large (r = 0.5-1.0) and significant at the 0.01 level. The LSA total score, as well as independent life-space and assistive life-space correlated with transportation (0.63, 0.66, 0.64) and food shopping (0.55, 0.58, 0.55). Assistive life-space also correlated with SPPB (0.47). With respect to maximal life-space, the correlations with the mobility-related variables were generally lower (below 0.5), probably since this aspect of life-space mobility is highly influenced by social support and is not so dependent on the individual's own physical function. 
CONCLUSION: The LSA was shown to be a valid measure of mobility when using the LSA total, independent life-space, or assistive life-space scores.

Relevância:

30.00%

Publicador:

Resumo:

This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and is still undergoing, like statistical analysis in general, a transformation to high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases.
We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. Particularly, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series, close to the non-stationary case, the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
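
The classical mean-change CUSUM machinery that the thesis extends can be illustrated in the simplest univariate setting. The sketch below uses an illustrative series with a single abrupt mean shift and a crude variance estimate; Darling-Erdős-type theory supplies the asymptotic critical values omitted here.

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical sketch of the classical »change in the mean« CUSUM statistic
# for a univariate series: T_n = max_k |S_k - (k/n) S_n| / (sigma sqrt(n)),
# with the maximising k as the change point estimate. Data are illustrative.

def cusum_change_point(x):
    n = x.size
    s = np.cumsum(x)
    k = np.arange(1, n + 1)
    sigma = x.std(ddof=1)                  # crude long-run variance estimate
    stat = np.abs(s - k / n * s[-1]) / (sigma * np.sqrt(n))
    k_hat = int(np.argmax(stat[:-1])) + 1  # estimated change location
    return stat.max(), k_hat

# Series with a mean shift of +1 after observation 300 of 500.
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(1, 1, 200)])
stat, k_hat = cusum_change_point(x)
```

The Hilbert-space extensions in the thesis replace the scalar cumulative sums by (projected) functional ones and replace the crude variance estimate by long run covariance operators.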

Relevância:

30.00%

Publicador:

Resumo:

The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data streams. Originally, these observables were generated manually, starting with LISA as a simple stationary array and then adjusting for the antenna's motion. However, none of the observables survived the flexing of the arms, in that they no longer led to cancellation with the same structure. The principal component approach, presented by Romano and Woan, is another way of handling these noises that simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produces two distinct sets of eigenvalues, distinguished by the absence of laser frequency noise from one set. Transforming the raw data using the corresponding eigenvectors likewise produces data free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produce the same outcome, that is, data that are free from laser frequency noise.
The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. To test the connection between the principal components and the TDI observables, a 10x10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. Results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables, and therefore analysis using principal components should give the same results as analysis using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables. This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, arm lengths, and noise variances.
Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which will appear in the covariance matrix and, in our toy model investigations, did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix is destroyed, which affects any computation methods that take advantage of this structure. In terms of separating the two sets of data for the analysis, this was not necessary because the laser frequency noises are very large compared to the photodetector noises, which resulted in a significant reduction of the data containing them after the matrix inversion. In the frequency domain the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and non-stationarity do not show up, because of the summation in the Fourier transform.
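
The core covariance-eigendecomposition argument can be reproduced in a deliberately tiny toy setting: two simultaneous phase readings sharing one dominant laser frequency noise. The 2x2 matrix and variance values below are illustrative assumptions, not LISA values.

```python
import numpy as np

# Hypothetical toy version of the Romano-Woan idea: two readings share the
# same (large) laser frequency noise plus independent photodetector noises.
# Eigendecomposition of the 2x2 data covariance separates a laser-noise
# dominated component from a laser-noise-free one. Variances are illustrative.

laser_var = 1.0e6          # laser frequency noise power (dominant)
det_var = 1.0              # photodetector noise power

# Covariance of (y1, y2) with y_i = laser + n_i:
cov = laser_var * np.ones((2, 2)) + det_var * np.eye(2)

vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
# The small eigenvalue equals det_var alone; its eigenvector (1,-1)/sqrt(2)
# differences the two readings, cancelling the common laser noise exactly.
laser_free_val = vals[0]
laser_free_vec = vecs[:, 0]
```

The differencing eigenvector is the toy analogue of a TDI combination: a data transformation from which the laser frequency noise is absent.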

Relevância:

30.00%

Publicador:

Resumo:

Declarative techniques such as Constraint Programming can be very effective in modeling and assisting management decisions. We present a method for managing university classrooms which extends the previous design of a Constraint-Informed Information System to generate the timetables while dealing with spatial resource optimization issues. We seek to maximize space utilization along two dimensions: classroom use and occupancy rates. While we want to maximize the room use rate, we still need to satisfy the soft constraints which model students' and lecturers' preferences. We present a constraint logic programming-based local search method which relies on an evaluation function that combines room utilization and timetable soft preferences. Based on this, we developed a tool which we applied to the improvement of classroom allocation in a university. Compared to the current timetables obtained without optimizing space utilization, the initial versions of our tool achieve a 30% improvement in space utilization while preserving the quality of the timetable, both for students and lecturers.
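
The evaluation-function-driven local search can be illustrated with a deliberately small toy instance. The rooms, lectures, preferences, weights, and the plain hill-climbing move below are illustrative assumptions; the actual tool works with constraint logic programming over real university data.

```python
import random

random.seed(5)

# Hypothetical sketch: assign lectures to (room, slot) pairs and hill-climb
# on an evaluation function combining room occupancy with soft preferences.
# All data, weights and moves are illustrative assumptions.

rooms = {"R1": 40, "R2": 80}                 # room -> capacity
slots = ["Mon9", "Mon11", "Tue9"]
lectures = {"L1": 35, "L2": 70, "L3": 30}    # lecture -> class size
preferred = {"L1": "Mon9", "L2": "Tue9"}     # soft slot preferences

def evaluate(assign):
    """Higher is better: occupancy reward minus soft-constraint penalties."""
    score = 0.0
    used = set()
    for lec, (room, slot) in assign.items():
        if (room, slot) in used or lectures[lec] > rooms[room]:
            return float("-inf")             # hard constraint violated
        used.add((room, slot))
        score += lectures[lec] / rooms[room] # occupancy rate reward
        if preferred.get(lec) == slot:
            score += 0.5                     # soft preference satisfied
    return score

def hill_climb(steps=2000):
    assign = {lec: (random.choice(list(rooms)), random.choice(slots))
              for lec in lectures}
    best, best_score = dict(assign), evaluate(assign)
    for _ in range(steps):
        cand = dict(best)
        lec = random.choice(list(lectures))
        cand[lec] = (random.choice(list(rooms)), random.choice(slots))
        s = evaluate(cand)
        if s >= best_score:                  # accept improving/equal moves
            best, best_score = cand, s
    return best, best_score

best, best_score = hill_climb()
```

In this toy instance the big lecture can only fit the big room, and the search trades occupancy against the slot preferences, mirroring the utilization-versus-preference balance in the evaluation function described above.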