66 results for Short take-off and landing aircraft.
Abstract:
A prediction mechanism is necessary in human visual motion processing to compensate for the delay of the sensory-motor system. A previous study discussed "proactive control" as one example of human predictive function, in which hand motion preceded a virtual moving target in visual tracking experiments. To study the roles of the positional-error correction mechanism and the prediction mechanism, we carried out an intermittently-visible tracking experiment in which a circular orbit is segmented into target-visible and target-invisible regions. The main results were as follows. A rhythmic component appeared in the tracer velocity when the target velocity was relatively high. The period of the rhythm in the brain, acquired from environmental stimuli, is shortened by more than 10%. This shortening accelerates the hand motion as soon as the visual information is cut off, causing the hand motion to precede the target motion. Although the precedence of the hand in the blind region is reset by environmental information when the target re-enters the visible region, the hand motion precedes the target on average when the predictive mechanism dominates the error-corrective mechanism.
Abstract:
Simulations of polar ozone losses were performed using the three-dimensional high-resolution (1° × 1°) chemical transport model MIMOSA-CHIM. Three Arctic winters (1999–2000, 2001–2002, and 2002–2003) and three Antarctic winters (2001, 2002, and 2003) were considered for the study. The cumulative ozone loss in the Arctic winter 2002–2003 reached around 35% at 475 K inside the vortex, compared to more than 60% in 1999–2000. During 1999–2000, denitrification induced a maximum of about 23% extra ozone loss at 475 K, compared to 17% in 2002–2003. Unlike these two colder Arctic winters, the 2001–2002 Arctic winter was warmer and did not experience much ozone loss. Sensitivity tests showed that the chosen resolution of 1° × 1° provides a better evaluation of ozone loss at the edge of the polar vortex in high solar zenith angle conditions. The simulation results for ozone, ClO, HNO3, N2O, and NOy for winters 1999–2000 and 2002–2003 were compared with measurements on board the ER-2 and Geophysica aircraft, respectively. Sensitivity tests showed that increasing the heating rates calculated by the model by 50% and doubling the PSC (Polar Stratospheric Cloud) particle density (from 5 × 10−3 to 10−2 cm−3) refines the agreement with in situ ozone, N2O, and NOy levels. In this configuration, simulated ClO levels are increased and are in better agreement with observations in January but are overestimated by about 20% in March. The use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections further increases ClO levels slightly, especially in high solar zenith angle conditions. Comparisons of the modelled ozone values with ozonesonde measurements in the Antarctic winter 2003, and with Polar Ozone and Aerosol Measurement III (POAM III) measurements in the Antarctic winters 2001 and 2002, show that the simulations underestimate the ozone loss rate at the end of the ozone destruction period. A slightly better agreement is obtained with the use of the Burkholder et al. (1990) Cl2O2 absorption cross-sections.
Abstract:
Approaches to natural resource management emphasise the importance of involving local people and institutions in order to build capacity, limit costs, and achieve environmental sustainability. Governments worldwide, often encouraged by international donors, have formulated devolution policies and legal instruments that provide an enabling environment for devolved natural resource management. However, implementation of these policies reveals serious challenges. This article explores the effects of limited involvement of local people and institutions in policy development and implementation. An in-depth study of the Forest Policy of Malawi and Village Forest Areas in the Lilongwe district provides an example of externally driven policy development which seeks to promote local management of natural resources. The article argues that policy which has weak ownership by national government and does not adequately consider the complexity of local institutions, together with the effects of previous initiatives on them, can create a cumulative legacy through which destructive resource use practices and social conflict may be reinforced. In short, poorly developed and implemented community-based natural resource management policies can do considerably more harm than good. Approaches are needed that enable the policy development process to embed an in-depth understanding of local institutions whilst incorporating flexibility to account for their location-specific nature. This demands further research on policy design to enable rigorous identification of positive and negative institutions and ex-ante exploration of the likely effects of different policy interventions.
Abstract:
Biodiversity informatics plays a central enabling role in the research community's efforts to address scientific conservation and sustainability issues. Great strides have been made in the past decade establishing a framework for sharing data, where taxonomy and systematics has been perceived as the most prominent discipline involved. To some extent this is inevitable, given the use of species names as the pivot around which information is organised. To address the urgent questions around conservation, land-use, environmental change, sustainability, food security and ecosystem services that are facing Governments worldwide, we need to understand how the ecosystem works. So, we need a systems approach to understanding biodiversity that moves significantly beyond taxonomy and species observations. Such an approach needs to look at the whole system to address species interactions, both with their environment and with other species. It is clear that some barriers to progress are sociological, basically persuading people to use the technological solutions that are already available. This is best addressed by developing more effective systems that deliver immediate benefit to the user, hiding the majority of the technology behind simple user interfaces. An infrastructure should be a space in which activities take place and, as such, should be effectively invisible. This community consultation paper positions the role of biodiversity informatics, for the next decade, presenting the actions needed to link the various biodiversity infrastructures invisibly and to facilitate understanding that can support both business and policy-makers. The community considers the goal in biodiversity informatics to be full integration of the biodiversity research community, including citizens' science, through a commonly-shared, sustainable e-infrastructure across all sub-disciplines that reliably serves science and society alike.
Abstract:
Grassroots innovations emerge as networks generating innovative solutions for climate change adaptation and mitigation. However, it is unclear whether grassroots innovations can be successful in responding to climate change. Little evidence exists on replication, international comparisons are rare, and research tends to overlook discontinued responses in favour of successful ones. We take the Transition Movement as a case study of a rapidly spreading transnational grassroots network, and include both active and non-active local transition initiatives. We investigate the replication of grassroots innovations in different contexts with the aim of uncovering general patterns of success and failure, and identifying questions for future research. An online survey was carried out in 23 countries (N=276). The data analysis entailed testing the effect of internal and contextual factors of success drawn from the existing literature, and identifying clusters of transition initiatives with similar internal and contextual factor configurations. Most transition initiatives consider themselves successful. Success is defined along the lines of social connectivity and empowerment, and external environmental impact. We find that less successful transition initiatives may underestimate the importance of contextual factors and material resources in influencing success. We also find that their diffusion is linked to a combination of local-global learning processes, and that there is an incubation period during which a transition initiative is consolidated. Transition initiatives seem capable of generalising organisational principles derived from unique local experiences that prove effective in other local contexts. However, geographical location matters with regard to where transition initiatives take root and the extent of their success, and 'place attachment' may play a role in the diffusion of successful initiatives.
We suggest that longitudinal comparative studies can advance our understanding in this regard, as well as inform the changing nature of the definition of success at different stages of grassroots innovation development, and the dynamic nature of local and global linkages.
Abstract:
It is increasingly important to know when energy is used in the home, at work and on the move. Issues of time and timing have not featured strongly in energy policy analysis or in modelling, much of which has focused on estimating and reducing total average annual demand per capita. If smarter ways of balancing supply and demand are to take hold, and if we are to make better use of decarbonised forms of supply, it is essential to understand and intervene in patterns of societal synchronisation. This calls for detailed knowledge of when, and on what occasions, many people engage in the same activities at the same time, of how such patterns are changing, and of how they might be shaped. In addition, the impact of smart meters and controls partly depends on whether there is, in fact, scope for shifting the timing of what people do, and for changing the rhythm of the day. Is the scheduling of daily life an arena that policy can influence, and if so, how? The DEMAND Centre has been linking time use, energy consumption and travel diary data as a means of addressing these questions, and in this working paper we present some of the issues and results arising from that exercise.
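The kind of societal synchronisation described above can be illustrated with a minimal sketch. The data and function below are hypothetical (they are not the DEMAND Centre's actual linkage pipeline): given diary-style activity records coded as 0/1 per time slot, the column mean gives the share of respondents doing the activity simultaneously, and its peak shows when demand coincides.

```python
import numpy as np

def synchronisation_profile(diaries):
    """Fraction of respondents doing an activity in each time slot.

    `diaries` is a 2-D {0,1} array: rows are respondents, columns are
    time slots (e.g. 144 ten-minute slots per day); entry 1 means the
    respondent reported the activity in that slot. The column mean is
    the share of people doing it at the same time -- a simple proxy
    for societal synchronisation.
    """
    diaries = np.asarray(diaries)
    profile = diaries.mean(axis=0)
    return profile, int(profile.argmax())

# Hypothetical diary data: 4 respondents x 6 time slots.
diaries = [[0, 1, 1, 0, 0, 0],
           [0, 1, 1, 1, 0, 0],
           [0, 0, 1, 0, 0, 1],
           [0, 1, 1, 0, 0, 0]]
profile, peak = synchronisation_profile(diaries)  # peak slot = 2 (everyone active)
```

The same profile computed per activity and per season would show how patterns of timing change, which is the question the working paper addresses with real time-use data.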
Abstract:
Measurements of the ionospheric E region during total solar eclipses in the period 1932–1999 have been used to investigate the fraction, phi, of Extreme Ultraviolet and soft X-ray radiation that is emitted from the limb corona and chromosphere. The relative apparent sizes of the Moon and the Sun differ for each eclipse, and techniques are presented which correct the measurements and therefore allow direct comparisons between different eclipses. The results show that the fraction of ionising radiation emitted by the limb corona has a clear solar cycle variation and that the underlying trend shows this fraction has been increasing since 1932. Data from the SOHO spacecraft are used to study the effects of short-term variability, and it is shown that the observed long-term rise in phi has a negligible probability of being a chance occurrence.
Abstract:
We investigated the plume structure of a piezo-electric sprayer system, set up to release ethanol in a wind tunnel, using a fast-response mini-photoionization detector. We recorded the plume structure of four different piezo-sprayer configurations: the sprayer alone; with a 1.6-mm steel mesh shield; with a 3.2-mm steel mesh shield; and with a 5-cm circular upwind baffle. We measured a 12 × 12-mm core at the center of the plume, and both a horizontal and a vertical cross-section of the plume, all at 100, 200, and 400 mm downwind of the odor source. Significant differences in plume structure were found among all configurations in terms of conditional relative mean concentration, intermittency, ratio of peak concentration to conditional mean concentration, and cross-sectional area of the plume. We then measured the flight responses of the almond moth, Cadra cautella, to odor plumes generated with the sprayer alone, and with the upwind-baffle piezo-sprayer configuration, releasing a 13:1 ratio of (9Z,12E)-tetradecadienyl acetate and (Z)-9-tetradecenyl acetate diluted in ethanol at release rates of 1, 10, 100, and 1,000 pg/min. For each configuration, differences in pheromone release rate resulted in significant differences in the proportions of moths performing oriented flight and landing behaviors. Additionally, there were apparent differences in the moths’ behaviors between the two sprayer configurations, although this requires confirmation with further experiments. This study provides evidence that both pheromone concentration and plume structure affect moth orientation behavior, and demonstrates that care is needed when setting up experiments that use a piezo-electric release system to ensure optimal conditions for behavioral observations.
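The plume statistics named above (intermittency and conditional mean concentration, plus the peak-to-conditional-mean ratio) can be sketched from a detector time series. This is a minimal illustration under an assumed detection threshold, not the study's actual processing code; the trace values are made up.

```python
import numpy as np

def plume_statistics(concentration, threshold):
    """Summarise a plume concentration time series.

    Intermittency is the fraction of samples above the detection
    threshold; the conditional mean is the mean concentration over
    those above-threshold samples only; the peak-to-mean ratio is the
    maximum divided by the conditional mean.
    """
    concentration = np.asarray(concentration, dtype=float)
    above = concentration > threshold
    intermittency = above.mean()
    conditional_mean = concentration[above].mean() if above.any() else 0.0
    peak_to_mean = (concentration.max() / conditional_mean
                    if conditional_mean else np.inf)
    return intermittency, conditional_mean, peak_to_mean

# Hypothetical photoionization-detector trace: clean air with odor bursts.
trace = np.array([0.0, 0.1, 2.0, 3.0, 0.0, 0.0, 4.0, 0.1, 0.0, 6.0])
i, c, p = plume_statistics(trace, threshold=0.5)
```

Comparing these summary numbers across sprayer configurations is what distinguishes, say, a baffled plume (long quiescent periods, high peaks) from a mesh-shielded one.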
Sensitivity of resolved and parameterized surface drag to changes in resolution and parameterization
Abstract:
The relative contribution of resolved and parameterized surface drag towards balancing the atmospheric angular momentum flux convergence (AMFC), and their sensitivity to horizontal resolution and parameterization, are investigated in an atmospheric model. This sensitivity can be difficult to elucidate in free-running climate models, in which the AMFC varies with changing climatologies and, as a result, the relative contributions of the surface terms balancing the AMFC also vary. While the sensitivity question has previously been addressed using short-range forecasts, we demonstrate that a nudging framework is an effective method for constraining the AMFC. The Met Office Unified Model is integrated at three horizontal resolutions ranging from 130 km (N96) to 25 km (N512) while relaxing the model’s wind and temperature fields towards the ERA-Interim reanalysis within the altitude regions of maximum AMFC. This method is validated against short-range forecasts and good agreement is found. These experiments are then used to assess the fidelity of the exchange between parameterized and resolved orographic torques with changes in horizontal resolution. Although the parameterized orographic torque reduces substantially with increasing horizontal resolution, there is little change in resolved orographic torque over 20°N to 50°N. The tendencies produced by the nudging routine indicate that the additional drag at lower horizontal resolution is excessive. When parameterized orographic blocking is removed at the coarsest of these resolutions, there is a lack of compensation, and even compensation of the opposite sense, by the boundary layer and resolved torques, which is particularly pronounced over 20°N to 50°N. This study demonstrates that there is strong sensitivity in the behaviour of the resolved and parameterized surface drag over this region.
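Nudging (Newtonian relaxation) as used above can be sketched in one dimension: the model tendency gains a term that pulls the state toward a reference on a chosen timescale. The toy "physics" and parameters below are illustrative assumptions, not the Unified Model's configuration.

```python
import numpy as np

def integrate_nudged(f, x0, x_ref, tau, dt, n_steps):
    """Integrate dx/dt = f(x) + (x_ref(t) - x)/tau with forward Euler.

    The relaxation term pulls the model state toward the reference
    (e.g. a reanalysis) on timescale tau, constraining the large-scale
    state while the model's own tendency f still acts.
    """
    x = float(x0)
    traj = [x]
    for k in range(n_steps):
        t = k * dt
        x = x + dt * (f(x) + (x_ref(t) - x) / tau)
        traj.append(x)
    return np.array(traj)

# Toy model tendency: slow decay toward zero; reference held at 1.0.
traj = integrate_nudged(f=lambda x: -0.1 * x, x0=0.0,
                        x_ref=lambda t: 1.0, tau=2.0,
                        dt=0.1, n_steps=500)
```

The equilibrium sits between the model's preferred state and the reference (here 0.5/0.6 ≈ 0.83), and the residual nudging tendency, (x_ref − x)/tau, is exactly the diagnostic the study uses to infer that drag at coarse resolution is excessive.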
Abstract:
Background: We report an analysis of a protein network of functionally linked proteins, identified from a phylogenetic statistical analysis of complete eukaryotic genomes. Phylogenetic methods identify pairs of proteins that co-evolve on a phylogenetic tree, and have been shown to have a high probability of correctly identifying known functional links. Results: The eukaryotic correlated evolution network we derive displays the familiar power law scaling of connectivity. We introduce the use of explicit phylogenetic methods to reconstruct the ancestral presence or absence of proteins at the interior nodes of a phylogeny of eukaryote species. We find that the connectivity distribution of proteins at the point they arise on the tree and join the network follows a power law, as does the connectivity distribution of proteins at the time they are lost from the network. Proteins resident in the network acquire connections over time, but we find no evidence that 'preferential attachment' - the phenomenon of newly acquired connections in the network being more likely to be made to proteins with large numbers of connections - influences the network structure. We derive a 'variable rate of attachment' model in which proteins vary in their propensity to form network interactions independently of how many connections they have or of the total number of connections in the network, and show how this model can produce apparent power-law scaling without preferential attachment. Conclusion: A few simple rules can explain the topological structure and evolutionary changes to protein-interaction networks: most change is concentrated in satellite proteins of low connectivity and small phenotypic effect, and proteins differ in their propensity to form attachments. 
Given these rules of assembly, power-law scaled networks naturally emerge from simple principles of selection, yielding protein interaction networks that retain a high degree of robustness on short time scales and evolvability on longer evolutionary time scales.
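The idea behind the 'variable rate of attachment' model can be shown in a small growth simulation. This is a sketch under assumed parameters (the lognormal propensity distribution and the growth rule are illustrative, not the authors' exact model): each node gets an intrinsic propensity, and new connections are made in proportion to propensity, never to current degree, yet a heavy-tailed degree distribution still emerges.

```python
import numpy as np

def grow_network(n_nodes, edges_per_step, rng):
    """Grow a network under a 'variable rate of attachment' rule.

    Each node has an intrinsic propensity (here lognormal). When a new
    node arrives it links to existing nodes chosen in proportion to
    their propensity -- never to their current degree -- so there is
    no preferential attachment, yet heterogeneous propensities alone
    produce a broad, heavy-tailed degree distribution.
    """
    propensity = rng.lognormal(mean=0.0, sigma=1.5, size=n_nodes)
    degree = np.zeros(n_nodes, dtype=int)
    for new in range(edges_per_step, n_nodes):
        existing = np.arange(new)
        p = propensity[existing] / propensity[existing].sum()
        targets = rng.choice(existing, size=edges_per_step,
                             replace=False, p=p)
        degree[targets] += 1            # chosen by propensity only
        degree[new] += edges_per_step   # new node's own links
    return degree

rng = np.random.default_rng(0)
deg = grow_network(n_nodes=2000, edges_per_step=2, rng=rng)
```

Plotting the degree distribution of `deg` on log-log axes shows the apparent power-law scaling the abstract describes arising without any degree-dependent attachment rule.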
Abstract:
One of the most common decisions we make is the one about where to move our eyes next. Here we examine the impact that processing the evidence supporting competing options has on saccade programming. Participants were asked to saccade to one of two possible visual targets indicated by a cloud of moving dots. We varied the evidence which supported saccade target choice by manipulating the proportion of dots moving towards one target or the other. The task was found to become easier as the evidence supporting target choice increased. This was reflected in an increase in percent correct and a decrease in saccade latency. The trajectory and landing position of saccades were found to deviate away from the non-selected target reflecting the choice of the target and the inhibition of the non-target. The extent of the deviation was found to increase with amount of sensory evidence supporting target choice. This shows that decision-making processes involved in saccade target choice have an impact on the spatial control of a saccade. This would seem to extend the notion of the processes involved in the control of saccade metrics beyond a competition between visual stimuli to one also reflecting a competition between options.
Abstract:
Inhibition is intimately involved in the ability to select a target for a goal-directed movement. The effect of distracters on the deviation of oculomotor trajectories and landing positions provides evidence of such inhibition. Individual saccade trajectories and landing positions may deviate initially either towards, or away from, a competing distracter; the direction and extent of this deviation depend upon saccade latency and the target-to-distracter separation. However, the underlying commonality of the sources of oculomotor inhibition has not been investigated. Here we report the relationship between distracter-related deviation of saccade trajectory, landing position and saccade latency. Observers saccaded to a target which could be accompanied by a distracter shown at various distances from very close (10 angular degrees) to far away (120 angular degrees). A fixation-gap paradigm was used to manipulate latency independently of the influence of competing distracters. When distracters were close to the target, saccade trajectory and landing position deviated toward the distracter position, while at greater separations landing position was always accurate but trajectories deviated away from the distracters. Different spatial patterns of deviations across latency were found. This pattern of results is consistent with the metrics of the saccade reflecting coarse pooling of the ongoing activity at the distracter location: saccade trajectory reflects activity at saccade initiation, while landing position reveals activity at saccade end.
Abstract:
The purpose of this paper is to design a control law for continuous systems with Boolean inputs, allowing the output to track a desired trajectory. Such systems are controlled through switching elements, and systems of this type have found increasing use in the electric industry: power supplies include such systems, and a power converter is one example. For instance, in power electronics the control variable is the switching OFF and ON of components such as thyristors or transistors. In this paper, a method is proposed for designing a state-space control law for such systems. The approach is implemented in simulation for the control of an electronic circuit.
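Boolean-input tracking of the kind described can be sketched with a simple relay law on a first-order plant, as in a switching power converter. The plant, gains, and switching rule below are hypothetical illustrations, not the paper's proposed state-space design.

```python
import numpy as np

def simulate_relay_tracking(y_ref, dt, a, b):
    """Track a reference with a Boolean (ON/OFF) input.

    Plant: dy/dt = -a*y + b*u with u in {0, 1}. The relay law switches
    ON whenever the output is below the reference, producing rapid
    chattering that holds the output near the reference -- the basic
    mechanism behind switched power-converter control.
    """
    y = 0.0
    out = []
    for r in y_ref:
        u = 1 if y < r else 0           # Boolean control decision
        y = y + dt * (-a * y + b * u)   # forward-Euler plant update
        out.append(y)
    return np.array(out)

# Hypothetical setup: track a constant reference of 0.5 for 5 seconds.
t = np.arange(0.0, 5.0, 0.001)
y = simulate_relay_tracking(y_ref=np.full_like(t, 0.5),
                            dt=0.001, a=1.0, b=1.0)
```

The chattering amplitude scales with the time step (here the switching period), which is why practical designs shape the switching sequence rather than relying on a raw relay.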
Abstract:
This paper examines the short- and long-term persistence of tax-exempt real estate funds in the UK through the use of winner-loser contingency table methodology. The persistence tests are applied to a database of varying numbers of funds, from a low of 16 to a high of 27, using quarterly returns over the 12 years from 1990 Q1 to 2001 Q4. The overall conclusion is that real estate funds in the UK show little evidence of persistence in the short term (quarterly and semi-annual data) or for data over a considerable length of time (bi-annual to six-yearly intervals). In contrast, the results are better for annual data, with evidence of significant performance persistence. Thus, at this stage, it seems that an annual evaluation period provides the best discrimination of the winner and loser phenomenon in the real estate market. This result differs from equity and bond studies, where the repeat-winner phenomenon seems stronger over shorter evaluation periods. These results require careful interpretation, however, as they show, first, that when only small samples are used significant adjustments must be made to correct for small-sample bias and, second, that the conclusions are sensitive to the length of the evaluation period and the specific test used. Nonetheless, it seems that persistence in the performance of real estate funds in the UK does exist, at least for annual data, and it appears to be a guide to beating the pack in the long run. Furthermore, although the evidence of persistence in performance for the overall sample of funds is limited, we found that two funds were consistent winners over this period, whereas no fund could be said to be a consistent loser.
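The winner-loser contingency table methodology can be sketched as follows. This is a minimal illustration with made-up returns; the paper's actual tests also involve small-sample corrections, which are omitted here.

```python
import math
from statistics import median

def persistence_test(period1, period2):
    """Winner-loser contingency table test for performance persistence.

    A fund is a 'winner' in a period if its return exceeds the
    cross-sectional median. Persistence is summarised by the
    cross-product (odds) ratio CPR = (WW*LL)/(WL*LW); with no
    persistence, ln(CPR) is approximately normal with standard
    deviation sqrt(1/WW + 1/WL + 1/LW + 1/LL).
    """
    med1, med2 = median(period1), median(period2)
    ww = wl = lw = ll = 0
    for r1, r2 in zip(period1, period2):
        w1, w2 = r1 > med1, r2 > med2
        if w1 and w2:
            ww += 1      # winner-winner (repeat winner)
        elif w1:
            wl += 1      # winner then loser
        elif w2:
            lw += 1      # loser then winner
        else:
            ll += 1      # loser-loser (repeat loser)
    cpr = (ww * ll) / max(wl * lw, 1)       # guard against empty cells
    sigma = math.sqrt(sum(1.0 / max(n, 1) for n in (ww, wl, lw, ll)))
    z = math.log(max(cpr, 1e-12)) / sigma   # z-statistic for ln(CPR)
    return (ww, wl, lw, ll), cpr, z

# Made-up annual returns for four funds over two evaluation periods.
table, cpr, z = persistence_test([0.05, 0.07, 0.01, 0.03],
                                 [0.06, 0.08, 0.02, 0.04])
```

With ranks preserved across periods, all funds land in the WW or LL cells and CPR exceeds 1; a z-statistic well above ~1.96 would indicate statistically significant persistence, which with only four funds it cannot.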
Abstract:
During April–May 2010, volcanic ash clouds from the Icelandic Eyjafjallajökull volcano reached Europe, causing an unprecedented disruption of the EUR/NAT region airspace. Civil aviation authorities banned all flight operations because of the threat posed by volcanic ash to modern turbine aircraft. New quantitative airborne ash mass concentration thresholds, still under discussion, were adopted for discerning regions contaminated by ash. This has implications for the ash dispersal models routinely used to forecast the evolution of ash clouds. In this new context, quantitative model validation and assessment of the accuracy of current state-of-the-art models are of paramount importance. The passage of volcanic ash clouds over central Europe, a territory hosting a dense network of meteorological and air quality observatories, generated a quantity of observations unusual for volcanic clouds. From the ground, the cloud was observed by aerosol lidars, lidar ceilometers, sun photometers, other remote-sensing instruments and in-situ collectors. From the air, sondes and multiple aircraft missions also took extremely valuable in-situ and remote-sensing measurements. These measurements constitute an excellent database for model validation. Here we validate the FALL3D ash dispersal model by comparing model results with ground-based and airplane-based measurements obtained during the initial 14–23 April 2010 Eyjafjallajökull explosive phase. We run the model at high spatial resolution using as input hourly-averaged observed heights of the eruption column and the total grain size distribution reconstructed from field observations. Model results are then compared against remote ground-based and in-situ aircraft-based measurements, including lidar ceilometers from the German Meteorological Service, aerosol lidars and sun photometers from the EARLINET and AERONET networks, and flight missions of the German DLR Falcon aircraft. We find good quantitative agreement, with an error similar to the spread in the observations (though depending on the method used to estimate the mass eruption rate) for both airborne and ground mass concentrations. Such verification results help us understand and constrain the accuracy and reliability of ash transport models and are of enormous relevance for designing future operational mitigation strategies at Volcanic Ash Advisory Centers.