Large-scale atmospheric dynamics of the wet winter 2009–2010 and its impact on hydrology in Portugal
Abstract:
The anomalously wet winter of 2010 had a major impact on the Portuguese hydrological system. Given the detrimental effects that reduced precipitation in Portugal has on environmental and socio-economic systems, the 2010 winter was predominantly beneficial, reversing the precipitation deficits accumulated over the previous hydrological years. The recorded anomalously high precipitation amounts contributed to an overall increase in river runoff and dam recharge in the four major river basins. In synoptic terms, the 2010 winter was characterised by an anomalously strong westerly flow component over the North Atlantic that triggered high precipitation amounts. A dynamically coherent enhancement in the frequency of mid-latitude cyclones close to Portugal, accompanied by significant increases in the occurrence of cyclonic, southerly and south-westerly circulation weather types, is also noteworthy. Furthermore, the prevalence of the strong negative phase of the North Atlantic Oscillation (NAO) emphasises the main dynamical features of the 2010 winter. A comparison of the hydrological and atmospheric conditions between the 2010 winter and the previous two anomalously wet winters (1996 and 2001) was also carried out to isolate not only their similarities but also their contrasting conditions, highlighting the limitations of estimating winter precipitation amounts in Portugal using solely the NAO phase as a predictor.
Abstract:
Northern Hemisphere cyclone activity is assessed by applying an algorithm for the detection and tracking of synoptic-scale cyclones to mean sea level pressure data. The method, originally developed for the Southern Hemisphere, is adapted for application in the Northern Hemisphere winter season. NCEP reanalysis data from 1958/59 to 1997/98 are used as input. The sensitivities of the results to particular parameters of the algorithm are discussed both for case studies and from a climatological point of view. Results show that the choice of settings is of major relevance, especially for the tracking of smaller-scale and fast-moving systems. With an appropriate setting, the algorithm is capable of automatically tracking different types of cyclones at the same time: both fast-moving and developing systems over the large ocean basins and smaller-scale cyclones over the Mediterranean basin can be assessed. The climatology of cyclone variables, e.g., cyclone track density, cyclone counts, intensification rates, propagation speeds and areas of cyclogenesis and cyclolysis, gives detailed information on typical cyclone life cycles for different regions. Lowering the spatial and temporal resolution of the input data from the full resolution T62/06h to T42/12h decreases the cyclone track density and cyclone counts. Reducing the temporal resolution alone contributes to a decline in the number of fast-moving systems, which is relevant for the cyclone track density. Lowering the spatial resolution alone mainly reduces the number of weak cyclones.
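The detection-and-tracking idea described above can be illustrated with a minimal sketch. This is an illustrative toy, not the authors' algorithm: the function names, the 1000-hPa detection threshold and the grid-cell search radius are assumptions. Candidate cyclones are local minima of the MSLP field, and tracks are built by nearest-neighbour linking between successive time steps:

```python
import numpy as np

def find_lows(mslp, threshold=1000.0):
    """Return (i, j) grid points that are local minima of mean sea level
    pressure (hPa) below a threshold -- a toy stand-in for cyclone detection."""
    lows = []
    for i in range(1, mslp.shape[0] - 1):
        for j in range(1, mslp.shape[1] - 1):
            window = mslp[i - 1:i + 2, j - 1:j + 2]
            if mslp[i, j] < threshold and mslp[i, j] == window.min():
                lows.append((i, j))
    return lows

def link_tracks(lows_t0, lows_t1, max_dist=5.0):
    """Greedily link lows between two time steps by nearest neighbour,
    rejecting jumps larger than max_dist grid cells."""
    pairs = []
    for p in lows_t0:
        if not lows_t1:
            break
        q = min(lows_t1, key=lambda q: np.hypot(p[0] - q[0], p[1] - q[1]))
        if np.hypot(p[0] - q[0], p[1] - q[1]) <= max_dist:
            pairs.append((p, q))
    return pairs
```

The parameter sensitivity the abstract highlights is visible even in this toy: a small `max_dist` loses fast-moving systems between time steps, while a strict pressure threshold misses weak, small-scale lows.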
The capability-affordance model: a method for analysis and modelling of capabilities and affordances
Abstract:
Existing capability models lack qualitative and quantitative means to compare business capabilities. This paper extends previous work and uses affordance theories to consistently model and analyse capabilities. We use the concept of objective and subjective affordances to model capability as a tuple of a set of resource affordance system mechanisms and action paths, dependent on one or more critical affordance factors. We identify an affordance chain of subjective affordances by which affordances work together to enable an action, and an affordance path that links action affordances to create a capability system. We define the mechanism and path underlying capability. We show how the affordance modelling notation (AMN) can represent the affordances comprising a capability. We propose a method to quantitatively and qualitatively compare capabilities using efficiency, effectiveness and quality metrics. The method is demonstrated by a medical example comparing the capability of syringe and needle-less anaesthetic systems.
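The capability-as-tuple idea and the three comparison metrics can be sketched as a simple data structure. All names and metric definitions below are illustrative assumptions, not the paper's AMN notation or its actual method:

```python
from dataclasses import dataclass

@dataclass
class Capability:
    """Toy rendering of a capability as a tuple of resource mechanisms and
    an action path, scored on three comparison metrics (values are made up)."""
    name: str
    mechanisms: list      # resource affordance mechanisms
    action_path: list     # ordered actions forming the affordance path
    efficiency: float     # e.g. output per unit resource
    effectiveness: float  # e.g. fraction of goal achieved
    quality: float        # e.g. defect-free fraction

def compare(a, b):
    """Compare two capabilities metric by metric; returns the winner per metric."""
    return {m: (a.name if getattr(a, m) >= getattr(b, m) else b.name)
            for m in ("efficiency", "effectiveness", "quality")}
```

In this spirit, the paper's syringe vs. needle-less comparison reduces to evaluating each system's tuple on the same three metrics and reading off which capability dominates on each.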
Abstract:
The financial crisis of 2007–2009 and the resultant pressures exerted on policymakers to prevent future crises have precipitated coordinated regulatory responses globally. A key focus of the new wave of regulation is to ensure the removal of practices now deemed problematic with new controls for conducting transactions and maintaining holdings. There is increasing pressure on organizations to retire manual processes and adopt core systems, such as Investment Management Systems (IMS). These systems facilitate trading and ensure transactions are compliant by transcribing regulatory requirements into automated rules and applying them to trades. The motivation of this study is to explore the extent to which such systems may enable the alteration of previously embedded practices. We researched implementations of an IMS at eight global financial organizations and found that overall the IMS encourages responsible trading through surveillance, monitoring and the automation of regulatory rules and that such systems are likely to become further embedded within financial organizations. We found evidence that some older practices persisted. Our study suggests that the institutionalization of technology-induced compliant behaviour is still uncertain.
Abstract:
Tremendous progress in plant proteomics driven by mass spectrometry (MS) techniques has been made since 2000, when few proteomics reports were published and plant proteomics was in its infancy. These achievements include the refinement of existing techniques and the search for new techniques to address food security, safety, and health issues. It is projected that by 2050 the world's population will reach 9–12 billion people, demanding a 34–70% increase over today's food production (FAO, 2009). Providing food for such a demand in a sustainable and environmentally committed manner, without threatening natural resources, requires that agricultural production increase significantly and that postharvest handling and food manufacturing systems become more efficient, with lower energy expenditure, reduced postharvest losses, less waste generation, and food with longer shelf life. There is also a need to find alternatives to animal-based protein sources (i.e., plant-based ones) to meet the increase in protein demand by 2050. Thus, plant biology has a critical role to play as a science capable of addressing such challenges. In this review, we discuss proteomics, especially MS, as a platform utilized in plant biology research over the past 10 years with the potential to expedite the understanding of plant biology for human benefit. The increasing application of proteomics technologies to food security, analysis, and safety is emphasized in this review. However, we are aware that no single approach or technology can address the global food issues. Proteomics-generated information and resources must be integrated and correlated with other omics-based approaches, information, and conventional programs to ensure sufficient food and resources for human development now and in the future.
Abstract:
Proteolytic enzymes comprise approximately 2 percent of the human genome [1]. Given their abundance, it is not surprising that proteases have diverse biological functions, ranging from the degradation of proteins in lysosomes to the control of physiological processes such as the coagulation cascade. However, a subset of serine proteases (possessing serine residues within their catalytic sites), which may be soluble in the extracellular fluid or tethered to the plasma membrane, are signaling molecules that can specifically regulate cells by cleaving protease-activated receptors (PARs), a family of four G-protein-coupled receptors (GPCRs). These serine proteases include members of the coagulation cascade (e.g., thrombin, factor VIIa, and factor Xa), proteases from inflammatory cells (e.g., mast cell tryptase, neutrophil cathepsin G), and proteases from epithelial tissues and neurons (e.g., trypsins). They are often generated or released during injury and inflammation, and they cleave PARs on multiple cell types, including platelets, endothelial and epithelial cells, myocytes, fibroblasts, and cells of the nervous system. Activated PARs regulate many essential physiological processes, such as hemostasis, inflammation, pain, and healing. These proteases and their receptors have been implicated in human disease and are potentially important targets for therapy. Proteases and PARs participate in regulating most organ systems and are the subject of several comprehensive reviews [2, 3]. Within the central and peripheral nervous systems, proteases and PARs can control neuronal and astrocyte survival, proliferation and morphology, release of neurotransmitters, and the function and activity of ion channels, topics that have also been comprehensively reviewed [4, 5]. This chapter specifically concerns the ability of PARs to regulate TRPV channels of sensory neurons and thereby affect neurogenic inflammation and pain transmission [6, 7].
Abstract:
We have extensively evaluated the response of cloud-base drizzle rate (Rcb; mm day⁻¹) in warm clouds to liquid water path (LWP; g m⁻²) and to cloud condensation nuclei (CCN) number concentration (NCCN; cm⁻³), an aerosol proxy. This evaluation is based on a 19-month dataset of Doppler radar, lidar, microwave radiometer and aerosol observations from the Atmospheric Radiation Measurement (ARM) Mobile Facility deployments at the Azores and in Germany. Assuming 0.55% supersaturation to calculate NCCN, we found a power-law relationship among Rcb, LWP and NCCN, indicating that Rcb decreases by a factor of 2–3 as NCCN increases from 200 to 1000 cm⁻³ for fixed LWP. Additionally, the precipitation susceptibility to NCCN ranges between 0.5 and 0.9, in agreement with values from simulations and aircraft measurements. Surprisingly, the susceptibility of the probability of precipitation from our analysis is much higher than that from CloudSat estimates, but agrees well with simulations from a multi-scale high-resolution aerosol-climate model. Although scale issues are not completely resolved in the intercomparisons, our results are encouraging, suggesting that it is possible for multi-scale models to accurately simulate the response of LWP to aerosol perturbations.
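Although the fitted power-law expression itself is not reproduced in this excerpt, the quoted numbers bracket the implied exponent. A minimal back-of-the-envelope check (the function name is ours): for Rcb ∝ NCCN^(−b) at fixed LWP, a factor-of-2-to-3 reduction over the 200 → 1000 cm⁻³ range pins down b.

```python
import math

def implied_exponent(factor, n0=200.0, n1=1000.0):
    """Exponent b such that (n1/n0)**b equals the stated Rcb reduction
    factor over the CCN range n0..n1 (cm^-3)."""
    return math.log(factor) / math.log(n1 / n0)

b_low = implied_exponent(2.0)   # factor-of-2 reduction
b_high = implied_exponent(3.0)  # factor-of-3 reduction
```

The resulting bracket, roughly b ≈ 0.43–0.68, is broadly consistent with the precipitation susceptibility range of 0.5–0.9 quoted in the abstract.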
Abstract:
Radar refractivity retrievals can capture near-surface humidity changes, but noisy phase changes of the ground clutter returns limit the accuracy for both klystron- and magnetron-based systems. Observations with a C-band (5.6 cm) magnetron weather radar indicate that the correction for phase changes introduced by local oscillator frequency changes leads to refractivity errors no larger than 0.25 N units: equivalent to a relative humidity change of only 0.25% at 20°C. Requested stable local oscillator (STALO) frequency changes were accurate to 0.002 ppm based on laboratory measurements. More serious are the random phase change errors introduced when targets are not at the range-gate center and there are changes in the transmitter frequency (ΔfTx) or the refractivity (ΔN). Observations at C band with a 2-μs pulse show an additional 66° of phase change noise for a ΔfTx of 190 kHz (34 ppm); this allows the effect due to ΔN to be predicted. Even at S band with klystron transmitters, significant phase change noise should occur when a large ΔN develops relative to the reference period [e.g., ~55° when ΔN = 60 for the Next Generation Weather Radar (NEXRAD) radars]. At shorter wavelengths (e.g., C and X band) and with magnetron transmitters in particular, refractivity retrievals relative to an earlier reference period are even more difficult, and operational retrievals may be restricted to changes over shorter (e.g., hourly) periods of time. Target location errors can be reduced by using a shorter pulse or identified by a new technique making alternate measurements at two closely spaced frequencies, which could even be achieved with a dual–pulse repetition frequency (PRF) operation of a magnetron transmitter.
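The magnitude of the reported phase-change noise can be checked with a simple sketch, assuming the standard two-way relation Δφ = 360° · ΔfTx · 2δr/c for a target displaced δr from the range-gate centre (an illustrative formula and our own worked numbers, not taken verbatim from the paper):

```python
C = 3.0e8  # speed of light (m/s)

def phase_change_deg(delta_f_hz, offset_m):
    """Two-way phase change (degrees) for a target offset from the
    range-gate centre when the transmit frequency shifts by delta_f_hz."""
    return 360.0 * delta_f_hz * 2.0 * offset_m / C

# A 2-us pulse spans ~300 m in range, so targets can sit up to ~150 m
# from the gate centre; with the quoted 190 kHz transmitter shift:
worst_case = phase_change_deg(190e3, 150.0)  # ~68 degrees
```

Offsets up to ~150 m thus give phase changes up to roughly 68°, the same order as the additional 66° of phase-change noise observed, which is why shorter pulses (smaller possible offsets) reduce the target-location error.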
Abstract:
Web services are one of the most fundamental technologies for implementing service-oriented architecture (SOA) based applications. One essential challenge related to web services is finding suitable candidates for a consumer's request, normally called web service discovery. During web service discovery, the consumer often finds it hard to distinguish which candidates in the retrieval set are more suitable, making the selection of web services a critical task. In this paper, inspired by the idea that the service composition pattern is a significant hint for service selection, a personal profiling mechanism is proposed to improve ranking and recommendation performance. Since service selection is highly dependent on the composition process, personal knowledge is accumulated from previous service compositions and shared via collaborative filtering, in which a set of users with similar interests is first identified. Afterwards, a web service re-ranking mechanism is employed for personalised recommendation. Experimental studies are conducted and analysed to demonstrate the promising potential of this research.
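The collaborative-filtering step described above can be sketched minimally (the data structures and names are hypothetical; the paper's actual profiling mechanism is richer): users with similar composition histories are found by cosine similarity, and candidate services are re-ranked by their usage among those neighbours.

```python
import math

def cosine(u, v):
    """Cosine similarity between two usage-count dicts (service -> count)."""
    common = set(u) & set(v)
    num = sum(u[s] * v[s] for s in common)
    den = (math.sqrt(sum(x * x for x in u.values()))
           * math.sqrt(sum(x * x for x in v.values())))
    return num / den if den else 0.0

def rerank(candidates, target_profile, neighbour_profiles):
    """Re-rank candidate services by how heavily similar users composed
    them; profiles are hypothetical dicts of service id -> usage count."""
    sims = [(cosine(target_profile, p), p) for p in neighbour_profiles]
    def score(svc):
        return sum(sim * p.get(svc, 0) for sim, p in sims)
    return sorted(candidates, key=score, reverse=True)
```

The design choice mirrors the abstract: selection knowledge lives in composition histories, so two retrieval results with identical descriptions can still rank differently for different users.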
Abstract:
Automatic generation of classification rules has been an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for generating classification rules is the overfitting of training data. When dealing with Big Data, this may result in the generation of a large number of complex rules, which may not only increase computational cost but also lower the accuracy in predicting further unseen instances. This has led to the need for pruning methods that simplify rules. In addition, once generated, classification rules are used to make predictions. Where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set; a suitable structure is therefore required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They introduce some novel methods and techniques developed by them recently. These methods and techniques are also discussed in comparison to existing ones with respect to efficient processing of Big Data.
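The "first rule that fires" requirement maps naturally onto an ordered decision-list representation. A minimal sketch (illustrative only, not the authors' representation): rules are scanned in order and the first one whose conditions all hold determines the prediction, with a default class when nothing fires.

```python
def first_firing_rule(rules, instance):
    """Linear scan of an ordered rule list. Each rule is a (conditions,
    label) pair with conditions as {attribute: value}; returns the label
    of the first rule that fires, else a default class."""
    for conditions, label in rules:
        if all(instance.get(a) == v for a, v in conditions.items()):
            return label
    return "default"
```

Even this toy shows why representation matters for efficiency: a flat list costs O(rules × conditions) per prediction, which motivates the more compact rule representations the chapter discusses for large rule sets.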
Abstract:
The DIAMET (DIAbatic influences on Mesoscale structures in ExTratropical storms) project aims to improve forecasts of high-impact weather in extratropical cyclones through field measurements, high-resolution numerical modeling, and improved design of ensemble forecasting and data assimilation systems. This article introduces DIAMET and presents some of the first results. Four field campaigns were conducted by the project, one of which, in late 2011, coincided with an exceptionally stormy period marked by an unusually strong, zonal North Atlantic jet stream and a succession of severe windstorms in northwest Europe. As a result, December 2011 had the highest monthly North Atlantic Oscillation index (2.52) of any December in the last 60 years. Detailed observations of several of these storms were gathered using the UK's BAe146 research aircraft and extensive ground-based measurements. As an example of the results obtained during the campaign, observations are presented of cyclone Friedhelm on 8 December 2011, when surface winds with gusts exceeding 30 m s⁻¹ crossed central Scotland, leading to widespread disruption to transportation and electricity supply. Friedhelm deepened 44 hPa in 24 hours and developed a pronounced bent-back front wrapping around the storm center. The strongest winds at 850 hPa and the surface occurred in the southern quadrant of the storm, and detailed measurements showed these to be most intense in clear air between bands of showers. High-resolution ensemble forecasts from the Met Office showed similar features, with the strongest winds aligned in linear swaths between the bands, suggesting that there is potential for improved skill in forecasts of damaging winds.
Abstract:
Fracking in England has been the subject of significant controversy and has sparked not only public protest but also an associated framing war with differing social constructions of the technology adopted by different sides. This article explores the frames and counter-frames which have been employed by both the anti-fracking movement and by government and the oil and gas industry. It then considers the way in which the English planning and regulatory permitting systems have provided space for these frames within the relevant machinery for public participation. The article thus enables one to see which frames have been allowed a voice and which have been excluded.
Abstract:
The nuclear time-dependent Hartree-Fock model formulated in three-dimensional space, based on the full standard Skyrme energy density functional complemented with the tensor force, is presented. Full self-consistency is achieved by the model. The application to the isovector giant dipole resonance is discussed in the linear limit, ranging from spherical nuclei (¹⁶O and ¹²⁰Sn) to systems displaying axial or triaxial deformation (²⁴Mg, ²⁸Si, ¹⁷⁸Os, ¹⁹⁰W and ²³⁸U). Particular attention is paid to the spin-dependent terms from the central sector of the functional, recently included together with the tensor. They turn out to be capable of producing a qualitative change in the strength distribution in this channel. The effect on the deformation properties is also discussed. The quantitative effects on the linear response are small and, overall, the giant dipole energy remains unaffected. Calculations are compared to predictions from the (quasi)particle random-phase approximation and to experimental data where available, finding good agreement.
Abstract:
Interest in sustainable farming methods that rely on alternatives to conventional synthetic fertilizers and pesticides is increasing. Sustainable farming methods often utilize natural populations of predatory and parasitic species to control populations of herbivores, which may be potential pest species. We investigated the effects of several types of fertilizer, including those typical of sustainable and conventional farming systems, on the interaction between a herbivore and a parasitoid. The effects of fertilizer type on percentage parasitism, parasitoid performance, parasitoid attack behaviour and responses to plant volatiles were examined using a model Brassica system, consisting of Brassica oleracea var. capitata, Plutella xylostella (Lepidoptera) larvae and Cotesia vestalis (parasitoid). Percentage parasitism was greatest for P. xylostella larvae feeding on plants that had received either a synthetic ammonium nitrate fertilizer or were unfertilized, in comparison to those receiving a composite fertilizer containing hoof and horn. Parasitism was intermediate on plants fertilized with an organically produced animal manure. Male parasitoid tibia length showed the same pattern as percentage parasitism, an indication that offspring performance was maximized on the treatments preferred by female parasitoids for oviposition. Percentage parasitism and parasitoid size were not correlated with foliar nitrogen concentration. The parasitoids did not discriminate between hosts feeding on plants in the four fertilizer treatments in behaviour assays, but showed a preference for unfertilized plants in olfactometer experiments. The percentage parasitism and tibia length results provide support for the preference–performance hypothesis.
Abstract:
An analysis of diabatic heating and moistening processes in 12–36-hour lead-time forecasts from 12 global circulation models is presented as part of the "Vertical structure and physical processes of the Madden-Julian Oscillation (MJO)" project. A lead time of 12–36 hours is chosen to constrain the large-scale dynamics and thermodynamics to be close to observations while avoiding the initial spin-up period as the models adjust to being driven from the YOTC analysis. A comparison of the vertical velocity and rainfall with the observations and the YOTC analysis suggests that the phases of convection associated with the MJO are constrained in most models at this lead time, although the rainfall in the suppressed phase is typically overestimated. Although the large-scale dynamics is reasonably constrained, moistening and heating profiles have large inter-model spread. In particular, there are large spreads in convective heating and moistening at mid-levels during the transition to active convection. Radiative heating and cloud parameters have the largest relative spread across models at upper levels during the active phase. A detailed analysis of time-step behaviour shows that some models exhibit strong intermittency in rainfall and that the relationship between precipitation and dynamics differs between models. The wealth of model outputs archived during this project is a very valuable resource for model developers beyond the study of the MJO. In addition, the findings of this study can inform the design of process-model experiments, and the priorities for field experiments and future observing systems.