964 results for Systems development
Abstract:
Drought is a global problem that has far-reaching impacts, especially on vulnerable populations in developing regions. This paper highlights the need for a Global Drought Early Warning System (GDEWS), the elements that constitute its underlying framework (GDEWF) and the recent progress made towards its development. Many countries lack drought monitoring systems, as well as the capacity to respond via appropriate political, institutional and technological frameworks, and these gaps have inhibited the development of integrated drought management plans or early warning systems. The GDEWS will provide a source of drought tools and products via the GDEWF for countries and regions to develop drought early warning systems tailored to their own users. A key goal of a GDEWS is to maximize the lead time for early warning, allowing drought managers and disaster coordinators more time to put mitigation measures in place to reduce vulnerability to drought. To address this, the GDEWF will take both a top-down approach to provide global real-time drought monitoring and seasonal forecasting, and a bottom-up approach that builds upon existing national and regional systems to provide continental to global coverage. A number of challenges must be overcome, however, before a GDEWS can become a reality, including the lack of in-situ measurement networks, modest seasonal forecast skill in many regions, and the lack of infrastructure to translate data into useable information. A set of international partners, through a series of recent workshops and evolving collaborations, has made progress towards meeting these challenges and developing a global system.
Abstract:
Whilst hydrological systems can show resilience to short-term streamflow deficiencies during within-year droughts, prolonged deficits during multi-year droughts are a significant threat to water resources security in Europe. This study uses a threshold-based objective classification of regional hydrological drought to qualitatively examine the characteristics, spatio-temporal evolution and synoptic climatic drivers of multi-year drought events in 1962–64, 1975–76 and 1995–97, on a European scale but with particular focus on the UK. Whilst all three events are multi-year, pan-European phenomena, their development and causes can be contrasted. The critical factor in explaining the unprecedented severity of the 1975–76 event is the consecutive occurrence of winter and summer drought. In contrast, 1962–64 was a succession of dry winters, mitigated by quiescent summers, whilst 1995–97 lacked spatial coherence and was interrupted by wet interludes. Synoptic climatic conditions vary within and between multi-year droughts, suggesting that regional factors modulate the climate signal in streamflow drought occurrence. Despite being underpinned by qualitatively similar climatic conditions and commonalities in evolution and characteristics, each of the three droughts has a unique spatio-temporal signature. An improved understanding of the spatio-temporal evolution and characteristics of multi-year droughts has much to contribute to monitoring and forecasting capability, and to improved mitigation strategies.
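A minimal sketch of the general threshold-level approach to identifying hydrological drought events referred to above (this is not the authors' regional classification scheme; the percentile threshold, minimum duration and synthetic flow series are illustrative assumptions):

```python
import numpy as np

def drought_events(flow, threshold_pct=20, min_days=30):
    """Flag drought periods where daily streamflow stays below a fixed
    percentile threshold for at least a minimum duration.

    flow: 1-D array of daily streamflow values.
    threshold_pct: flow percentile used as the drought threshold.
    min_days: minimum run length to count as a drought event.
    """
    threshold = np.percentile(flow, threshold_pct)
    below = flow < threshold
    events, start = [], None
    for i, b in enumerate(below):
        if b and start is None:
            start = i
        elif not b and start is not None:
            if i - start >= min_days:
                events.append((start, i - 1))
            start = None
    if start is not None and len(flow) - start >= min_days:
        events.append((start, len(flow) - 1))
    return threshold, events

# Example with synthetic data: a prolonged deficit inserted into noisy flow.
rng = np.random.default_rng(0)
flow = rng.gamma(2.0, 5.0, size=3650)
flow[1200:1500] *= 0.2
print(drought_events(flow))
```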
Abstract:
Polymers are used in many everyday technologies and their degradation due to environmental exposure has led to great interest in materials which can heal and repair themselves. In order to design new self-healing polymers it is important to understand the fundamental healing mechanisms behind the material. Healable Polymer Systems will outline the key concepts and mechanisms underpinning the design and processing of healable polymers, and indicate potential directions for progress in the future development and applications of these fascinating and potentially valuable materials. The book covers the different techniques developed successfully to date for both autonomous healable materials (those which do not require an external stimulus to promote healing) and rehealable or remendable materials (those which only recover their original physical properties if a specific stimulus is applied). These include the encapsulated-monomer approach, reversible covalent bond formation, irreversible covalent bond formation and supramolecular self-assembly, providing detailed insights into their chemistry. Written by leading experts, the book provides polymer scientists with a compact and readily accessible source of reference for healable polymer systems.
Abstract:
The goal of this article is to make an epistemological and theoretical contribution to the nascent field of third language (L3) acquisition and show how examining L3 development can offer a unique view into longstanding debates within L2 acquisition theory. We offer the Phonological Permeability Hypothesis (PPH), which maintains that examining the development of an L3/Ln phonological system and its effects on a previously acquired L2 phonological system can inform contemporary debates regarding the mental constitution of postcritical period adult phonological acquisition. We discuss the predictions and functional significance of the PPH for adult SLA and multilingualism studies, detailing a methodology that examines the effects of acquiring Brazilian Portuguese on the Spanish phonological systems learned before and after the so-called critical period (i.e., comparing simultaneous versus successive adult English-Spanish bilinguals learning Brazilian Portuguese as an L3).
Abstract:
Introduction. Feature usage is a prerequisite to realising the benefits of investments in feature-rich systems. We propose that conceptualising the dependent variable 'system use' as 'level of use' and specifying it as a formative construct has greater value for measuring the post-adoption use of feature-rich systems. We then validate the content of the construct as a first step in developing a research instrument to measure it. The context of our study is the post-adoption use of electronic medical records (EMR) by primary care physicians. Method. Initially, a literature review of the empirical context defines the scope based on prior studies. The core features identified from the literature are then refined with the help of experts in a consensus-seeking process that follows the Delphi technique. Results. The methodology was successfully applied to EMRs, which were selected as an example of feature-rich systems. A review of EMR usage and regulatory standards provided the feature input for the first round of the Delphi process. A panel of experts then reached consensus after four rounds, identifying ten task-based features that would be indicators of level of use. Conclusions. To study why some users deploy more advanced features than others, theories of post-adoption require a rich formative dependent variable that measures level of use. We have demonstrated that a context-sensitive literature review followed by refinement through a consensus-seeking process is a suitable methodology to validate the content of this dependent variable. This is the first step of instrument development prior to statistical confirmation with a larger sample.
Abstract:
The study of intuition is an emerging area of research in psychology, social sciences, and business studies. It is increasingly of interest to the study of management, for example in decision-making as a counterpoint to structured approaches. Recently, work has been undertaken to conceptualize a construct for the intuitive nature of technology. However, to date there is no common understanding of the term intuition in information systems (IS) research. This paper extends the study of intuition in IS research by using exploratory research to categorize the use of the word "intuition" and related terms in papers published in two prominent IS journals over a ten-year period. The entire text of MIS Quarterly and Information Systems Research was reviewed for the years 1999 through 2008 using searchable PDF versions of these publications. As far as could be determined, this is the first application of this approach in the analysis of the text of IS academic journals. The use of the word "intuition" and related terms was categorized using coding consistent with Grounded Theory. The focus of this research was on the first two stages of Grounded Theory analysis - the development of codes and constructs. Saturation of coding was not reached: an extended review of these publications would be required to enable theory development. Over 400 incidents of the use of "intuition" and related terms were found in the articles reviewed. The most prominent use of the term "intuition" was coded as "Intuition as Authority", in which intuition was used to validate a research objective or finding; this represented approximately 37 per cent of the codes assigned. The second most common coding occurred in research articles with mathematical analysis, representing about 19 per cent of the codes assigned, for example where a mathematical formulation or result was "intuitive". The possibly most impactful use of the term "intuition" was "Intuition as Outcome", representing approximately 7 per cent of all coding, which characterized research results as adding to the intuitive understanding of a research topic or phenomenon. This research contributes to a greater theoretical understanding of intuition, enabling insight into the use of intuition and the eventual development of a theory on the use of intuition in academic IS research publications. It also provides potential benefits to practitioners by providing insight into and validation of the use of intuition in IS management. Research directions include the creation of reflective and/or formative constructs for intuition in information systems research.
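The term-identification step described above can be approximated with a short sketch (a hypothetical illustration only; the authors' Grounded Theory codes were developed iteratively and are not reproduced here):

```python
import re
from collections import Counter

# Matches 'intuition', 'intuitive', 'intuitively' and other related word forms.
TERMS = re.compile(r"\bintuit\w*", re.IGNORECASE)

def count_intuition_terms(article_text):
    """Count occurrences of 'intuition' and related word forms in one article."""
    return Counter(m.group(0).lower() for m in TERMS.finditer(article_text))

sample = ("The result is intuitive; intuition suggests the construct "
          "maps onto intuitively appealing categories.")
print(count_intuition_terms(sample))
# Counter({'intuitive': 1, 'intuition': 1, 'intuitively': 1})
```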
Abstract:
Roots are important to plants for a wide variety of processes, including nutrient and water uptake, anchoring and mechanical support, storage functions, and as the major interface between the plant and various biotic and abiotic factors in the soil environment. Therefore, understanding the development and architecture of roots holds potential for the manipulation of root traits to improve the productivity and sustainability of agricultural systems and to better understand and manage natural ecosystems. While lateral root development is a traceable process along the primary root and different stages can be found along this longitudinal axis of time and development, root system architecture is complex and difficult to quantify. Here, we comment on assays to describe lateral root phenotypes and propose ways to move forward regarding the description of root system architecture, also considering crops and the environment.
Abstract:
Previously, governments have responded to the impacts of economic failures and consequently have developed more regulations to protect employees, customers, shareholders and the economic wellbeing of the state. Our research addresses how Accounting Information Systems (AIS) may act as carriers for institutionalised practices associated with maintaining regulatory compliance within the context of UK Asset Management Houses. The AIS was found to be a strong conduit for institutionalized compliance related practices, utilising symbolic systems, relational systems, routines and artefacts to carry approaches relating to regulative, normative and cultural-cognitive strands of institutionalism. Thus, AIS are integral to the development and dissipation of best practice for the management of regulatory compliance. As institutional elements are clearly present we argue that AIS and regulatory compliance provide a rich context to further institutionalism. Since AIS may act as conduits for regulatory approaches, both systems adopters and clients may benefit from actively seeking to codify and abstract best practices into AIS. However, the application of generic institutionalized approaches, which may be applied across similar organizations, must be tempered with each firm’s business environment and associated regulatory exposure. A balance should be sought between approaches specific enough to be useful but generic enough to be universally applied.
Abstract:
The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models 'prosumer' agents (i.e., producers and/or consumers of energy) and 'aggregator' agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile. Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.
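A minimal agent-based sketch of the prosumer/aggregator idea (this is not the CASCADE Framework itself; the demand model, smoothing rule and parameter values are illustrative assumptions):

```python
import random

class Prosumer:
    """Household that shifts a share of flexible demand in response to a
    signal broadcast by an aggregator (illustrative only)."""
    def __init__(self, base_demand, flexibility):
        self.base_demand = base_demand
        self.flexibility = flexibility  # fraction of demand that can shift

    def demand(self, signal):
        # High signal -> defer flexible load; low signal -> use it now.
        return self.base_demand * (1.0 - self.flexibility * signal)

class Aggregator:
    """Mediating agent that smooths the raw wholesale price into a gentler
    signal, aiming to flatten aggregate demand rather than amplify swings."""
    def __init__(self, smoothing=0.5):
        self.smoothing = smoothing
        self.signal = 0.0

    def update(self, wholesale_price):
        self.signal = (1 - self.smoothing) * self.signal + self.smoothing * wholesale_price
        return self.signal

random.seed(1)
households = [Prosumer(random.uniform(0.5, 1.5), 0.3) for _ in range(100)]
agg = Aggregator()
for step in range(5):
    price = random.random()               # stand-in for a wholesale price series
    signal = agg.update(price)
    total = sum(h.demand(signal) for h in households)
    print(f"step {step}: signal={signal:.2f} total demand={total:.1f}")
```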
Abstract:
European grassland-based livestock production systems are challenged to produce more milk and meat to meet increasing world demand and to achieve this by using fewer resources. Legumes offer great potential for coping with such requests. They have numerous features that can act together at different stages in the soil-plant-animal-atmosphere system and these are most effective in mixed swards with a legume abundance of 30-50%. The resulting benefits are a reduced dependency on fossil energy and industrial N fertilizer, lower quantities of harmful emissions to the environment (greenhouse gases and nitrate), lower production costs, higher productivity and increased protein self-sufficiency. Some legume species offer opportunities for improving animal health with less medication due to bioactive secondary metabolites. In addition, legumes may offer an option for adapting to higher atmospheric CO2 concentrations and to climate change. Legumes generate these benefits at the level of the managed land area unit and also at the level of the final product unit. However, legumes suffer from some limitations, and suggestions are made for future research in order to exploit more fully the opportunities that legumes can offer. In conclusion, the development of legume-based grassland-livestock systems undoubtedly constitutes one of the pillars for more sustainable and competitive ruminant production systems, and it can only be expected that legumes will become more important in the future.
Abstract:
European grassland-based livestock production systems face the challenge of producing more meat and milk to meet increasing world demands and to achieve this using fewer resources. Legumes offer great potential for achieving these objectives. They have numerous features that can act together at different stages in the soil–plant–animal–atmosphere system, and these are most effective in mixed swards with a legume proportion of 30–50%. The resulting benefits include reduced dependence on fossil energy and industrial N-fertilizer, lower quantities of harmful emissions to the environment (greenhouse gases and nitrate), lower production costs, higher productivity and increased protein self-sufficiency. Some legume species offer opportunities for improving animal health with less medication, due to the presence of bioactive secondary metabolites. In addition, legumes may offer an adaptation option to rising atmospheric CO2 concentrations and climate change. Legumes generate these benefits at the level of the managed land-area unit and also at the level of the final product unit. However, legumes suffer from some limitations, and suggestions are made for future research to exploit more fully the opportunities that legumes can offer. In conclusion, the development of legume-based grassland–livestock systems undoubtedly constitutes one of the pillars for more sustainable and competitive ruminant production systems, and it can be expected that forage legumes will become more important in the future.
Abstract:
Information systems integration aims at the interaction, information exchange and interoperability between information systems, devices and units. Research efforts have contributed to the evaluation of information systems integration through the development of evaluation frameworks. To improve the usability and measurability of evaluation, a review of existing evaluation frameworks, including their evolution and classifications of different interoperability levels, is conducted. The theory of organisational semiotics is used for a comparative analysis of the frameworks and to outline future work.
Abstract:
The modern built environment has become more complex in terms of building types, environmental systems and use profiles. This complexity causes difficulties in optimising building energy design. In this circumstance, introducing a set of prototype reference buildings, or so-called benchmark buildings, that are able to represent all or the majority of the UK building stock may be useful for examining the impact of national energy policies on building energy consumption. This study proposes a set of reference office buildings for England and Wales based on the information collected from the Non-Domestic Building Stock (NDBS) project and an intensive review of the existing building benchmarks. The proposed building benchmark comprises 10 prototypical reference buildings, which, in relation to built form and size, represent 95% of office buildings in England and Wales. This building benchmark provides a platform for those involved in building energy simulations to evaluate energy-efficiency measures and for policy-makers to assess the influence of different building energy policies.
Abstract:
Understanding how and why the capability of one set of business resources, its structural arrangements and mechanisms, works compared to another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to qualitatively and quantitatively compare capabilities. Gibson's theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the 'how' and 'why' of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model and the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the capabilities of resources that enable the capability to inject a drug and anaesthetise a patient.
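A toy illustration of how a Petri-net-style transition can express whether resource tokens enable the 'inject a drug and anaesthetise a patient' capability (this greatly simplifies Coloured Petri Nets and is not the CAM formalism; the places, guard and attribute values are assumptions made for the sketch):

```python
# Places hold "coloured" tokens: dictionaries carrying resource attributes.
places = {
    "clinician": [{"role": "anaesthetist"}],
    "drug": [{"type": "anaesthetic", "dose_mg": 50}],
    "patient": [{"state": "awake"}],
    "anaesthetised_patient": [],
}

def fire_anaesthetise(places):
    """Fire the 'inject drug' transition if every input place holds a token
    whose attributes satisfy the guard; tokens are consumed and produced."""
    if (places["clinician"] and places["drug"] and places["patient"]
            and places["drug"][0]["dose_mg"] >= 30):
        places["drug"].pop(0)
        patient = places["patient"].pop(0)
        patient["state"] = "anaesthetised"
        places["anaesthetised_patient"].append(patient)
        return True
    return False

print(fire_anaesthetise(places), places["anaesthetised_patient"])
```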
Abstract:
Performance modelling is a useful tool in the lifecycle of high performance scientific software, such as weather and climate models, especially as a means of ensuring efficient use of available computing resources. In particular, sufficiently accurate performance prediction could reduce the effort and experimental computer time required when porting and optimising a climate model to a new machine. In this paper, traditional techniques are used to predict the computation time of a simple shallow water model which is illustrative of the computation (and communication) involved in climate models. These models are compared with real execution data gathered on AMD Opteron-based systems, including several phases of the U.K. academic community HPC resource, HECToR. Some success is achieved in relating source code to achieved performance for the K10 series of Opterons, but the method is found to be inadequate for the next-generation Interlagos processor. This experience leads to the investigation of a data-driven application benchmarking approach to performance modelling. Results for an early version of the approach are presented using the shallow water model as an example.
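A minimal sketch of the data-driven flavour of performance prediction mentioned above: fit a simple runtime-versus-problem-size model to benchmark measurements and extrapolate (the grid sizes, timings and linear model are illustrative assumptions, not results from the paper):

```python
import numpy as np

# Hypothetical wall-clock times (s) of a shallow-water-style kernel at
# several grid sizes on one node; the numbers are invented for this sketch.
grid_points = np.array([64**2, 128**2, 256**2, 512**2], dtype=float)
runtime_s = np.array([0.011, 0.047, 0.190, 0.780])

# Fit t(n) = a + b*n by least squares: the simplest data-driven model of
# computation time as a function of problem size.
A = np.vstack([np.ones_like(grid_points), grid_points]).T
(a, b), *_ = np.linalg.lstsq(A, runtime_s, rcond=None)

print(f"predicted runtime at 1024^2 points: {a + b * 1024**2:.2f} s")
```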