920 results for current state analysis
Abstract:
Recent studies of the current state of rural education and training (RET) systems in sub-Saharan Africa have assessed their ability to provide for the learning needs essential for more knowledgeable and productive small-scale rural households. Meeting these needs is essential if the endemic causes of rural poverty (poor nutrition, lack of sustainable livelihoods, etc.) are to be overcome. A brief historical background and an analysis of the major current constraints to improvement in the sector are discussed. Paramount among the factors behind the sector's present 'malaise' are the lack of a whole-systems perspective and the absence of any coherent policy framework in most countries. There is evidence of some recent innovations, both in the public sector and through the work of non-governmental organisations (NGOs), civil society organisations (CSOs) and other private bodies. These provide hope of a new sense of direction that could lead towards meaningful 'revitalisation' of the sector. A suggested framework offers 10 key steps which, it is argued, could largely be achieved with modest internal resources and very little external support, provided that the necessary leadership and managerial capacities are in place.
Abstract:
Mass spectrometry (MS) is an important analytical tool in clinical proteomics, primarily in the disease-specific discovery, identification and characterisation of proteomic biomarkers and patterns. MS-based proteomics is increasingly used in clinical validation and diagnostic method development. The latter departs from the typical application of MS-based proteomics by trading some analytical performance for the throughput, robustness and simplicity required for clinical diagnostics. Although conventional MS-based proteomics has become an important field in clinical applications, some of the most recent MS technologies have not yet been extensively applied in clinical proteomics. In this review, we describe the current state of MS in clinical proteomics and look to the future of this field.
Abstract:
Many projects, e.g. VIKEF [13] and KIM [7], present grounded approaches for the use of entities as a means of indexing and retrieval of multimedia resources from heterogeneous sources. In this paper, we discuss the state of the art of entity-centric approaches for multimedia indexing and retrieval. A summary of projects employing entity-centric repositories is presented. This paper also looks at a current state-of-the-art authoring environment, Macromedia Authorware, and the potential extension of this environment for entity-based multimedia authoring.
Abstract:
This book is a collection of articles devoted to the theory of linear operators in Hilbert spaces and its applications. The subjects covered range from the abstract theory of Toeplitz operators to the analysis of very specific differential operators arising in quantum mechanics, electromagnetism, and the theory of elasticity; the stability of numerical methods is also discussed. Many of the articles deal with spectral problems for not necessarily self-adjoint operators. Some of the articles are surveys outlining the current state of the subject and presenting open problems.
Abstract:
Three main changes to current risk analysis processes are proposed to improve their transparency, openness, and accountability. First, the addition of a formal framing stage would allow interested parties, experts and officials to work together as needed to gain an initial shared understanding of the issue, the objectives of regulatory action, and alternative risk management measures. Second, the scope of the risk assessment would be expanded to include the assessment of health and environmental benefits as well as risks, and the explicit consideration of the economic and social impacts of risk management action and their distribution. In addition, approaches were developed for deriving improved information from genomic, proteomic and metabolomic profiling methods and for probabilistic modelling of health impacts for risk assessment purposes. Third, in an added evaluation stage, interested parties, experts, and officials may compare and weigh the risks, costs, and benefits and their distribution. As part of a set of recommendations on risk communication, we propose that reports on each stage should be made public.
Abstract:
The present paper summarizes the consensus views of a group of nine European clinicians and scientists on the current state of scientific knowledge on probiotics, covering those areas where there is substantial evidence for beneficial effects and those where the evidence base is poor or inconsistent. There was general agreement that probiotic effects were species and often strain specific. The experts agreed that some probiotics were effective in reducing the incidence and duration of rotavirus diarrhoea in infants, antibiotic-associated diarrhoea in adults and, for certain probiotics, Clostridium difficile infections. Some probiotics are associated with symptomatic improvements in irritable bowel syndrome and alleviation of digestive discomfort. Probiotics can reduce the frequency and severity of necrotizing enterocolitis in premature infants and have been shown to regulate intestinal immunity. Several other clinical effects of probiotics, including their role in inflammatory bowel disease, atopic dermatitis, respiratory or genito-urinary infections, or H. pylori adjuvant treatment, were thought promising but inconsistent.
Abstract:
Developing high-quality scientific research will be most effective if research communities with diverse skills and interests are able to share information and knowledge, are aware of the major challenges across disciplines, and can exploit economies of scale to provide robust answers and better inform policy. We evaluate opportunities and challenges facing the development of a more interactive research environment by developing an interdisciplinary synthesis of research on a single geographic region. We focus on the Amazon, as it is of enormous regional and global environmental importance and faces a highly uncertain future. To take stock of existing knowledge and provide a framework for analysis, we present a set of mini-reviews from fourteen different areas of research, encompassing taxonomy, biodiversity, biogeography, vegetation dynamics, landscape ecology, earth-atmosphere interactions, ecosystem processes, fire, deforestation dynamics, hydrology, hunting, conservation planning, livelihoods, and payments for ecosystem services. Each review highlights the current state of knowledge and identifies research priorities, including major challenges and opportunities. We show that while substantial progress is being made across many areas of scientific research, our understanding of specific issues is often dependent on knowledge from other disciplines. Accelerating the acquisition of reliable and contextualized knowledge about the fate of complex pristine and modified ecosystems depends partly on our ability to exploit economies of scale in shared resources and technical expertise, recognise and make explicit the interconnections and feedbacks among sub-disciplines, increase the temporal and spatial scale of existing studies, and improve the dissemination of scientific findings to policy makers and society at large. Enhancing interaction among research efforts is vital if we are to make the most of limited funds and overcome the challenges posed by addressing large-scale interdisciplinary questions. Bringing together a diverse scientific community with a single geographic focus can help increase awareness of research questions both within and among disciplines, and reveal the opportunities that may exist for advancing acquisition of reliable knowledge. This approach could be useful for a variety of globally important scientific questions.
Abstract:
Deception-detection is the crux of Turing's experiment to examine machine thinking, conveyed through a capacity to respond with sustained and satisfactory answers to unrestricted questions put by a human interrogator. However, in the 60 years to the month since the publication of Computing Machinery and Intelligence, little agreement exists on a canonical format for Turing's textual game of imitation, deception and machine intelligence. From the trapped mine of philosophical claims, counter-claims and rebuttals, this research recovers Turing's own distinct five-minute question-answer imitation game, which he envisioned practicalised in two different ways: a) a two-participant, interrogator-witness viva voce; b) a three-participant comparison of a machine with a human, both questioned simultaneously by a human interrogator. Using the 18th Loebner Prize for Artificial Intelligence contest and Colby et al.'s 1972 transcript-analysis paradigm, this research practicalised Turing's imitation game with over 400 human participants and 13 machines across three original experiments. Results show that, at the current state of technology, machines achieved a deception rate of 8.33% (5 of 60 human-machine simultaneous comparison tests). Results also show that more than 1 in 3 reviewers succumbed to hidden-interlocutor misidentification after reading transcripts from experiment 2. Deception-detection is essential to uncover the increasing number of malfeasant programmes, such as CyberLover, developed to steal identities and financially defraud users in chatrooms across the Internet. Practicalising Turing's two tests can assist in understanding natural dialogue and mitigating the risk from cybercrime.
Abstract:
We propose and analyse a class of evolving network models suitable for describing a dynamic topological structure. Applications include telecommunication, online social behaviour and information processing in neuroscience. We model the evolving network as a discrete-time Markov chain and study a very general framework where, conditioned on the current state, edges appear or disappear independently at the next timestep. We show how to exploit symmetries in the microscopic, localized rules in order to obtain conjugate classes of random graphs that simplify the analysis and calibration of a model. Further, we develop a mean-field theory for describing network evolution. For a simple but realistic scenario incorporating the triadic closure effect that has been empirically observed by social scientists (friends of friends tend to become friends), the mean-field theory predicts bistable dynamics, and computational results confirm this prediction. We also discuss calibration against a set of real cell-phone data, and find support for a stratified model, where individuals are assigned to one of two distinct groups having different within-group and across-group dynamics.
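To make the update rule concrete, here is a minimal simulation sketch of such a chain: conditioned on the current adjacency matrix, each absent edge appears with a probability that grows with the number of common neighbours (a triadic-closure term) and each present edge disappears with a fixed probability. The linear birth rule and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(A, delta, eps, omega):
    """One Markov-chain timestep. Conditioned on the current state A,
    every unordered pair updates independently:
      absent edge (i, j):  appears with prob delta + eps * (# common neighbours)
      present edge (i, j): disappears with prob omega
    """
    n = A.shape[0]
    common = A @ A                          # common[i, j] = shared neighbours
    p_birth = np.clip(delta + eps * common, 0.0, 1.0)
    coins = np.triu(rng.random((n, n)), 1)
    coins = coins + coins.T                 # one symmetric coin per pair
    born = (A == 0) & (coins < p_birth)
    kept = (A == 1) & (coins >= omega)
    A_new = (born | kept).astype(int)
    np.fill_diagonal(A_new, 0)              # no self-loops
    return A_new

# Start sparse and iterate; with a triadic-closure term the mean-field
# theory predicts bistability (low- vs high-density equilibria).
n, T = 100, 500
A = np.triu((rng.random((n, n)) < 0.02).astype(int), 1)
A = A + A.T
for _ in range(T):
    A = step(A, delta=1e-4, eps=5e-3, omega=0.05)
print(f"edge density after {T} steps: {A.sum() / (n * (n - 1)):.3f}")
```

Sweeping the initial density (or `eps`) and recording the long-run density is one way to look for the two stable regimes that the mean-field analysis predicts.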
Abstract:
First defined in the mid-1990s, prebiotics, which alter the composition and activity of gastrointestinal (GI) microbiota to improve health and well-being, have generated scientific and consumer interest and regulatory debate. The Life Sciences Research Organization, Inc. (LSRO) held a workshop, Prebiotics and the Health Benefits of Fiber: Future Research and Goals, in February 2011 to assess the current state of the science and the international regulatory environment for prebiotics, identify research gaps, and create a strategy for future research. A developing body of evidence supports a role for prebiotics in reducing the risk and severity of GI infection and inflammation, including diarrhea, inflammatory bowel disease, and ulcerative colitis, as well as bowel function disorders, including irritable bowel syndrome. Prebiotics also increase the bioavailability and uptake of minerals, and data suggest that they reduce the risk of obesity by promoting satiety and weight loss. Additional research is needed to define the relationship between the consumption of different prebiotics and improvement of human health. New information derived from the characterization of the composition and function of different prebiotics as well as the interactions among and between gut microbiota and the human host would improve our understanding of the effects of prebiotics on health and disease and could assist in surmounting regulatory issues related to prebiotic use.
Abstract:
The current state of the art and direction of research in computer vision aimed at automating the analysis of CCTV images is presented. This includes low-level identification of objects within the field of view of cameras, tracking those objects over time and between cameras, and the interpretation of those objects' appearance and movements with respect to models of behaviour (and hence the inference of intentions). The potential ethical problems (and some potential opportunities) such developments may pose if and when deployed in the real world are presented, and suggestions are made as to the new regulations that will be needed if such systems are not to further enhance the power of the surveillers against the surveilled.
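As a pointer to what the lowest-level stage involves, the sketch below uses OpenCV background subtraction to find moving objects in a video, the first step of the pipeline described above. The file name is a placeholder, and real systems layer cross-camera tracking and behaviour models on top of this step.

```python
import cv2

cap = cv2.VideoCapture("cctv_feed.mp4")            # hypothetical input clip
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                 # foreground mask for this frame
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours
             if cv2.contourArea(c) > 500]          # ignore small blobs
    # a tracker would now associate `boxes` with tracks from earlier frames,
    # and a behaviour model would classify each track's motion pattern

cap.release()
```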
Abstract:
The purpose of this study was to develop an understanding of the current state of scientific data sharing that stakeholders could use to develop and implement effective data sharing strategies and policies. The study developed a conceptual model to describe the process of data sharing, and the drivers, barriers, and enablers that determine stakeholder engagement. The conceptual model was used as a framework to structure discussions and interviews with key members of all stakeholder groups. Analysis of data obtained from interviewees identified a number of themes that highlight key requirements for the development of a mature data sharing culture.
Abstract:
China's financial system has experienced a series of major reforms in recent years. Efforts have been made towards introducing the shareholding system in state-owned commercial banks, restructuring securities firms, reorganising the equity of joint-venture insurance companies, further improving the corporate governance structure, managing financial risks and, ultimately, establishing a system to protect investors (Xinhua, 2010). Financial product innovation, together with the further opening up of financial markets and the development of the insurance and bond markets, has increased liquidity as well as reduced financial risks. Financial innovations can benefit the economy, but the U.S. subprime crisis showed that, without proper control, they may lead to unexpected consequences. Kirkpatrick (2009) argues that failures and weaknesses in corporate governance arrangements, together with insufficient accounting standards and regulatory requirements, contributed to the financial crisis. Like the financial crises of the previous decade, the global financial crisis that erupted in 2008 surfaced a variety of significant corporate governance failures: the dysfunction of market mechanisms, the lack of transparency and accountability, misaligned compensation arrangements and the late response of government, all of which encouraged management short-termism, poor risk management and some fraudulent schemes. The unique characteristics of the Chinese banking system make it an interesting case for studying post-crisis corporate governance reform. Considering that China modelled its governance system on the Anglo-American system, this paper examines the impact of the financial crisis on corporate governance reform in developed economies and, particularly, China's reform of its financial sector. The paper further analyses the Chinese government's role in bank supervision and risk management. In this regard, the paper contributes to the corporate governance literature within the Chinese context by providing insights into the factors contributing to the corporate governance failures that led to the global financial crisis. It also provides policy recommendations for China's policy makers to consider seriously. The results suggest a need for the re-examination of corporate governance adequacy and the institutionalisation of business ethics. The paper's next section provides a review of China's financial system with reference to the financial crisis, followed by a critical evaluation of the capitalistic system and a review of the Anglo-American and Continental European models. It then analyses the need for a new corporate governance model in China by considering the bank failures in developed economies and the potential risks and inefficiencies in the current state-controlled system. The paper closes by reflecting on the need for Chinese policy makers to continually develop, adapt and rewrite corporate governance practices capable of meeting new challenges, and to pay attention to business ethics, an issue which goes beyond regulation.
Abstract:
Operational forecasting centres are currently developing data assimilation systems for coupled atmosphere-ocean models. Strongly coupled assimilation, in which a single assimilation system is applied to the coupled model, presents significant technical and scientific challenges. Hence weakly coupled assimilation systems are being developed as a first step, in which the coupled model is used to compare the current state estimate with observations, but corrections to the atmosphere and ocean initial conditions are calculated independently. In this paper we provide a comprehensive description of the different coupled assimilation methodologies in the context of four-dimensional variational assimilation (4D-Var) and use an idealised framework to assess the expected benefits of moving towards coupled data assimilation. We implement an incremental 4D-Var system within an idealised single-column atmosphere-ocean model. The system can run strongly coupled, weakly coupled and uncoupled (atmosphere-only or ocean-only) assimilations, allowing a systematic comparison of the different strategies for treating the coupled data assimilation problem. We present results from a series of identical-twin experiments devised to investigate the behaviour and sensitivities of the different approaches. Overall, our study demonstrates the potential benefits that may be expected from coupled data assimilation. Compared with uncoupled initialisation, coupled assimilation produces more balanced initial analysis fields, reducing initialisation shock and its impact on the subsequent forecast. Single-observation experiments demonstrate how coupled assimilation systems pass information between the atmosphere and ocean and can therefore use near-surface data to greater effect. We show that much of this benefit may also be gained from a weakly coupled assimilation system, but that this can be sensitive to the parameters used in the assimilation.
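The cross-medium information flow can be illustrated with a much-reduced sketch. The paper's system is an incremental 4D-Var in a single-column model; here, a single linear (BLUE/3D-Var-type) analysis of a two-variable state with an assumed atmosphere-ocean cross-covariance stands in for it, showing how a coupled error covariance lets one atmospheric observation update the ocean, while a block-diagonal covariance (corrections calculated independently) leaves the ocean untouched. All numbers are illustrative.

```python
import numpy as np

# State x = [T_atm, T_ocn]; one atmospheric observation y.
# Analysis: xa = xb + K (y - H xb),  with gain K = B H^T (H B H^T + R)^{-1}
xb = np.array([15.0, 10.0])                  # background [T_atm, T_ocn]
y = np.array([16.0])                         # observed T_atm
R = np.array([[0.2]])                        # obs-error variance
H = np.array([[1.0, 0.0]])                   # observe the atmosphere only

def analysis(B):
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return xb + (K @ (y - H @ xb)).ravel()

B_coupled = np.array([[1.0, 0.6],            # atmosphere-ocean
                      [0.6, 0.5]])           # cross-covariance (assumed)
B_block = np.diag(np.diag(B_coupled))        # cross-terms discarded

print("coupled-B analysis:     ", analysis(B_coupled))   # both components move
print("block-diagonal analysis:", analysis(B_block))     # only T_atm moves
```

With the coupled covariance, the ocean temperature is nudged towards the atmospheric observation through the cross-term; with the block-diagonal covariance, the same innovation corrects the atmosphere alone, which is one mechanism behind the initialisation shock that coupled assimilation aims to reduce.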
Abstract:
Optimal state estimation is a method that requires minimising a weighted, nonlinear, least-squares objective function in order to obtain the best estimate of the current state of a dynamical system. The minimisation is often non-trivial owing to the large scale of the problem, the relative sparsity of the observations and the nonlinearity of the objective function. To simplify the problem, the solution is often found via a sequence of linearised objective functions. The condition number of the Hessian of the linearised problem is an important indicator of the convergence rate of the minimisation and the expected accuracy of the solution. In the standard formulation convergence is slow, indicating an ill-conditioned objective function. A transformation to different variables is often used to ameliorate the conditioning by changing, or preconditioning, the Hessian. The literature offers only sparse information on the causes of ill-conditioning of the optimal state estimation problem and on the effect of preconditioning on the condition number. This paper derives descriptive theoretical bounds on the condition number of both the unpreconditioned and the preconditioned system in order to better understand the conditioning of the problem. We use these bounds to explain why the standard objective function is often ill-conditioned and why a standard preconditioning reduces the condition number. We also use the bounds on the preconditioned Hessian to understand the main factors that affect the conditioning of the system. We illustrate the results with simple numerical experiments.
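In the same spirit as such experiments, the sketch below builds a small linearised (3D-Var-type) problem and compares the condition number of the Hessian before and after the standard control-variable preconditioning. The covariance model, observation pattern and error variances are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Linearised objective:
#   J(x) = 0.5 (x - xb)^T B^{-1} (x - xb) + 0.5 (y - Hx)^T R^{-1} (y - Hx)
# Hessian:                      S     = B^{-1} + H^T R^{-1} H
# With the control-variable transform x = xb + U v, where B = U U^T:
#   preconditioned Hessian      S_pre = I + U^T H^T R^{-1} H U

n, p, Lc = 100, 20, 10.0                     # state dim, obs count, corr. length
idx = np.arange(n)

# background-error covariance: variance 0.5, exponential correlation
B = 0.5 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / Lc)

H = np.zeros((p, n))                         # observe every 5th component
H[np.arange(p), np.arange(0, n, 5)] = 1.0
Rinv = np.eye(p) / 0.1                       # obs-error variance 0.1

S = np.linalg.inv(B) + H.T @ Rinv @ H        # unpreconditioned Hessian
U = np.linalg.cholesky(B)                    # B = U U^T
S_pre = np.eye(n) + U.T @ H.T @ Rinv @ H @ U # preconditioned Hessian

print(f"cond(S)     = {np.linalg.cond(S):.3e}")
print(f"cond(S_pre) = {np.linalg.cond(S_pre):.3e}")
```

Since the observation term in the preconditioned Hessian is positive semi-definite, the eigenvalues of S_pre are bounded below by 1, so its condition number is at most 1 plus the largest eigenvalue of that term; this is one way to see why the transform typically reduces the condition number substantially.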