947 results for Quantitative systems pharmacology
Abstract:
The Government of Indonesia (GoI) increasingly relies on private sector financing to build and operate infrastructure through public-private partnership (PPP) schemes. However, a PPP does not automatically provide the optimal financing scheme, because of value for money (VFM) issues. The procurement authority must show whether a PPP proposal is the solution that delivers the best VFM outcome. This paper presents a literature review comparing quantitative VFM methodologies for PPP infrastructure project procurement in Indonesia and Australia. In Australia, the Public Sector Comparator (PSC) is used to assess a project's potential VFM quantitatively. In Indonesia, the PSC has not been applied; the PPP procurement authority tends to use a common project evaluation method that ignores risk. Unlike conventional price-bid evaluation, the PSC enables a financial comparison that includes costs, gains and risks. Since the construction of the PSC rests primarily on a risk management approach, it can facilitate risk negotiation between the parties involved. The study indicates that the quantitative VFM methodology of the PSC is potentially applicable to the water supply sector in Indonesia. Various supporting regulations are available that emphasize the importance of VFM and risk management in infrastructure investment. However, the study also reveals a number of challenges that need to be anticipated, such as the need for a more comprehensive PPP policy at both central and local government levels, a more specific legal instrument for the bid evaluation method, and institutional capacity development in PPP units at the local level.
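For illustration, a minimal sketch in Python of the risk-adjusted comparison the PSC enables; the four-component PSC structure follows standard Australian practice, but all figures are hypothetical and not taken from the paper.

# Minimal sketch of a risk-adjusted Public Sector Comparator (PSC) test.
# All monetary figures are hypothetical present values; a real PSC is
# built from discounted cash flows and a detailed risk register.
raw_psc = 100.0                 # base cost of conventional public delivery
competitive_neutrality = 5.0    # removes cost advantages of public ownership
transferable_risk = 15.0        # value of risks a PPP shifts to the private party
retained_risk = 8.0             # value of risks government keeps either way

psc = raw_psc + competitive_neutrality + transferable_risk + retained_risk
ppp_bid = 120.0                 # risk-adjusted private bid (present value)

value_for_money = psc - ppp_bid
print(f"PSC = {psc:.1f}, bid = {ppp_bid:.1f}, VFM = {value_for_money:.1f}")
# Positive VFM (here 8.0) suggests the PPP outperforms conventional procurement.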
Abstract:
In recent years, several scientific Workflow Management Systems (WfMSs) have been developed with the aim of automating large-scale scientific experiments. Many offerings now exist, but none of them has emerged as an accepted standard. In this paper we propose a pattern-based evaluation of three of the most widely used scientific WfMSs: Kepler, Taverna and Triana. The aim is to compare them with traditional business WfMSs, highlighting the strengths and deficiencies of both families of systems. Moreover, a set of new patterns is derived from the analysis of the three systems considered.
Abstract:
This paper presents a systems-level approach for adjudicating the prioritization, selection, and planning of in-service professional development (PD) for teachers. We present a step-by-step model for documenting and assessing system-wide 'bids' for professional development programs.
Abstract:
Mechanical control systems have become a part of our everyday life. Systems such as automobiles, robot manipulators, mobile robots, satellites, buildings with active vibration controllers, and air conditioning systems make life easier and safer, as well as help us explore the world we live in and exploit its available resources. In this chapter, we examine a specific example of a mechanical control system: the Autonomous Underwater Vehicle (AUV). Our contribution to the advancement of AUV research is in the area of guidance and control. We present innovative techniques to design and implement control strategies that consider the optimization of time and/or energy consumption. Recent advances in robotics, control theory, portable energy sources and automation increase our ability to create more intelligent robots, and allow us to conduct more explorations by use of autonomous vehicles. This facilitates access to higher-risk areas, longer time underwater, and more efficient exploration as compared to human-occupied vehicles. The use of underwater vehicles is expanding in every area of ocean science. Such vehicles are used by oceanographers, archaeologists, geologists, ocean engineers, and many others. These vehicles are designed to be agile, versatile and robust, and thus their usage has gone from novelty to necessity for any ocean expedition.
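The trade-off such guidance problems optimize can be written as a weighted time/energy cost functional; a generic sketch in LaTeX, not the authors' specific formulation:

\[
J(u) = \int_{0}^{t_f} \left( c_t + c_e \,\lVert u(t) \rVert^{2} \right) \mathrm{d}t ,
\]

where u(t) is the control input, t_f the (free) final time, and the weights c_t and c_e select between time-optimal (c_e = 0) and energy-optimal (c_t = 0) trajectories.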
Abstract:
Recommender systems are widely used online to help users find products and items that they may be interested in, based on what is known about the user from their profile. Often, however, user profiles are short on information, which makes it difficult for a recommender system to make quality recommendations. This problem is known as the cold-start problem. Here we investigate using association rules as a source of information to expand a user profile and thus avoid this problem. Our experiments show that it is possible to use association rules to noticeably improve the performance of a recommender system under the cold-start situation. Furthermore, we show that this improvement can be achieved while using non-redundant rule sets, demonstrating that non-redundant rules do not cause a loss of information and are just as informative as rule sets that contain redundancy.
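A minimal sketch of the profile-expansion idea in Python; the rules and items are invented for illustration, and a real system would mine the rules (e.g., with Apriori) from the full transaction set rather than hard-code them.

# Minimal sketch: expand a sparse (cold-start) user profile with
# association rules before recommending. Rules and items are hypothetical.

# Rules mined offline from other users' histories:
# antecedent itemset -> list of (consequent item, confidence)
rules = {
    frozenset({"item_a"}): [("item_c", 0.8)],
    frozenset({"item_a", "item_b"}): [("item_d", 0.7)],
    frozenset({"item_b"}): [("item_e", 0.6)],
}

def expand_profile(profile, rules, min_conf=0.65):
    """Add rule consequents whose antecedents are contained in the profile."""
    expanded = set(profile)
    for antecedent, consequents in rules.items():
        if antecedent <= profile:  # antecedent is a subset of the profile
            expanded.update(c for c, conf in consequents if conf >= min_conf)
    return expanded

cold_profile = {"item_a", "item_b"}
print(expand_profile(cold_profile, rules))
# Adds item_c and item_d (confidence >= 0.65); the recommender now has more signal.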
Abstract:
“What did you think you were doing?” was the question posed by the conference organizers to me as the inventor and constructor of the first working Tangible Interfaces over 40 years ago. I think the question was intended to encourage me to talk about the underlying ideas and intentionality rather than describe an endless sequence of electronic bricks, and that is what I shall do in this presentation. In the sixties the prevalent idea for a graphics interface was an analogue of sketching, which was somehow to be understood by the computer as three-dimensional form. I rebelled against this notion for reasons which I will explain in the presentation, and instead came up with tangible, physical, three-dimensional intelligent objects. I called these first prototypes “Intelligent Physical Modelling Systems”, which is a really dumb name for an obvious concept. I am eternally grateful to Hiroshi Ishii for coining the term “Tangible User Interfaces”: the same idea but with a much smarter name. Another motivator was user involvement in the design process, which led to the Generator project (1979) with Cedric Price for the world’s first intelligent building, capable of organizing itself in response to the appetites of its users. The working model of that project is in MoMA. The same motivation led to a self-builders’ design kit (1980) for Walter Segal, which enabled self-builders to design their own houses. And indeed, as the organizers’ question implied, the motivation and intentionality of these projects developed over the years in step with advancing technology. The speaker will attempt to articulate these changes with medical, psychological and educational examples, much of this later work indeed stemming from the Media Lab where we are talking. Related topics such as “tangible thinking” and “intelligent teacups” will be introduced, and the presentation will end with some speculations for the future. The presentation will be given against a background of images of early prototypes, many of which have never been previously published.
Abstract:
This paper argues for a model of open-systems evolution, based on evolutionary thermodynamics and complex systems science, as a design paradigm for sustainable architecture. The mechanism of open-system evolution is specified through mathematical simulations and theoretical discourse. Based on this mechanism, the authors propose an intelligent building model for sustainable design built on a holistic information system linking the end-users, the building and nature. This information system is used to control the consumption of energy and material resources in the building system at the microscopic scale, and to adapt the environmental performance of the building system to the natural environment at the macroscopic scale, enabling the evolutionary emergence of sustainable performance in buildings.
Abstract:
The development of effective safety regulations for unmanned aircraft systems (UAS) is an issue of paramount concern for industry. The development of this framework is a prerequisite for greater UAS access to civil airspace and, subsequently, for the continued growth of the UAS industry. The direct use of the existing conventionally piloted aircraft (CPA) airworthiness certification framework for the regulation of UAS has a number of limitations. The objective of this paper is to present one possible approach for structuring airworthiness regulations for civilian UAS. The proposed approach facilitates a more systematic, objective and justifiable method for managing the spectrum of risk associated with the diversity of UAS and their potential operations. A risk matrix is used to guide the development of an airworthiness certification matrix (ACM). The ACM provides a structured categorisation that facilitates the future tailoring of regulations in proportion to the levels of risk associated with the operation of a UAS. As a result, an objective and traceable link may be established between mandated regulations and the overarching objective of an equivalent level of safety to CPA. The ACM also facilitates the systematic consideration of a range of technical and operational mitigation strategies. For these reasons, the ACM is proposed as a suitable method for structuring an airworthiness certification framework for civil or commercially operated UAS (i.e., the UAS equivalent in function to the Part 21 regulations for civil CPA) and for the further structuring of requirements on the operation of UAS in non-segregated airspace.
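A minimal sketch in Python of how a risk matrix can drive a certification category; the severity/likelihood levels and the category labels are hypothetical, not those of the paper's ACM.

# Minimal sketch: map (consequence severity, likelihood) of a UAS operation
# to a certification category. Levels and categories are hypothetical.
SEVERITY = ["negligible", "minor", "major", "catastrophic"]
LIKELIHOOD = ["improbable", "remote", "occasional", "frequent"]

def certification_category(severity: str, likelihood: str) -> str:
    """Higher combined risk index -> more stringent certification requirements."""
    risk_index = SEVERITY.index(severity) + LIKELIHOOD.index(likelihood)
    if risk_index <= 2:
        return "Category I (minimal requirements)"
    if risk_index <= 4:
        return "Category II (tailored requirements)"
    return "Category III (full CPA-equivalent requirements)"

print(certification_category("major", "remote"))
# -> Category II (tailored requirements)

Operational or technical mitigations (e.g., flying over unpopulated areas) would lower the effective risk index, which is how the matrix links mitigation strategies to regulatory burden.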
Abstract:
Modern statistical models and computational methods can now incorporate the uncertainty of the parameters used in Quantitative Microbial Risk Assessments (QMRA). Many QMRAs use Monte Carlo methods, but work from fixed estimates of means, variances and other parameters. We illustrate the ease of estimating all parameters contemporaneously with the risk assessment, incorporating all the parameter uncertainty arising from the experiments from which these parameters are estimated. A Bayesian approach is adopted, using Markov chain Monte Carlo (MCMC) Gibbs sampling via the freely available software WinBUGS. The method and its ease of implementation are illustrated by a case study that incorporates three disparate datasets into an MCMC framework. The probabilities of infection obtained when the uncertainty associated with parameter estimation is incorporated into a QMRA are shown to be considerably more variable over various dose ranges than the analogous probabilities obtained when constants from the literature are simply 'plugged in', as is done in most QMRAs. Neglecting these sources of uncertainty may lead to erroneous decisions for public health and risk management.
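A minimal sketch in Python of the contrast the paper draws, using the standard exponential dose-response model P(infection) = 1 - exp(-r*d); the plug-in value and the posterior draws for r are invented for illustration (the paper itself fits the full model in WinBUGS).

import numpy as np

rng = np.random.default_rng(0)
dose = 100.0  # ingested organisms (hypothetical)

# Plug-in approach: a single literature estimate of the dose-response parameter r.
r_hat = 0.005
p_plugin = 1.0 - np.exp(-r_hat * dose)

# Bayesian approach: propagate parameter uncertainty by evaluating the model
# over posterior draws of r (here faked with a lognormal; in practice these
# draws come from the MCMC fit to the experimental data).
r_draws = rng.lognormal(mean=np.log(0.005), sigma=0.5, size=10_000)
p_draws = 1.0 - np.exp(-r_draws * dose)

print(f"plug-in P(infection)       : {p_plugin:.3f}")
print(f"posterior mean P(infection): {p_draws.mean():.3f}")
print(f"95% interval               : {np.percentile(p_draws, [2.5, 97.5])}")
# The interval exposes the variability that plug-in QMRAs hide.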
Abstract:
β-Adrenoceptor blocking agents (β-blockers) that at low concentrations antagonize the cardiostimulant effects of catecholamines, but at high concentrations also cause cardiostimulation, have been appearing since the late 1960s. These cardiostimulant β-blockers, termed non-conventional partial agonists, antagonize the effects of catecholamines through a high-affinity site (β1HAR) of the myocardial β1-adrenoceptor, but cause cardiostimulation mainly through a low-affinity site (β1LAR). The experimental non-conventional partial agonist (−)-CGP12177 increases cardiac L-type Ca2+ current density and Ca2+ transients, shortens action potential duration but augments the action potential plateau, increases heart rate and force, and causes arrhythmic Ca2+ transients and arrhythmic cardiocyte contractions. Other β-blockers, which do not cause cardiostimulation, consistently have lower affinity for β1LAR than for β1HAR. These sites were verified, and the cardiac pharmacology of non-conventional partial agonists confirmed, on recombinant β1-adrenoceptors and on β1-adrenoceptors overexpressed in the heart. A targeted mutation of Asp138 to Glu138 virtually abolished the pharmacology of β1HAR but left intact the pharmacology of β1LAR. Non-conventional partial agonists may be beneficial for the treatment of peripheral autonomic neuropathy but, probably owing to their arrhythmic propensities, may be harmful in the treatment of chronic heart failure.
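The two-site pharmacology described above is often summarised by separating occupancy of the high-affinity site from activation of the low-affinity site; a hedged LaTeX sketch with illustrative hyperbolic forms (not the paper's fitted model), where [A] is the ligand concentration and K_H ≪ K_L are the equilibrium dissociation constants at β1HAR and β1LAR:

\[
\rho_{H} = \frac{[A]}{[A] + K_{H}} \quad (\text{catecholamine blockade via } \beta_{1}\mathrm{HAR}), \qquad
E = \frac{E_{\max}\,[A]}{[A] + K_{L}} \quad (\text{cardiostimulation via } \beta_{1}\mathrm{LAR}),
\]

so low concentrations mainly occupy β1HAR and antagonize catecholamines, while cardiostimulation through β1LAR emerges only at high concentrations.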
Abstract:
In today's technological age, fraud has become more complicated and increasingly difficult to detect, especially when it is collusive in nature. Several fraud surveys show that the median loss from collusive fraud is much greater than that from fraud perpetrated by a single person. Despite its prevalence and potentially devastating effects, collusion is commonly overlooked as an organizational risk. Internal auditors often fail to proactively consider collusion in their fraud assessment and detection efforts. In this paper, we consider fraud scenarios involving collusion. We present six potentially collusive fraudulent behaviors and show their detection process in an ERP system. We have enhanced our fraud detection framework to aggregate different sources of logs in order to detect communication, and have further enhanced it to be system-agnostic, achieving portability and making it generally applicable to all ERP systems.
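A minimal sketch in Python of the log-aggregation idea: join events from a hypothetical ERP audit log and a communication log, and flag user pairs who both act on the same transaction and communicate with each other. The schemas and the segregation-of-duties rule are invented for illustration, not taken from the paper's framework.

# Minimal sketch: flag potentially collusive pairs by joining two log sources.
from itertools import combinations

# Source 1: ERP audit log -- (user, action, transaction_id)
erp_log = [
    ("alice", "create_vendor",   "T1"),
    ("bob",   "approve_payment", "T1"),
    ("carol", "approve_payment", "T2"),
]

# Source 2: communication log -- unordered pairs of users who exchanged messages
comm_log = {frozenset({"alice", "bob"})}

# Group users by the transactions they touched (a segregation-of-duties view)
by_txn = {}
for user, action, txn in erp_log:
    by_txn.setdefault(txn, set()).add(user)

suspicious = []
for txn, users in by_txn.items():
    for pair in combinations(sorted(users), 2):
        if frozenset(pair) in comm_log:  # they also communicated
            suspicious.append((txn, pair))

print(suspicious)  # [('T1', ('alice', 'bob'))]

Aggregating the two sources is what surfaces the pair: neither the ERP log nor the communication log alone distinguishes routine workflow from coordinated action.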
Abstract:
In the scope of this study, 'performance measurement' includes the collection and presentation of relevant information that reflects progress in achieving organisational strategic aims and meeting the needs of stakeholders such as merchants, importers, exporters and other clients. Evidence shows that utilising information technology (IT) in customs matters supports import and export practices and ensures that supply chain management flows seamlessly. This paper briefly reviews some practical techniques for measuring performance. Its aim is to recommend a model for measuring the performance of information systems (IS): in this case, the Customs Information System (CIS) used by the Royal Malaysian Customs Department (RMCD). The study evaluates the effectiveness of CIS implementation measures in Malaysia from an IT perspective. A model based on IS theories is used to assess the impact of the CIS. The findings recommend measures for evaluating the performance of the CIS and its organisational impacts in Malaysia. It is also hoped that the results of the study will assist other customs administrations in evaluating the performance of their information systems.
Abstract:
In this paper, the performance of voltage-source converter-based shunt and series compensators used for load voltage control in electrical power distribution systems is analyzed and compared when a nonlinear load is connected across the load bus. The comparison is based on the closed-loop frequency response characteristics of the compensated distribution system. A distribution static compensator (DSTATCOM) as a shunt device and a dynamic voltage restorer (DVR) as a series device are considered in the voltage-control mode for the comparison. The power-quality problems which these compensators address include voltage sags/swells, load voltage harmonic distortion, and unbalance. The effect of various system parameters on the control performance of the compensators can be studied using the proposed analysis. In particular, the performance of the two compensators is compared for strong ac-supply (stiff-source) and weak ac-supply (non-stiff-source) distribution systems. Experimental verification of the analytical results has been obtained using a laboratory model of the single-phase DSTATCOM and DVR. A generalized converter topology using a cascaded multilevel inverter is proposed for the medium-voltage distribution system. Simulation studies have been performed in the PSCAD/EMTDC software to verify the results for the three-phase system.
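The shunt/series distinction can be seen in a single-line model of the feeder with source voltage V_s and source impedance Z_s; a hedged sketch in LaTeX (the notation is ours, not the paper's):

\[
V_{L} = V_{s} - Z_{s} I \quad (\text{uncompensated}), \qquad
V_{L} = V_{s} - Z_{s}\,(I - I_{c}) \quad (\text{DSTATCOM: shunt current } I_{c}), \qquad
V_{L} = V_{s} - Z_{s} I + V_{inj} \quad (\text{DVR: series voltage } V_{inj}).
\]

With a stiff source (small |Z_s|), the shunt compensator must inject a large current to move the bus voltage, whereas the series DVR regulates V_L independently of Z_s; this is the intuition behind comparing the two devices on stiff and non-stiff systems.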